Nvidia GeForce RTX 2070 Super vs. AMD Radeon RX 5700 XT: 2020 Update

The fact of the matter is that Nvidia offers more features than the AMD card, features that will remain useful with the next generation of games, namely RTX and DLSS. Other than its lower price, the RX 5700 XT doesn't have much going for it. It has poor drivers with constant blue/black screens, flickering, visual bugs and so on. Any sane person will spend the extra money and get a quality product like the RTX 2070 Super, rather than a cheap product that will become obsolete once the new consoles are out (the AMD card doesn't support ray tracing or the other new DX12 features).
 
We saw the Minecraft ray-traced demo on the Xbox Series X, right? Since the console is supposedly going to run every game at 4K 60 FPS, and the RTX 2080 Ti runs the same title at around 40 FPS without DLSS, I think RDNA 2 will have an improved answer to the select games Nvidia supports with DLSS.
This is the worst time ever to buy a new graphics card.
Let's wait for the RDNA 2 / Ampere GPUs and then buy a 5700 XT for US$200 or a 2070 Super for US$230.
I wonder if AMD will sell an RDNA 2 GPU with upwards of 12 teraflops of performance for more than the cost of an entire next-gen console.
 
Being 25% more expensive for only a 7-9% performance increase doesn't really justify the 2070 Super over the 5700 XT.
And you have zero support for ray tracing on the current Navi cards, which automatically makes them a non-starter for anyone looking at gaming beyond this year.

I'll take tech that has a future over a dead-end good deal today.
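To put the value argument in concrete numbers, here's a rough cost-per-fps sketch using the figures quoted above (the ~$400 / ~$500 prices and the ~8% average gap are assumptions based on typical 2020 street pricing, not measured data):

```python
# Rough cost-per-frame sketch; the prices and the ~8% performance delta
# are assumptions taken from the thread, and fps is normalized so the
# 5700 XT = 100. Actual street prices and per-game averages vary.
cards = {
    "RX 5700 XT":     {"price_usd": 400, "avg_fps": 100},  # baseline
    "RTX 2070 Super": {"price_usd": 500, "avg_fps": 108},  # ~8% faster on average
}

for name, card in cards.items():
    cost_per_fps = card["price_usd"] / card["avg_fps"]
    print(f"{name}: ${cost_per_fps:.2f} per average fps")

# Roughly $4.00/fps vs. $4.63/fps: about 25% more money for ~8% more
# performance, or ~16% more per delivered frame.
```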
 
All the wussies crying here are just cheap a$$ AMD fanboys. The fact of the matter is that Nvidia offers more features than the AMD card, features that will remain useful with the next generation of games, namely RTX and DLSS. Other than its lower price, the RX 5700 XT doesn't have much going for it. It has poor drivers with constant blue/black screens, flickering, visual bugs and so on. Any sane person will spend the extra money and get a quality product like the RTX 2070 Super, rather than a cheap product that will become obsolete once the new consoles are out (the AMD card doesn't support ray tracing or the other new DX12 features).
I wouldn't buy anything right now whatsoever, but if I were forced to, it would be an RTX card, which actually has a future with DX12 Ultimate, ray tracing and DLSS, versus the dead end that is current Navi.
 
As for the DLSS comment: no, you can't simply use RIS and call it even. Nvidia also has an image sharpening technology, which is arguably better, so that neutralizes RIS. Moreover, Nvidia's image sharpening filter can be used alongside DLSS.

Technically, Freestyle can be used with DLSS 2.0, but only if the game is both on the Freestyle whitelist and has DLSS 2.0 support. You guys said in your RIS vs. Freestyle article:

"But there is a whitelist of games that it works with, so titles like Hitman 2 and Resident Evil 2 aren’t supported, for example."

If you are looking to save performance, which approach is best?

You guys did an article with Nvidia FreeStyle vs RIS: https://www.techspot.com/review/1884-amd-ris-vs-nvidia-freestyle-vs-reshade/

and one with DLSS 2.0: https://www.techspot.com/article/1992-nvidia-dlss-2020/

but nothing comparing the new DLSS 2.0 to any of the existing technologies in image quality and performance. I wouldn't mind seeing DLSS 2.0 + Freestyle vs. RIS, or DLSS 2.0 vs. Freestyle. I realize that image quality is somewhat subjective, but given high-resolution images, we should be able to see any material benefit.
 
Because:
1) They're on the same performance tier, and
2) The 2060S was already pitted against the 5700 XT. This leaves the higher-performing 2070S to duke it out with the 5700 XT, which performs similarly.

Besides, I make use of RIS (or FidelityFX when available) in tandem with Radeon Boost (or the in-game dynamic resolution scaling) and integer scaling. DLSS 2 is great in Wolfenstein, for sure, but what I can do with AMD is more widely available across several games with little to no performance or visual impact. It's basic by comparison, but I think RDNA 2 will utilize technology similar to Turing's at a more local level. Think at the CU level for cache and RT. But that's a different discussion altogether.
Nvidia ALSO has image sharpening (which has been reviewed as the better option versus RIS) and can offer it on all titles, just like AMD.

At this time AMD has no answer to what DLSS 2.0 can offer, and what the consoles will offer is also unobtainable on current AMD cards.

Nothing from AMD is worth buying right now (really, nothing from anyone should be bought at the moment), but at least with Nvidia you get access to future-facing technologies that will become more and more necessary on an RTX card, whereas the Navi cards will never be more than they are today.
 
I would avoid using that example, as you don't typically see Nvidia improve that drastically in a generation. It was an outlier generation that definitely gave Nvidia the room to price gouge Turing customers.

It's not really an outlier, it's just a full node jump.

Maxwell (GTX 970) was TSMC 28nm; Pascal (the GTX 10 series) went to TSMC 16nm. That's a big density gain, on top of the architectural gains.

Turing (the RTX 20 series) was only a half node, arguably not even that. It was called '12nm', but it was really just a slightly improved 16nm with modest transistor density gains. Basically 16nm+.

So there was never going to be a big jump from Pascal to Turing, especially since a huge amount of die area was handed over to the ray tracing and Tensor cores.

As far as we know, Nvidia have gone to 7nm EUV, so that's second-generation 7nm, or 7nm+. Whether it's Samsung's or TSMC's, that is a full node jump over TSMC 12nm. Another big density gain.

If the clocks scale well too, and Radeon VII suggests they will, you're looking at another potentially large step, on the scale of Maxwell to Pascal in 2016.
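To put rough numbers on those node steps, here's a quick density comparison using the commonly published figures for GM204 (GTX 980/970), GP104 (GTX 1080/1070) and TU104 (RTX 2080/2070 Super); treat the transistor counts and die areas as approximate:

```python
# Approximate published transistor counts and die areas; ballpark figures only.
dies = {
    "GM204 (Maxwell, 28nm)":  (5.2e9, 398),   # transistors, die area in mm^2
    "GP104 (Pascal, 16nm)":   (7.2e9, 314),
    "TU104 (Turing, '12nm')": (13.6e9, 545),
}

prev_density = None
for name, (transistors, area_mm2) in dies.items():
    density = transistors / area_mm2 / 1e6    # million transistors per mm^2
    gain = f" ({density / prev_density:.2f}x over previous)" if prev_density else ""
    print(f"{name}: ~{density:.1f} MTr/mm^2{gain}")
    prev_density = density

# 28nm -> 16nm works out to roughly a 1.75x density jump, while 16nm -> '12nm'
# is only ~1.1x, which is why Turing dies grew so large once RT and Tensor
# cores were added.
```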

We saw the Minecraft ray-traced demo on the Xbox Series X, right? Since the console is supposedly going to run every game at 4K 60 FPS, and the RTX 2080 Ti runs the same title at around 40 FPS without DLSS, I think RDNA 2 will have an improved answer to the select games Nvidia supports with DLSS.
This is the worst time ever to buy a new graphics card.
Let's wait for the RDNA 2 / Ampere GPUs and then buy a 5700 XT for US$200 or a 2070 Super for US$230.

Nobody official has said every Series X game will run at 4K 60 FPS. Most developers may target it, but it's not a guarantee.

That Minecraft ray-traced demo ran at 1080p and averaged a little over 30 FPS, but not a lot more.

An RTX 2080 Ti averaged 70 FPS at 1080p native in Tom's Hardware's tests. The Series X demo didn't show any more performance than, say, an RTX 2060 Super or RTX 2070, which averaged ~45 FPS.

Don't go thinking the Series X is somehow much faster than existing RTX cards.
 
Reasons I chose the RTX 2070 Super (Gigabyte Windforce OC3X) over the RX 5700 XT when upgrading my existing PC:

1. I already have a 27" 1440p 144 Hz G-Sync monitor.
2. I don't want to deal with driver issues (possibly none anymore, but I don't want to risk it).
3. I was really impressed with DLSS 2.0 in Control and hope they add it to more and more games. I mean, 95%+ of the visuals for almost double the frame rate at the same resolution?! If this takes off, I have hopes of playing AAA titles at 4K (without ray tracing) at 60 FPS+.
4. I got the Gigabyte Windforce OC3X 2070 Super on eBay "new" for $450 (seller: Newegg's eBay account). GPU-Z checks out. The card was in its original box (not sealed), sealed in a plastic package with a "tested by Gigabyte" sticker. I suspect it was an open-box unit or a refurb, but it works great and still has the same warranty (already registered). That's almost the price of an RX 5700 XT, and even if I got an RX 5700 ($320 with a good cooler) and flashed it to an RX 5700 XT, the reasons above still apply.

If I were building a system from scratch, I would possibly go with an RX 5700 and flash it to an RX 5700 XT... that's $150 in savings, but not if you're already invested in a G-Sync monitor ($300+).

I am really hoping AMD launches Big Navi with something similar to DLSS 2.0 that is supported across many games, without driver issues, with support for both G-Sync and FreeSync (I know, I'm smoking too much here) and the same good value as the RX 5700 / XT. Even if I don't go with AMD again, it will force Nvidia to keep prices competitive at the high end and we all win.
 
But unless Nvidia finds a way to make the implementation something resembling plug and play, it's DOA.
I don't think they'll need to if implementing DLSS 2.0 is less work for a dev than overall performance fine-tuning. If one's rendering code is very pixel-bound, then DLSS is potentially an easy fix. What Nvidia really need to work on is creating a toolkit for it that allows code to be 'dragged and dropped.'
 
I don't think they'll need to if implementing DLSS 2.0 is less work for a dev than overall performance fine-tuning. If one's rendering code is very pixel-bound, then DLSS is potentially an easy fix. What Nvidia really need to work on is creating a toolkit for it that allows code to be 'dragged and dropped.'

From what I have been seeing, Nvidia are very much trying to make DLSS plug and play. It may need a little hand-tuning from game to game, but nothing extraordinary.

It's a killer feature. Nvidia have been taking a battering from some quarters over ray tracing and weak DLSS support, but they are taking an approach where, if you use DLSS and it works, you can have your ray tracing cake and eat it too. It's not too outrageous to call it a revolution.

Slightly more than 1080p or 1440p render cost, with image quality on par with native 4K plus TAA. As TechSpot have demonstrated a few times now, the gains on the table are massive.
 
The question should be: which one is more future-proof for handling next-generation graphics? Anything below the spec of the next-gen consoles is a commodity at this point.
Also, does anyone know when Crysis Remastered is coming?
I have an AMD R9 card and it has held its performance very well, but my friends who bought an Nvidia card back then complain a lot. Compare my R9 380 to Nvidia's GTX 960 in the latest titles now (those two cards used to be compared against each other at the time) and you'll understand what I mean.
 
Please forgive my ignorance, as you may have explained this in the past and I have simply missed it.

Why, in the 35-game round-up used for testing, have you not included Red Dead Redemption 2? Is it for optimisation reasons or something else?

Thanks in advance for your time and replying to my query.
 
At the end of the day, if you can sleep at night without too much buyer's remorse over whatever you paid for your video card, that should be enough. For every example showing one card is better than the other, there's a counter-argument (whether subjective or objective). Either way, I enjoy these articles - so thanks TechSpot!
 
As a huge Minecraft player, I really can't see any choice other than an RTX card for now. The RTX implementation in Minecraft is outstanding, far more comprehensive than anything we have seen on RTX hardware before. Although I am a Java player, an RTX card might just get me finally playing Bedrock.
 
From what I have been seeing, Nvidia are very much trying to make DLSS plug and play. It may need a little hand-tuning from game to game, but nothing extraordinary.

It's a killer feature. Nvidia have been taking a battering from some quarters over ray tracing and weak DLSS support, but they are taking an approach where, if you use DLSS and it works, you can have your ray tracing cake and eat it too. It's not too outrageous to call it a revolution.

Slightly more than 1080p or 1440p render cost, with image quality on par with native 4K plus TAA. As TechSpot have demonstrated a few times now, the gains on the table are massive.

I wouldn't quite count on that. 4A came out and said that retooling Metro Exodus from DLSS 1.x to 2.0 would take too much time and distract from other projects. And that's a title that already had a lot of DLSS work done on it.

Control is the *best-case* scenario for implementation, as they've had Nvidia consultation heavily involved in the title since day one. They're one of the first studios to get something out the door with this, but it's a title that was built almost exclusively with GameWorks tools from the start.

It's not like DLSS 2.0 was dropped on the world four weeks ago and was just plug and play for the titles that have it. It's been tested and refined with those titles and studios for months to get it where it is now, and it still isn't a drop-in for titles that already have DLSS hooks.
 
I don't think they'll need to if implementing DLSS 2.0 is less work for a dev than overall performance fine-tuning. If one's rendering code is very pixel-bound, then DLSS is potentially an easy fix. What Nvidia really need to work on is creating a toolkit for it that allows code to be 'dragged and dropped.'

And that is a compelling argument, if they can refine it to be that easy. It will be interesting to see whether this affects cross-platform title development: 4K engine rendering gets abandoned and 1080p becomes the default target resolution again.

PC gamers get to enjoy upscaling while console users are confined to 1080p for the next 6-8 years due to the lack of DLSS support.
 
Why, in the 35-game round-up used for testing, have you not included Red Dead Redemption 2? Is it for optimisation reasons or something else?
RDR2 was included; see the performance breakdown section. There's almost no difference between the two GPUs despite the price difference.

 
Could DLSS 2.0 be the first feature Nvidia makes that is not a useless gimmick? I would be impressed if the game support were decent, but they have always struggled in that area.
I mean, lowering the resolution and then applying image sharpening has been around for decades. They say AI is part of it, but I doubt it's any different from RIS plus lowering your res.
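For context on the distinction: a spatial sharpen like RIS or Freestyle only boosts local contrast in pixels that were already rendered, while DLSS 2.0 reconstructs the image over time from lower-resolution frames plus motion vectors. Here's a toy unsharp-mask sketch (not AMD's or Nvidia's actual shader, just an illustration of what the sharpening half does):

```python
import numpy as np
from scipy.ndimage import convolve

# Toy unsharp-mask sharpening, purely illustrative -- NOT the real RIS/CAS or
# Freestyle filter. It amplifies high-frequency detail already present in the
# rendered pixels; it cannot recover detail that a lower render resolution
# never produced, which is what DLSS 2.0's temporal reconstruction attempts.
def sharpen(image: np.ndarray, amount: float = 0.5) -> np.ndarray:
    box_blur = np.full((3, 3), 1.0 / 9.0)                 # 3x3 box blur kernel
    blurred = convolve(image, box_blur, mode="nearest")
    detail = image - blurred                              # high-frequency component
    return np.clip(image + amount * detail, 0.0, 1.0)

# Example: sharpen a random grayscale "frame" with values in [0, 1]
frame = np.random.rand(1080, 1920)
sharpened = sharpen(frame, amount=0.5)
```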
 
It's clear as day which GPU performs consistently and is consequently worth the money.

No one wants to spend money on a new GPU just to find out the one game they like most runs like it's on last gen's entry-level GPU.
 
I mean, lowering the resolution and then applying image sharpening has been around for decades. They say AI is part of it, but I doubt it's any different from RIS plus lowering your res.

Yeah, let me give you some idea about DLSS. These three images all use Nvidia Freestyle image sharpening:
[Three screenshots: 80% resolution scale, native resolution, DLSS]
DLSS actually looks better than native resolution; all the jaggies are gone, and this is with all the settings maxed out. Nvidia's DLSS 2.0 white paper mentions that Nvidia uses 16K images to train the AI network; they call it super sampling for a reason.
 
Is it mentioned anywhere in this article which variation of the cards they're using?
Is it the base 5700 XT for example? With the basic blower-style cooler? And if it's one with a custom cooler, which one?
Can have a pretty big impact on performance.
 

No laptop can perform better than a custom desktop PC build. I can also buy a laptop here for less than $2K with better hardware than the laptops you listed above.

You should learn a lot more about hardware, man.
 
Could DLSS 2.0 be the first feature Nvidia makes that is not a useless gimmick? I would be impressed if the game support were decent, but they have always struggled in that area.
It depends on how many titles implement it and how well it works in those titles. Apart from a few outlier results where DLSS will give better FPS, I think just using Nvidia's own image sharpening with 80% resolution scaling should give you similar results. The same goes for AMD cards too.
 
It depends on how many titles implement it and how well it works in those titles. Apart from a few outlier results where DLSS will give better FPS, I think just using Nvidia's own image sharpening with 80% resolution scaling should give you similar results. The same goes for AMD cards too.

Do you even have an RTX card to back up your claim, lol? DLSS in quality mode can boost performance anywhere from 35-70% depending on the title, while delivering visuals approximately equal to native resolution.
Let's leave the image sharpening subject aside, because you can apply that to native resolution and DLSS too, so 80% resolution scaling with sharpening would objectively look worse than native resolution with sharpening.
I already posted some uncompressed images from MechWarrior 5 in the post above; you can download and compare the 80% resolution scaling, native and DLSS images.
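For a sense of the render cost involved, some simple pixel arithmetic (DLSS 2.0's internal render resolution for a 4K target is 1440p in Quality mode and 1080p in Performance mode; the small fixed cost of the upscale pass itself is ignored):

```python
# Pixels actually shaded for a 4K output under different approaches.
# DLSS internal resolutions follow Nvidia's published scaling factors
# (Quality = 1440p, Performance = 1080p for a 4K target).
native_4k = 3840 * 2160

approaches = {
    "Native 4K":                          3840 * 2160,
    "80% resolution scale":               int(3840 * 0.8) * int(2160 * 0.8),
    "DLSS Quality (1440p internal)":      2560 * 1440,
    "DLSS Performance (1080p internal)":  1920 * 1080,
}

for name, pixels in approaches.items():
    print(f"{name}: {pixels:,} pixels ({pixels / native_4k:.0%} of native)")

# 80% scaling still shades ~64% of a native 4K frame, while DLSS Quality
# shades ~44% and Performance ~25% -- which is where the larger DLSS
# performance gains come from.
```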
 