Radeon RX 5700 XT vs. GeForce RTX 2060 Super: 2020 Update

I had a Sapphire 5700 XT special edition OC for a short period, and performance was low versus my 1080 Ti, especially in low and minimum frame rates.
I thought AMD fixed the low and minimum frame rate issues?
Their architecture has been 'somewhat kind of' based on clock speed, so 'before' they always dipped badly, but I thought that was addressed in the last few generations of Radeons?
 
Nope, a month ago it was still not working! I'm quite OK with my 1080 Ti, but I wanted to try the mighty 5700 XT.
I'm a little disappointed!
 

Going from a 1080 Ti to a 5700 XT is a downgrade.

That Nvidia GPU is still 10-15% faster at stock clocks.

Surprised at your disappointment, considering that information is on this site.

 
AMD needs to do better than 10% faster and 10% cheaper. The lack of DLSS is not helping AMD at the moment, and even if the driver issues have supposedly been resolved, the fact that it took so long for AMD to even acknowledge them is unacceptable.
 

For the...four games that have the new DLSS? The first pass at DLSS was pretty much crap. Only four games support 2.0. How is it a major letdown when almost nothing actually uses it?
 
I thought AMD fixed the low and minimum frame rate issues?
Their architecture has been 'somewhat kind of' based on clock speed, so 'before' they always dipped badly, but I thought that was addressed in the last few generations of Radeons?

His issue might have nothing to do with driver issues and everything to do with comparing it to a 1080 Ti. That goes double if he's playing at higher resolutions like 4K, which the 5700 XT simply isn't made for. As an example, the 5700 XT is 30% behind the 2080 Ti at lower resolutions, while at 4K it's 48% behind. When you see scaling like that on a mid-range card, the performance difference is more attributable to it being a mid-range card than to the architecture itself. A high-end Navi should see more linear performance scaling at higher resolutions.
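To put those percentages in concrete terms, here is a quick worked example; the baseline FPS figures are arbitrary placeholders chosen only to illustrate how the gap widens, not measured results:

```
# Only the "percent behind the 2080 Ti" figures come from the comment above;
# the baseline FPS numbers are made up purely for illustration.
def behind(baseline_fps, percent_behind):
    return baseline_fps * (1 - percent_behind / 100)

hypothetical_2080ti_lower_res = 100.0
hypothetical_2080ti_4k = 60.0

print("5700 XT at lower res: ~{:.0f} FPS vs 100 FPS (30% behind)".format(
    behind(hypothetical_2080ti_lower_res, 30)))
print("5700 XT at 4K:        ~{:.0f} FPS vs 60 FPS (48% behind)".format(
    behind(hypothetical_2080ti_4k, 48)))
# The relative gap grows from 30% to 48% as resolution increases, which is
# the mid-range-card scaling behaviour the comment describes.
```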

The guys on this site show other numbers for Ghost Recon and other games:
https://gamegpu.com/action-/-fps-/-tps/ghost-recon-breakpoint-test-vulkan

I had a Sapphire 5700 XT special edition OC for a short period, and performance was low versus my 1080 Ti, especially in low and minimum frame rates.

Kindly publish your data along with your post. You are claiming poor lows and minimums. Single-pass benchmark or triple-pass benchmark? Test config, resolution, etc. If you are merely stating that you observed one-off performance numbers randomly gleaned while playing, likely from different parts of the map/game, then say so.
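For anyone unsure what that request would look like in practice, here is a minimal sketch of how average FPS, 1% lows and 0.1% lows are commonly derived from per-frame times averaged over several passes. The frametime values below are made up for illustration, not anyone's actual results:

```
# Minimal sketch (hypothetical numbers, not the poster's data) of how
# average FPS, 1% lows and 0.1% lows are commonly derived from per-frame
# times in milliseconds, averaged over several benchmark passes.
# A real run would come from a capture tool (e.g. OCAT or CapFrameX) that
# exports one frametime per rendered frame.

def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS) for one pass."""
    times = sorted(frametimes_ms, reverse=True)  # slowest frames first
    avg_fps = 1000.0 * len(times) / sum(times)

    def low_fps(percent):
        # Average the slowest `percent` of frames, then convert to FPS.
        n = max(1, int(len(times) * percent / 100.0))
        worst = times[:n]
        return 1000.0 * n / sum(worst)

    return avg_fps, low_fps(1.0), low_fps(0.1)

# Three hypothetical passes of the same scene (mostly ~60 FPS, some stutter).
passes = [
    [16.7] * 950 + [33.3] * 40 + [50.0] * 10,
    [16.7] * 970 + [33.3] * 25 + [45.0] * 5,
    [16.7] * 960 + [33.3] * 30 + [48.0] * 10,
]

results = [fps_metrics(p) for p in passes]
avg_of_passes = [sum(col) / len(col) for col in zip(*results)]
print("avg FPS: {:.1f}, 1% low: {:.1f}, 0.1% low: {:.1f}".format(*avg_of_passes))
```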
 
His issue might have nothing to do with driver issues and everything to do with comparing it to a 1080 Ti.
Agreed.
I meant no pun in my comment; from reading Steve's excellent reviews, it seems that they both have great lows/mins, at least on paper, which doesn't always transfer perfectly to the real-life experience.
Something a few people who 'write about it more than they actually use it' might not understand.
 

Unfortunately, some things are very hard to represent with data. This reminds me of when the Ryzen 1000 series launched and people were saying it ran games smoothly, but there was never any data to back the observation up.

There is definitely some weight to an observation many people are making, but I personally feel that if I do mention it, it should be properly noted as a widely seen observation.

This is also why TechSpot will run articles after seeing many people having the same issue, like the recent AMD driver problem. At least that is one thing they can partially verify.
 
Just using the in-game benchmarks, guys: at 2K resolution in Shadow of the Tomb Raider, AC Odyssey and Ghost Recon Breakpoint, the 5700 XT falls behind the 1080 Ti.
 
"AMD has yet to come up with their own DLSS-like technology and it'd be interesting to see if that will require new hardware."
... So we're seriously just going to pretend that Radeon Image Sharpening doesn't exist... -_-

For those that don't remember, this is the same site that found that tech superior to DLSS 1.0, so suddenly pretending that it doesn't exist and that AMD can never improve it further feels like an insult to my intelligence as a reader/watcher. Especially considering it works with almost all games, when DLSS 2.0 at this point literally only works with four.
 
Unfortunately, some things are very hard to represent with data. This reminds me of when the Ryzen 1000 series launched and people were saying it ran games smoothly, but there was never any data to back the observation up.

There is definitely some weight to an observation many people are making, but I personally feel that if I do mention it, it should be properly noted as a widely seen observation.

This is also why TechSpot will run articles after seeing many people having the same issue, like the recent AMD driver problem. At least that is one thing they can partially verify.
There was plenty of data to back up Ryzen 1000's superior frame-times (but worse framerates) and 0.1% lows in many modern games compared to Intel quad-cores at the time... You must not have been searching well or something, because that exact topic was EXTENSIVELY tested by numerous people.
 
For the...four games that have the new DLSS? The first pass at DLSS was pretty much crap. Only four games support 2.0. How is it a major letdown when almost nothing actually uses it?

Pointing out that a new feature has yet to be widely adopted is not a very solid foundation for a counterargument in this field. Especially given DLSS 2.0 was designed with universality (one network for all games) in mind; adoption should occur much quicker this time around. But, I’ll humor this opinion and imagine 3 years from now DLSS is still the same disaster it was when it launched. It still doesn’t speak to driver quality.
 
You must not have been searching well or something, because that exact topic was EXTENSIVELY tested by numerous people.

This review says otherwise:


Please provide said plentiful data that might prove otherwise.

There was plenty of data to back up Ryzen 1000's superior frame-times (but worse framerates)

Wrong, as GamersNexus demonstrates:


The 7700K consistently bested the 1800X in both frame times and average FPS by a significant margin.

If I had a nickel for every time someone said they were absolutely correct and that all the data said they were right, all the while providing no links to said data, I would be a billionaire by now.
 
Pointing out that a new feature has yet to be widely adopted is not a very solid foundation for a counterargument in this field. Especially given DLSS 2.0 was designed with universality (one network for all games) in mind; adoption should occur much quicker this time around. But, I’ll humor this opinion and imagine 3 years from now DLSS is still the same disaster it was when it launched. It still doesn’t speak to driver quality.

Here's the problem: RTX has been out a year and a half and how many games had RTX capability when they were released? Less than 10 (less than 5?). A few got it 6 months after release, after most people already played them. DLSS 1.0 didn't even work as well as a decent sharpening filter but still cost a few frames. Burned twice on 2 distinguishing features. The rest of the GPU is damn good though, if a little overpriced compared to the competition.

Nvidia fixed DLSS with v2.0 and it looks good. That's fine but after the previous failures, they need to prove that it's worth investing in, and that comes with *widespread* game implementation, not tech demonstrations and a couple of games. That's simply not here yet and may never be.

Much like AMD with Ryzen, when you're asking people to part with their money, they don't want promises, they want results.
 
If I had a nickel for every time someone said they were absolutely correct and that all the data said they were right, all the while providing no links to said data, I would be a billionaire by now.

This reminds me of some guy insisting Dell UltraSharps have the best image quality by design, even after I showed that his beloved 60 Hz-only U2719D got outperformed by a cheaper 144 Hz Acer Nitro VG271U in everything except the stand on Rtings.

 
Nvidia fixed DLSS with v2.0 and it looks good. That's fine but after the previous failures, they need to prove that it's worth investing in, and that comes with *widespread* game implementation, not tech demonstrations and a couple of games. That's simply not here yet and may never be.

Much like AMD with Ryzen, when you're asking people to part with their money, they don't want promises, they want results.

It needs to work in every game, IMO. Some people seem to think this is going to revolutionize games by reducing performance requirements, but we saw the exact same song and dance from Nvidia before.
 
I thought it would be an upgrade!
But no!
As good as the 5700 XT is, the raw numbers of the 1080 Ti paint a clear picture:

5700 XT vs 1080 Ti
Shader units = 2560 vs 3584
TMUs = 160 vs 224
ROPs = 64 vs 88

Yes, the 5700 XT has a default boost clock much higher than the 1080 Ti's (1905 MHz vs 1582 MHz), but I used to have a card that used the same chip as the Ti, and it would routinely boost to 1850 MHz. Obviously not all cards are the same, but the 1080 Ti was, and still is, an extremely capable GPU.
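For a rough back-of-the-envelope check on those raw numbers, here is a quick sketch of peak FP32 throughput (shader units × 2 ops per clock × clock speed, using the figures quoted above; actual game performance obviously depends on far more than peak TFLOPS):

```
# Back-of-the-envelope peak FP32 throughput: shaders * 2 ops/clock * clock.
# Clock figures are the ones quoted in the post; real boost behaviour varies.
def tflops(shader_units, clock_mhz):
    return shader_units * 2 * clock_mhz / 1_000_000

print("5700 XT : {:.2f} TFLOPS at the 1905 MHz boost".format(tflops(2560, 1905)))
print("1080 Ti : {:.2f} TFLOPS at the stock 1582 MHz boost".format(tflops(3584, 1582)))
print("1080 Ti : {:.2f} TFLOPS at the ~1850 MHz many cards sustain".format(tflops(3584, 1850)))
```

Even with the 5700 XT's clock advantage, the 1080 Ti's wider chip stays ahead on paper, which lines up with the gap people are reporting.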
 