GeForce RTX 2070 Super vs. Radeon RX 5700 XT: 37 Game Benchmark

Seeing that on average the 5700 XT is around $185 cheaper, I'm not so sure the 2070 Super is all that appealing at 1080p and 1440p.
I myself had that conundrum. I run a machine with a 1080 Ti for gaming, working, etc. Some of my renderings take 48 hours to produce (Lumion videos), so I started to plan a "spare" PC for renderings only, and went with the 5700 XT (to give it a chance, even though I considered buying a used 1080 Ti), paired with the Ryzen 9 3900X. It outperformed the Ti machine by almost 30% for work and 15% for gaming. What a pleasant surprise :)
 
I'd buy the 2070S simply because I don't trust AMD hardware. I've owned three AMD GPUs. The first one, a Radeon 9800 Pro back in the day, doesn't even count because that was ATI back then. The other two? Went up in smoke inside my rig.

The moral of the story is you get what you pay for. The 5700 XT at $400 sounds tasty, but then you've got AMD saying it's okay if the thing runs at 110°C. Yeeeaaaa no. Gtfo AMD.
I am using a 5700 XT paired with a 3900X for demanding renderings (supposed to be a sort of "spare" machine while the 1080 Ti one was rendering). It outperforms the 1080 Ti by about 30% all around, at less than 80°C average. A pleasant surprise :)
 
Lol, yeah sure. With AMD's Q2 2019 net income of $35M vs Nvidia's net income of $552M, I don't know who is pricing whom out of the market, muahhahah.
https://www.anandtech.com/show/14745/nvidia-q2-fy-2020-earnings-report-continuing-crypto-disruption
https://www.anandtech.com/show/14691/amd-quarterly-earnings-report-q2-fy-2019
Nvidia's gross margin is 60% vs AMD's 40%.
After 2.5 years, the 1080 Ti still remains unchallenged by AMD; that's the longest such run in the history of Nvidia vs ATI. Yeah, I guess Nvidia is in a tough spot here.

I don't know if you know this, but 7nm costs almost twice as much as 14nm per mm².
[Attached image: amd-iedm-2017-23.png, an AMD IEDM 2017 slide on cost per yielded mm² by process node]

https://www.extremetech.com/computi...both-solution-and-symptom-to-a-larger-problem

Even Intel is re-evaluating its cost-effectiveness strategy when moving to the 10nm node. AMD has nowhere to go but TSMC 7nm to compensate for its deficit in efficiency, but that doesn't mean AMD can produce chips any cheaper than Nvidia or Intel, lol (Intel also maintains a gross margin of 60%).

And this is how close the 5700 XT and 2060 Super are when they are both overclocked. Techspot just doesn't have the guts to do it (probably afraid of triggering AMD fans).

What a messy argument.
Again, you are just talking and using charts (out of context), in a way any business student would laugh at. Your points are not based on reality, but on a string of ideas you put in a line and said: see?

Secondly, why post a chart and not know what the person on stage said about how her company is planning on tackling the cost per yielded mm²? Why not go watch the CEO's (Dr. Su's) speech and find out why 252mm² vs 545mm² is always a win for the smaller chip, even if the upfront costs are higher, when it has nearly the same performance?
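
Since the thread keeps circling this point, here is a back-of-the-envelope sketch of the "cost per yielded mm²" argument, using the standard Poisson yield model. The wafer prices and defect densities below are made-up illustrative assumptions (real foundry pricing is under NDA); only the die sizes come from the posts above.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross die count with a simple edge-loss correction."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Y = exp(-A * D0): bigger dies are more likely to catch a defect."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_mm2):
    good = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2, defects_per_mm2)
    return wafer_cost_usd / good

# Hypothetical: 7nm wafer at twice the price of the older node,
# with a slightly higher defect density for the newer process.
small_7nm = cost_per_good_die(252, wafer_cost_usd=10_000, defects_per_mm2=0.0015)
big_12nm  = cost_per_good_die(545, wafer_cost_usd=5_000,  defects_per_mm2=0.0010)
print(f"252 mm^2 @ 7nm:  ${small_7nm:.0f} per good die")
print(f"545 mm^2 @ 12nm: ${big_12nm:.0f} per good die")
```

Even with the 7nm wafer costing twice as much in this toy model, the smaller die comes out ahead: you get more candidate dies per wafer, and each one is less likely to catch a defect.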

BTW, when TSMC & AMD, or Samsung & Nvidia, sign a deal, it comes down to a price per wafer. The chips & chip yields are a completely different negotiation per tape-out, etc. Moving your company to a completely new node can be super expensive & cost-prohibitive (see Nvidia rejecting the high cost of TSMC's process and going with Samsung), but that is all factored into how long they plan on taking to recoup that massive upfront cost (ROI).

You have to spread that upfront cost out over as many chips as possible, and AMD is one of the only companies that can do this, because they make both CPUs and GPUs on 7nm. So Dr. Su is going to use the economies of scale that the 7nm process gives her company to undercut her competitors on price, just like she is doing in other markets; the only difference here is GPUs for games. AMD is targeting the gaming market specifically, unlike the big Enterprise/AI/Mining/Compute business of their Pro cards & Vega.

Because of this, Dr. Su is about to disrupt the pricing in the gaming GPU market. And this will suffocate Nvidia, because they have nothing new to respond with for well over a year.

A specific chip just for gaming: smaller, more compact, and using the new RDNA architecture. Whether Nvidia can compete with RDNA is the billion-dollar question, and one we will all find out the answer to in about a year's time, when Nvidia comes out with Ampere at 7nm. How big will it be?
 
If $10 of electricity over the course of a year means that much to you, the 2070S is going to be WAY too expensive for you to purchase.
It's more than that. With some tuning, the 2070S does the job at 145-160 watts, which is more like $100 a year in savings when used like I do (24/7).
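
For reference, a quick sanity check of that figure. The electricity rate is an assumption (rates vary a lot by region), and the ~70 W delta is taken from the wattages quoted in this thread (roughly 220 W stock AMD vs a tuned 2070S around 150 W):

```python
def annual_cost_usd(watts, hours_per_day=24, rate_per_kwh=0.16):
    """Electricity cost of a constant load over a year."""
    return watts / 1000 * hours_per_day * 365 * rate_per_kwh

delta_watts = 220 - 150   # assumed draw difference under load
print(f"${annual_cost_usd(delta_watts):.0f} per year")  # ~$98 at 24/7
```

So the ~$100/year figure only holds if the card really is loaded around the clock.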
 
REVIEW MISLEADING

Ok, it's a nice review, but it needs finishing off... they are both the best custom cards of their type, but for that extra money you can't stop at saying only 9% more performance. The Trio card is by far quieter than the Red Devil; it also draws less power from the wall and runs at a lower temp. If you like RGB and looks, it's a lot nicer too, plus the aftercare from MSI is far better than the poor service from PowerColor... and that's without the most important of all: RAY TRACING and G-SYNC with Nvidia's support. If you can afford it, like myself, the extra cash is well worth it.
 
A 2060 or up only makes sense if you:
  1. Play games with ray tracing effects on a 1080p or higher monitor.
  2. Are using this card 24/7 for number crunching, folding, or mining: all activities where the cost of electricity exceeds the purchase price of the card.
If you're just a casual gamer, get AMD.
They're not as energy efficient, but for someone who plays 8 hours a day for less than 3 years, the $200 cheaper purchase price will never be bought back in electricity costs (rough numbers sketched below).
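
A rough sketch of that payback math, assuming a ~60 W draw difference under load and a $0.12/kWh electricity rate (both assumptions; plug in your own local numbers):

```python
def extra_electricity_usd(extra_watts, hours_per_day, years, rate_per_kwh=0.12):
    """Added electricity cost of the hungrier card over its service life."""
    return extra_watts / 1000 * hours_per_day * 365 * years * rate_per_kwh

cost = extra_electricity_usd(extra_watts=60, hours_per_day=8, years=3)
print(f"${cost:.0f} over 3 years")  # ~$63, nowhere near a $200 price gap
```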
 
Not sure I understand your grammar here, as your use of question marks is 'questionable', but I will bite too. What the h*ll are YOU talking about? Do you seriously not understand my point? How much does it cost to game at 4K on PC? Now how much on console? Do you still need more explanation?
Yes, kinda

The Xbox One X is roughly equivalent to this PC: https://pcpartpicker.com/list/vHm6hg

The RX 580 can play games at 4K 30 fps at low (and sometimes medium) settings.

The PS4 Pro GPU is probably equivalent to an RX 570.
 
I game exclusively on PC, but I don't understand why people who play games like Forza 4, Battlefield, Fortnite and COD don't just play on console, save themselves wads of cash, and play on a large 4K TV with surround sound. Just about every game tested here is as good or better to play on console (not just graphics: ease of gameplay, compatibility, no cheaters, etc.).

While appreciating a rare sighting of an open-minded PC gamer, I'm even more baffled by the irony of the comment.
The games you mention are actually prime examples of games needing the precision of a keyboard and mouse, or a steering wheel, to be played at their best.
 
Must say I don't think Nvidia pulling ahead in Forza is as big a deal as the writer makes out. How many people are buying a GPU based on how well it does in Forza? Yeah, I understand it's a title that traditionally favours AMD, but it's not exactly a big hit game that people base their hardware buying decisions on. I can't see many people who were about to buy a 5700 XT for Forza spilling their coffee because of this swing.

Personally, I'd buy the 2070S over the 5700 XT because it's faster, cooler, quieter, has CUDA, supports G-SYNC & RTX, and I would rather be at the mercy of Nvidia driver support than AMD's. But if I were budget restricted (which I rarely am, as I am an enthusiast!) then I'd probably pick the 5700 XT.
 
I paid $399 for PowerColor's 5700 XT AXRX card (dual fan). It's cool, quiet and fits in my mITX case comfortably. 1440p performance is flawless, no complaints.

I considered an RTX 2060 Super, but I really didn't want to pay more for less when the 5700 XT suits my needs perfectly. If I really wanted ray tracing, then I don't think I would have been content with the 2060 Super anyway.
 
Would be interesting to see whether the 5700 XT's 4K results would be higher if it had more memory bandwidth.
At 4K the card is far more shader- and ROP-limited than bandwidth-limited, so for a 5700 XT extra bandwidth probably wouldn't help all that much. At 1440p, though, it would certainly be of benefit.

You can get a sense of this by looking at 1440p vs 4K results for something like a Radeon VII.


[Attached: two ACO.png benchmark charts, one from each of the two articles referenced below]


Although those two articles are different test runs, the 1440p results are close enough to be comparable. Look at the Radeon VII against the Vega 64, and you can see that the former is around 20% faster at 1440p. Some of this is down to the VII's faster clock speed (so the ROP output rate is higher), but the majority of it is down to the VII having more than twice the memory bandwidth of the Vega 64.

Then look at the 4K results and you can see that the performance tanks badly, despite the VII having 1 TB/s of bandwidth. The reason is that the increase in pixels to process, going from 1440p to 4K, means the GPU spends far more core cycles working through them, rather than writing finished pixels out to memory.
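
The arithmetic behind that is simple enough to sketch. The bandwidth figures are public specs (Radeon VII ~1024 GB/s, Vega 64 ~484 GB/s):

```python
pixels_1440p = 2560 * 1440   # ~3.7 million pixels per frame
pixels_4k    = 3840 * 2160   # ~8.3 million pixels per frame
print(f"Pixel work: {pixels_4k / pixels_1440p:.2f}x")   # 2.25x

bw_vii, bw_vega64 = 1024, 484   # GB/s
print(f"Bandwidth:  {bw_vii / bw_vega64:.2f}x")         # ~2.12x

# 2.25x more pixels to shade per frame means the shader cores and ROPs
# saturate before the wider memory bus does, so the VII's bandwidth
# advantage buys less at 4K than at 1440p.
```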
 
It's more than that. With some tuning, the 2070S does the job at 145-160 watts, which is more like $100 a year in savings when used like I do (24/7).
You are telling a normal user that if he uses his gaming PC 24/7 he will pay $100 less compared to the AMD system? Dude, that's just... I'm sorry, but I'm going to be blunt: it's stupid.
The $10 per year is for a few hours of gaming per day. If your PC is gaming 24/7, then $100 is just spare change compared to your yearly electricity costs.

And even if we believe that you care about the power usage and that you have your PC on 24/7, I don't think the total cost difference is that high anyway. It would suggest that you have a game open all the time and don't sleep more than 2-4 hours a day.
 
People were asking if this MSI 2080 "Super" was really just a subpar GPU, while the RX 5700 XT was possibly overclocked.

After searching Google, it turns out the MSI card IS overclocked (hence the "X"). It also says so in the product description:
"G-SYNC Turing Architecture Overclocked Graphics Card (RTX 2060 Super Gaming X)."
There is NO WAY the RX 5700 XT was OC'd. They tested the original single-fan blower style that's been out. Newer ones are just now coming out.
I'm a very happy owner of an RX 5700 XT 50th Anniversary Edition. This is also not an OC; it is a separate card, advertised as 10% faster. In reality, it's at least 20% faster than the regular XT.
Running comparisons on pcbemchmark.com, my GPU is on par with a GTX 1080 and RTX 2080. The 2080 "Super" takes the win by the smallest fraction of frame rates. I paired it with a Ryzen 3900X (12-core/24-thread) CPU, all cores steady at 4.1GHz, on the X570 platform, so it also utilizes every PCIe lane at the same time for an insane performance increase. My CPU/GPU/mobo/RAM/M.2 SSD are ALL ranked 99th percentile. Maybe they're just built to work together. Oh wait! They ARE! And my setup costs about $3,000 less than anything Intel & Nvidia could hope to throw at it. It destroys Intel's $2,000 "server" class processor, smashes an RTX 2080 "SuperDupertyDoo", and the M.2 is so fast! Ludicrous Speed GOOO!
 
*Reasonably* tight budget, as in, if they had a grand to spend overall: $200 for the CPU, $400 for the GPU, the rest for PSU, mobo, case, SSD, etc...

For the money, it would perform admirably, especially at 1080p.

We have very different definitions of "reasonable", "tight" and "budget". Someone on a real budget is getting "good enough" for 1080p or lower. You can do that for $400-$450 total. Even an RX 470 is good to go for 1080p high, and you can get those on eBay for $65. And if you've got anything Intel with 4 cores and 8GB of RAM, going back as far as an i7 920, then the total is $65.
 
You are telling a normal user that if he uses his gaming PC 24/7 he will pay $100 less compared to the AMD system? Dude, that's just... I'm sorry, but I'm going to be blunt: it's stupid.
...blablabla... I'm an idiot... blablabla...
...more goat excreta here...

A 2060 or up only makes sense if you:
  1. Play games with ray tracing effects on a 1080p or higher monitor.
  2. Are using this card 24/7 for number crunching, folding, or mining: all activities where the cost of electricity exceeds the purchase price of the card.
If you're just a casual gamer, get AMD.
They're not as energy efficient, but for someone who plays 8 hours a day for less than 3 years, the $200 cheaper purchase price will never be bought back in electricity costs.

Like I said, in number crunching / folding / mining tasks, where you use the card 24/7, you can get nearly the same performance at a mere 125 watts.
So yes, I was wrong: turns out the 2070S has the same performance at 125 watts, not just 136 or 145 watts, compared to its stock wattage (215W).
That's a significant saving compared to the 220 watts (or more) on AMD.

But like I said before, if you'd only read what I wrote, you'd know that it's a 24/7, full-load situation.
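
For what it's worth, with these revised wattages the earlier ~$100/year figure does check out arithmetically for the 24/7 case, at an assumed $0.12/kWh rate:

```python
# 220 W stock AMD vs 125 W tuned 2070S, running 24/7
delta_kw = (220 - 125) / 1000
print(f"${delta_kw * 24 * 365 * 0.12:.0f} per year")  # ~$100
```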
 
You can throw barbs at my old girl all you want. :D
I've gotten 10 years and counting of top-tier gaming performance out of it, and it still rocks the latest games at high settings, very high settings, or sometimes maxed at 1440p/144Hz. I am sure there is a bottleneck here or there, but I don't feel the slightest hiccup or slowdown paired with my G-Sync 1ms HP Omen; it's silky smooth gameplay, and my Pascal is doing most of the work at 2560x1440 anyway. DX12 games will really tax the CPU, though.
When I upgrade, it's going to be like time traveling. :)

This is a great revisit of it:

Newer chips are a little to a lot faster, but I don't care; it still does what I need it to.
I bought my i7 930 around the time Sandy Bridge released... it's been THAT long.
Talk about bang for the buck.

Nehalem/Bloomfield are the most impressive CPUs to date in my book. I've had an i7 960 at 4.3GHz since 2009, with a 1070 Ti that still plays games in 2019 at high to ultra settings very well.
 
Being the better-value product simply means you are slower. We are talking about performance/high-end cards here; better value is meaningless. If someone wants better value, he goes for the 570, which I'm pretty sure wipes the floor with both of them in terms of value. At the high end, 1% more performance comes with a more-than-1% increase in price.

Plus we have to take into account RT, which looks amazing in Control, btw.

It depends on how much better the value is, and on whether the performance meets some needed threshold. Even if it's equal value, it might be worth choosing the cheaper product.
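
As a sketch of that threshold question, here is the perf-per-dollar arithmetic using the ~9% performance gap quoted earlier in the thread and the two cards' launch MSRPs ($399 / $499); tweak the inputs for street prices:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Higher is better value; relative_perf is normalized to the 5700 XT."""
    return relative_perf / price_usd

xt     = perf_per_dollar(1.00, 399)
super_ = perf_per_dollar(1.09, 499)
print(f"5700 XT gives {xt / super_ - 1:.0%} more performance per dollar")  # ~15%
```

Whether ~15% better value outweighs RTX, CUDA and the rest is exactly the buyer-dependent threshold being described.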
 
Got my MSI 5700 XT Gaming X for $500 CAD, which equals about $375 USD. No way the Nvidia card can be justified.
 
I've never been a fan of the 'Super' cards.
The 5700 XT is possibly the better buy here, but that's not saying much; the RTX 2070 Super is overpriced. Newegg has regular RTX 2070s right now for $450 ($469, or $450 after rebate):
https://www.newegg.com/msi-geforce-...Vhp6fCh2hbwmDEAYYASABEgJNN_D_BwE&gclsrc=aw.ds

Very interesting how the Radeon falls further and further behind as the resolution increases, even though it's only 5-10%. Looking at just raw performance, the 5700 XT is the better buy, but unfortunately you also have to factor in AMD's inferior software, features and stability.

That's probably due to cost-cutting with GDDR6 instead of HBM2. AMD was traditionally stronger as the resolution increased, thanks to higher memory bandwidth.
 
You know what's most baffling? Reviews and game benchmark comparisons never mention which is the best GPU for the future. I believe that this is, by far, the most important aspect of any GPU. It's not about what it's doing now, but where the trend projects a GPU to be in the future. AMD, and we all know this, has the upper hand in this respect. Again, read my previous reply... AMD GPUs are in the current PS4 Pro/Xbox One X, but even more important, and worthy of mention in ANY DANG REVIEW, is the fact that the upcoming next generation of consoles (PS5/Xbox Next) are sporting AMD Navi GPUs under the hood. 99% of the next generation of game engines currently in development are being built from the ground up to take full advantage of the specs inside these two upcoming consoles. And we all know what this means: once again, AMD's PC GPUs are gonna flourish for the foreseeable future.

It's always a bit bizarre to see and read reviews of these GPUs going head-to-head with each other, yet not a peep is mentioned about which GPU is gonna fare better in the long run. I mean, we already know the answer, those of us in this hobby we all love and cherish, but the "average Joe", who is by far the biggest segment of GPU purchasers, hasn't got a clue as to which will give the best longevity. Yet this most important aspect is bizarrely always pushed aside, as if it's taboo to mention.
My being a lifelong AMD fan aside, I haven't read any comment so far that is as useful as this to someone looking at buying their first rig or upgrading their current one. What you've said is not fanboy hype but cold, clear facts that are bang on the money!! Most people probably don't follow the ebb & flow of gaming architecture or console design & wouldn't be aware, but as you stated, it simply can't be ignored that AMD are everywhere now. For a very long time Nvidia/Intel held the crown & we all know they made us pay for it hahaha, but with AMD's perseverance & blazing releases of both Ryzen CPUs & Vega/Navi GPUs, inventing & utilizing HBM/HBM2, as well as being the only ones to harness 7nm, it makes perfect sense that more & more tech/gaming companies are getting on board & backing AMD's very obvious rise to full power.

I myself have always marveled that very few reviews (if any) ever relate to the future of hardware through probable software updates. I understand it's gotta be about what's best today, but advice &/or insight should be given as to what will be best tomorrow, or in 6 months, or which hardware the next wave of gaming architecture supports, or what sort of hardware our next gen of consoles are packing. Again, as you clearly stated, AMD's older GPUs have bridged too many gaps in tier levels through software updates alone to be ignored now, & this is something worth stating, even as a side note, in any & all AMD GPU reviews. Don't get me wrong, Nvidia make blazing cards & deliver exceptional performance before & after updates of their own, but it just can't be ignored that in the last few years more and more developers of games/consoles are turning to Team Red & getting on the AMD bandwagon, & this should be remembered when picking your hardware.

I'll finish by stating that I think your 2 replies were very refreshing to read & should be flagged as a must-read, or at least placed higher within the comments, because casual gamers or build enthusiasts simply need more factual, dead-on info, not just about the hardware available today but about how it will perform tomorrow, with reference to the trends of the past. Bottom line: AMD are back, here to stay, & definitely haven't finished changing the game. If anything, it's only just begun. All the best :)
 
Not sure I understand your grammar here, as your use of question marks is 'questionable', but I will bite too. What the h*ll are YOU talking about? Do you seriously not understand my point? How much does it cost to game at 4K on PC? Now how much on console? Do you still need more explanation?
Well, one thing to take note of is that on console the discounts are nothing like a Steam sale. Also, you don't get access to many indie games, extremely long load times abound on console, and at least in the case of the Xbox you have to pay at least 5 dollars per month just to play online. Trust me, it adds up, and that's why I will be making the change to PC, at least for some games, as my PC sucks.
 