RTX 2080 Ti Super is apparently exclusive to the (not always working) GeForce Now RTX

It won't happen. Even if, somehow, AMD made a 2080 Ti-class Navi card, it won't come out until 2020, and people forget Nvidia is STILL on 14nm. All the benefits AMD is getting from 7nm can be applied to the RTX 3000 series too. If Navi were to compete, we'd just get the 3000 series and AMD would take bargain placement for the fourth generation in a row.

I've pointed this out in other comments multiple times now, but AMD's die size (5700 XT) is less than half that of Nvidia's mainstream offerings. If Nvidia is able to do 7nm with its massive chips, AMD is likely able to do 7nm+ with its own. It doesn't make sense historically either; Nvidia has never been one to rush to a new node. That goes double right now, as Nvidia's die sizes have only gotten bigger. Nvidia has had enough trouble making their current cards on their current node even this late in the cycle, never mind 7nm. Go look at the stock of Super cards. It's the only refresh I know of with such poor stock.
And what happens if Nvidia jumps straight to 7nm+? Is AMD gonna pull 5nm out of its arse?

AMD is still reliant on its node advantage for GPUs. This is a dangerous game to play, considering Nvidia has lots of money to throw at better processes from TSMC. This also assumes second-gen RT cores will be as large as first-gen cores.

For some reason everyone assumes that, because Zen exists, AMD has this huge lead in IPC everywhere. News flash: AMD's GPU division is still lagging behind Nvidia.
 

No, that is not true.
It seems a good many people don't know much about RDNA and don't understand that going to 7nm was only part of what took place. (Just so you know, a 2080S shrunk down to 7nm would still be a very big chip, much bigger than the MI60's Vega 20 chip at 7nm.)

Navi 10 is punching above its weight not just because it is on 7nm, but because it is using a different architecture. Both give a boost in gaming. But if you look at transistor count, RDNA is more efficient at games and will become even more so once developers learn to take advantage of its new uArch.

Again, even shrunk down to 7nm, Turing still can't compete on performance per die area. AMD will always have the price advantage here; if a shrunk-down Turing is all Nvidia is going to do with 7nm, they are doomed.

Also, TSMC is already sampling 5nm and AMD is their first customer on that node. I am quite sure we will see a 5nm GPU in a year and a half, about six months after Nvidia releases Ampere on 7nm.
 
Why are you assuming that Nvidia is just going to stick with Turing? Do you think Nvidia is as incompetent as Intel? Also, how do you know the 2080 Super is a huge chip on 7nm if such a chip has not been made?

You are making a LOT of assumptions here.

This is what we do know:
https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/15
https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html

The 5700 XT, at best, matches 2070 Super performance while drawing more power. Logically, that means a larger chip at 2080 Super or 2080 Ti level could start running into issues powering the chip under full boost, or keeping the GPU cool. Even third-party 5700 XT coolers run rather warm. And this is with both the jump to RDNA and 7nm behind them. Seems to me that even if 7nm were only responsible for 15% of RDNA's efficiency gain, that same bonus could still be applied to Nvidia and automatically put them back in the performance lead. Let's also not forget that Nvidia is likely not going to just shrink Turing. They have likely been working, if logic is still a thing, on Turing's replacement, on 2nd-gen RT cores, and on other enhancements. Even if Nvidia only gets a 10% performance boost out of Turing's replacement, that would still allow them to stay comfortably ahead of AMD on performance, and based on the 2070 Super's power draw, they would be more efficient while doing so.
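To make that arithmetic explicit, here is a minimal sketch. Every number in it is a hypothetical placeholder: the 15% node share comes from the sentence above, and the perf/W baselines are made up purely for illustration.

```python
# Sketch of the "node bonus applies to both vendors" argument above.
# All values are hypothetical placeholders, not measurements.
node_bonus = 0.15              # assume 7nm accounts for ~15% of RDNA's efficiency gain
rdna_perf_per_watt = 1.00      # normalized: RX 5700 XT on 7nm (placeholder)
turing_perf_per_watt = 1.10    # placeholder: 2070 Super already slightly ahead on 12nm

# If Nvidia gets the same node bump, its perf/W scales by the same factor.
turing_on_7nm = turing_perf_per_watt * (1 + node_bonus)
print(f"RDNA on 7nm:               {rdna_perf_per_watt:.2f}")
print(f"Turing with the same bump: {turing_on_7nm:.2f}")  # comfortably ahead again
```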

RDNA is not the equivalent of Turing yet. And don't forget, RDNA 2 will have hardware RT on board, so say goodbye to tiny dies. Compare the size/performance of the 5700 XT to the 1660 Ti and AMD's advantage rapidly diminishes. If AMD wants hardware RT, they are going to have to pay for it in die space, and then they don't have the size advantage vs Nvidia.

TL;DR: Navi merely competes with Turing now; a die-shrunk Turing would still pull ahead, and that's assuming Nvidia has done zero development on a successor or a more efficient Turing V2 design, à la Pascal.
 

lol, Nvidia is not going to get its massive dies working on an immature node. They are having problems producing their high-end chips now, let alone on 7nm and especially 7nm+.


It's more than TWICE the size for the same performance level. Nvidia is NOT going to get a 2X reduction from 7nm, so Nvidia will continue to have big dies. Even in the best-case scenario, assuming perfect scaling, you are looking at about a 60% area reduction. You won't get perfect scaling though, as things like cache, of which there are multiple levels on a GPU, scale very poorly; a 38% reduction is a pretty safe estimate. By the way, all of this assumes Nvidia keeps everything the same: no additional CUDA cores or RTX parts.
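To put rough numbers on that, here is a back-of-the-envelope sketch. It assumes the commonly cited die areas of roughly 545 mm² for TU104 (2070 Super/2080) and roughly 251 mm² for Navi 10, and treats the 60% and 38% figures above as best-case and realistic area reductions.

```python
# Back-of-the-envelope die-shrink estimate. Die areas are approximate
# public figures; the reduction percentages are the estimates from the post.
TU104_MM2 = 545.0    # RTX 2070 Super / 2080 die on TSMC 12nm (approx.)
NAVI10_MM2 = 251.0   # RX 5700 XT die on TSMC 7nm (approx.)

print(f"Today: TU104 is {TU104_MM2 / NAVI10_MM2:.1f}x the size of Navi 10")

for label, reduction in [("perfect scaling (best case)", 0.60),
                         ("realistic estimate", 0.38)]:
    shrunk = TU104_MM2 * (1 - reduction)
    print(f"TU104 at 7nm, {label}: ~{shrunk:.0f} mm^2 "
          f"({shrunk / NAVI10_MM2:.2f}x Navi 10)")
```

Even the conservative case leaves a hypothetical 7nm TU104 around a third larger than Navi 10, before adding anything new to the design.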

Well, out of the three averages for 1080p, 1440p and 4K you picked 1080p, where the 2080 Ti is only "32% faster than the 5700 XT" lol, and it is an AIB 5700 XT no less. You made it the point of your rant that Navi can easily be made 30% faster.

Show me where TechSpot uses 1080p for GPU reviews, hm?
https://www.techspot.com/review/1896-msi-radeon-5700-xt-evoke/
https://www.techspot.com/review/1870-amd-radeon-rx-5700
https://www.techspot.com/review/1791-amd-radeon-vii-mega-benchmark/
https://www.techspot.com/review/1865-geforce-rtx-super/
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/

New 5700XT vs 2070 Super benchmark

In none of them is there any 1080p benchmark. So who is being ignorant? You just don't compare high-end GPUs by their 1080p performance, you just don't.

You do realize the TechPowerUp link I provided had 1080p benches, right?

I should also point out that they are not alone; GamersNexus does 1080p as well: https://www.gamersnexus.net/hwreviews/3498-sapphire-rx-5700-xt-pulse-review

And of course, TechSpot:
https://www.techspot.com/review/1777-geforce-1050ti-vs-radeon-570/
https://www.techspot.com/review/1811-nvidia-geforce-gtx-1060/
https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/

But I guess you know more than they do. You can't come up with a legitimate reason why 1080p benchmarks are bad.
 

Hah, the fact that TechSpot uses 1440p and 4K testing for the 2060 Super, 2070 Super, 5700, 5700 XT and any faster GPU, 1080p and 1440p for the 1660 Ti, and 1080p only for the 1660 and slower GPUs already explains why. Well, if all you have ever had are AMD GPUs, I guess 1080p is a perfectly fine resolution for you, not that I'm looking down on you or anything.

Btw, I'm not saying testing at 1080p is bad, just that you cannot compare high-end GPUs by their 1080p performance; it's like comparing F1 cars on a go-kart track.

Back on topic: Control is a perfect game for RTX, time for the 2080 Ti to shine some godly rays.
 

The RX 5700 XT, RTX 2060 Super, and RTX 2070 Super are not high end, so your logic fails you there. Nvidia's xx70 is mid-range and xx60 is upper budget. If they were high end they would be an xx80, xx80 Ti, or Titan, none of which they are. Just because Nvidia is selling you mid-range cards at high-end prices doesn't change the naming scheme and their position in Nvidia's lineup. The 970 wasn't considered high end, and it provided a much bigger boost than the 2070 Super ever will. The 960 was squarely budget. Ditto for the 770, 760, 670, 660, etc.

You have yet to give a specific reason why you can't compare a mid-range graphics card to a high-end graphics card at 1080p. FYI, your analogy is bad. At 1080p a video card can be pushed to its max; if it couldn't, then obviously GamersNexus and TechPowerUp would not publish reviews where the CPU was bottlenecking the card. You can't push F1 cars to their max on a go-kart track, and thus your analogy is worthless. Once again, you think you know more than professional reviewers when you can't even make a half-competent analogy.
 

It's funny that you have 4,000+ posts on this website yet are very unfamiliar with this very website's content; are you sure you are not trolling?

https://www.techspot.com/bestof/graphics-cards/

Best Mainstream GPU ($200 or less): Radeon RX 570 vs. Radeon RX 580 vs. GeForce GTX 1650
Best Mid-Range GPU ($300 or less): GeForce GTX 1660 / Ti vs. Radeon Vega 56 vs. GeForce GTX 1060
Best High-End 1440p GPU ($400+):
Radeon 5700 XT vs. Radeon 5700 vs. RTX 2070 Super vs. RTX 2060 Super vs. RTX 2060
Best High-End 4K Gaming GPU (Over $600):
GeForce RTX 2080 Ti vs. GeForce RTX 2080 vs. GeForce RTX 2080 Super

Perhaps you could email Steve and ask why he doesn't benchmark high-end GPUs at 1080p, hm?
Oh well, it's a pointless discussion since you are trolling anyway. I'm just gonna enjoy Control, and I have to say that having RTX on makes this game so lifelike.
 

Oh really? TechSpot's own description of the RTX 2060 says otherwise:

"The RTX 2060 is the best value mid-range offering"

https://www.techspot.com/products/graphics-cards/nvidia-geforce-rtx-2060-6gb-gddr6-pcie.188332/

Mind you, TechSpot is going off current performance with those numbers, while I was referring to the product model name and its placement in the stack. Arguing based on the model number or on performance are both valid; either is an acceptable answer, and that's why you see the labeling change depending on the site you visit. You seem more focused on diverting from the fact that you can't respond on point than on actually noticing things like this. Just as an example, PCMag has both the 5700 and the 2060 Super labeled as mid-range:

https://www.pcmag.com/compare/370011/nvidia-geforce-rtx-2060-super-vs-amd-radeon-rx-5700-which

Of course, I provided multiple examples earlier of other websites that label those cards as mid-range and that test at 1080p. You are grasping at your last straw as hard as you can.

By the way, you've still yet to provide a single reason not to benchmark at 1080p. This is the 5th time I've asked; if your next post can't come up with a specific reason (instead of a wildly incorrect analogy with no supporting argument), I'll assume you in fact cannot.
 

Not only do you lack critical thinking but also reading skills, pal. Where did I mention that the 2060 is a high-end GPU? Or perhaps it was the 2060 Super, 2070, 2070 Super, 5700, 5700 XT and faster GPUs, as TechSpot pointed out? You were pretty desperate to quote other tech websites when TechSpot doesn't agree with you; maybe you should just go trolling other tech websites (or you already did...).

About 1080p testing: not only can it be CPU-bottlenecked when testing a high-end GPU (common knowledge), it is also irrelevant information, as anyone looking to purchase a high-end GPU is going to play at 1440p and above. You don't even have a 1080p screen yourself; you just wanna make the 2080 Ti look worse by comparing it to the likes of the 5700 XT at 1080p, while its true performance shows at 1440p and above. If you are saying the 5700 XT is a 1080p card and should be tested at 1080p, everyone is gonna get a good laugh. Next time just pick 1440p like a sane person would for comparison.
 

:joy:

Here, I will quote your entire comment and bold your own words so you can find them:

--xx--

"It's funny that you have 4000+ posts on this website yet you are very unfamiliar with this very website's content, are you sure you are not trolling around ?

https://www.techspot.com/bestof/graphics-cards/

Best Mainstream GPU ($200 or less): Radeon RX 570 vs. Radeon RX 580 vs. GeForce GTX 1650
Best Mid-Range GPU ($300 or less): GeForce GTX 1660 / Ti vs. Radeon Vega 56 vs. GeForce GTX 1060
Best High-End 1440p GPU ($400+):
Radeon 5700 XT vs. Radeon 5700 vs. RTX 2070 Super vs. RTX 2060 Super vs. RTX 2060
Best High-End 4K Gaming GPU (Over $600):
GeForce RTX 2080 Ti vs. GeForce RTX 2080 vs. GeForce RTX 2080 Super

Perhaps you could email and ask Steve why he doesn't benchmark High-End GPU at 1080p hm ?
Oh well it's a pointless discussion since you are trolling anyways, just gonna enjoy Control and I have to say that having RTX On make this game so life like."

--xx--

You know someone is losing an argument when they have to make personal attacks multiple times in the same comment. You spend more time typing personal comments than you do on topic :joy:.


Yeah, because GamersNexus and TechPowerUp totally didn't think about bottlenecking :joy:. Oh wait, why don't you go message them that their results are invalid? :joy: Did you even look at the GamersNexus and TechPowerUp reviews? You claim bottlenecking is occurring, but you can't find a single shred of evidence. Look at those reviews: how many instances of plateauing do you see? I see ample charts with zero evidence of bottlenecking.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/9.html

What about GamersNexus?

https://www.gamersnexus.net/hwreviews/3498-sapphire-rx-5700-xt-pulse-review

Nope, the top-end cards are clearly not being capped at a specific point by a bottleneck.

There are certainly some games that can bottleneck at 1080p, but they are far rarer than games that don't. To imply that every 1080p review is invalid and wrong, though? That's the same logic as saying a 9900K's results at 1080p are worthless. FYI, plenty of esports gamers and regular PC gamers play at 1080p with an RTX 2060 Super or higher. It's not beyond imagination that people with high-refresh-rate monitors want maximum frame rates, which requires both CPU and GPU grunt. Your assumption shows how insulated you are inside your own world.
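For what it's worth, the plateau check being argued about here can be made concrete: if a faster card's lead at 1080p collapses relative to its lead at 1440p, that game is probably capped by the CPU or the engine. A rough sketch, with made-up FPS values and threshold:

```python
# Rough heuristic for spotting a CPU/engine cap in review charts: a faster
# card's 1080p lead shrinking well below its 1440p lead suggests a cap.
# All FPS values and the threshold are hypothetical.
def lead(fast_fps, slow_fps):
    """How much faster the first card is, as a fraction."""
    return fast_fps / slow_fps - 1.0

def looks_capped(lead_1080, lead_1440, threshold=0.10):
    return lead_1440 - lead_1080 > threshold

games = [
    # (title, fast 1080p, slow 1080p, fast 1440p, slow 1440p)
    ("Capped title",   155, 145, 160, 110),   # tiny 1080p lead, big 1440p lead
    ("Uncapped title", 190, 140, 130,  95),   # similar lead at both resolutions
]
for title, f1080, s1080, f1440, s1440 in games:
    capped = looks_capped(lead(f1080, s1080), lead(f1440, s1440))
    print(f"{title}: 1080p lead {lead(f1080, s1080):.0%}, "
          f"1440p lead {lead(f1440, s1440):.0%} -> capped? {capped}")
```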
 

Care to explain how the 2080 Super is 19% faster at 1080p but 24% faster at 1440p, hm? The relative performance remains within margin of error for GPUs slower than the 5700 XT, but the faster ones pull further ahead as the resolution increases. 1080p to 1440p: Radeon VII 0% --> 6%, 2080 13% --> 17%, 2080 Ti 32% --> 43%. So the 5700 XT is about the limit before the CPU bottleneck kicks in.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/28.html

Or are you gonna say the 5700 XT is not designed for 1440p gaming? If so, then Navi is definitely a fail; ain't nobody paying $450 for a 1080p GPU.
 

WTF are you talking about? At 1080p, cards faster than the RX 5700 XT still saw significant gains. If the RX 5700 XT were the max before bottlenecking kicked in, the RTX 2080 Ti wouldn't be a full 32% ahead; it'd be less than 5%. Do you not understand what bottlenecking is?



Have you considered the possibility that it is memory bandwidth related? It's entirely possible the 5700 XT needs additional memory bandwidth to do better at higher resolutions. Of course, this is just a guess; it could be any number of things. That was part of my point, though. 1080p results are useful because they show raw GPU horsepower without the potential memory subsystem bottlenecks that a mid-range card like the 5700 XT might hit at higher resolutions. In other words, higher resolutions poorly demonstrate the 5700 XT's raw horsepower, particularly in the context I was using it in, which was to get a feel for Big Navi. Obviously Big Navi is going to upgrade the memory subsystem and increase the number of raster engines (or whatever AMD calls the Nvidia equivalent). In that case, performance differences derived from higher memory performance when comparing the 5700 XT to the 2080 Super are completely worthless. In fact, it's downright pointless to compare the 5700 XT to the 2080 Super at 1440p and 4K if the goal is to gauge the base performance of the architecture. 1080p is by far the better indicator, as shown by the fact that the 2080 Super gains additional ground at higher resolutions, clearly indicating that something unrelated to the raw horsepower of the GPU itself is at play.
 

Oh wow, do you understand what AVG means?
When you see results like these:
[TechPowerUp 1920x1080 charts: Ace Combat 7, Anno 1800, Battlefield V, Divinity: Original Sin 2, Far Cry 5]
When the 2070 Super, 2080, 2080 Super and 2080 Ti get heavily bottlenecked in 5 games out of 22, of course the average will be skewed (that's 5/22, roughly 22% of the results). That is how the math works. It could be either CPU bottlenecking or the game engine not supporting higher FPS.

Let's see how the relative performance of every GPU slower than the 5700 XT reacts to increasing resolution (1080p - 1440p - 4K):
2070 97 - 96 - 98%
2060 Super 93 - 93 - 94%
5700 87 - 86 - 87%
Vega 64 82 - 82 - 83%
1080 83 - 81 - 82%
2060 83 - 81 - 81%
Vega 56 75 - 75 - 76%
...
https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/28.html

So you are telling me the 5700 XT magically keeps pace with every GPU below it (even at 4K) but somehow loses steam against the cards faster than it? Is this some sort of voodoo magic?

Radeon VII 100 - 106 - 110%
1080 Ti 105 - 107 - 109%
2070 Super 107 - 109 - 112%
2080 113 - 117 - 121%
2080 Super 119 - 124 - 130%
2080 Ti 132 - 143 - 153%

Now remember: the 2060 Super, 2070, 2070 Super and 2080 all have the same memory configuration and bandwidth as the 5700 XT (8GB, 448GB/s), yet the 2070 Super and 2080 become increasingly faster as the resolution goes up while the 2060 Super and 2070 stay put. This is why I said the 5700 XT is about the limit of what a 5GHz 9900K can handle across 22 games at 1080p.
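To spell out what those two lists show, here is a tiny script that converts the relative-performance indices quoted above (RX 5700 XT = 100% at every resolution) into "% faster than the 5700 XT". The figures are the ones cited from TechPowerUp, not new data.

```python
# Relative-performance indices quoted above (RX 5700 XT = 100% at each
# resolution), converted into "% faster/slower than the 5700 XT".
relative = {  # card: (1080p, 1440p, 4K) as % of 5700 XT performance
    "RTX 2060 Super": (93, 93, 94),
    "RTX 2070 Super": (107, 109, 112),
    "RTX 2080":       (113, 117, 121),
    "RTX 2080 Super": (119, 124, 130),
    "RTX 2080 Ti":    (132, 143, 153),
}
for card, perf in relative.items():
    deltas = " / ".join(f"{p - 100:+d}%" for p in perf)
    print(f"{card}: {deltas} vs 5700 XT at 1080p / 1440p / 4K")
```

The slower cards' gaps barely move with resolution, while the 2080 and above pull further ahead at 1440p and 4K.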
 

Are you sure those are examples of bottlenecking?

Ace Combat and Anno 1800 clearly have some bottlenecking going on; whether that's due to the resolution or to the game engine I cannot tell. If you look at the Anno 1800 1440p results, there is a pretty distinct drop-off from the high end when going to the mid range. It's very atypical to see such a sudden, large performance drop.

That said, those are the only two atypical results. The other games you listed still show nice scaling at 1080p when going from the mid range to the high end. Divinity: Original Sin 2's numbers at the high end are close, but the 1440p and 4K results mirror the same pattern with slightly larger differences. It would be another matter if the high-end cards suddenly gained a lot of performance at 1440p/4K, but that's not what is happening.

If you excluded the two atypical games, you'd hardly see any change in the average at all, a few percentage points at best.
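A minimal sketch of that last point, with made-up per-game ratios (20 typical games and 2 capped ones out of 22, mirroring the counts discussed above; none of these values come from the actual reviews):

```python
# Hypothetical illustration: with only 2 capped titles out of 22, removing
# them barely moves the average. All per-game ratios are placeholders.
normal_games = [1.32] * 20      # faster card ~32% ahead in a typical game
capped_games = [1.05, 1.08]     # two titles where an fps cap hides the gap

def avg_lead(ratios):
    return sum(ratios) / len(ratios) - 1.0

print(f"All 22 games:         {avg_lead(normal_games + capped_games):.1%}")
print(f"Capped games removed: {avg_lead(normal_games):.1%}")
```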
 
So you agree about Ace Combat and Anno 1800, good.

Now let's check the other games for atypical performance drop-off going from 1080p to 1440p (the percentages below express how much higher the 1080p result is than the 1440p result):

Divinity Original Sin 2
2060 Super 149 -> 105.1fps, so 42% drop in FPS
5700XT 146.1 -> 108.5fps, 35%
2070 153.4 -> 108.6fps, 41%
Radeon VII 146.7 -> 110.3, 33%
2080 176.5 -> 133fps , 32%
2080Ti 181.4 -> 163.8fps, 10%
The 2080 Ti obviously gets bottlenecked at 1080p.
https://www.techpowerup.com/review/evga-geforce-rtx-2080-super-black/13.html

Far Cry 5
2060 Super 118.9 -> 91.8fps, 30%
2070 123.6 -> 95.4fps, 30%
5700 XT 128.8 -> 106.1fps, 21%
2080 137.4 -> 112.5fps, 22%
Radeon VII 130 -> 115.9fps, 12%
2080Ti 142.0 -> 134.1, 6% drop in fps
Both the Radeon VII and the 2080 Ti get bottlenecked; this title is known to be CPU-dependent.
https://www.techpowerup.com/review/evga-geforce-rtx-2080-super-black/15.html

Civilization VI
5700XT 141.9 -> 115.1fps, 23%
2060 Super 132.0 -> 109.0fps, 21%
2080 162.1 -> 135.9fps, 20%
Radeon VII 104.4 - > 96.1fps, 9%
2080Ti 177.2 -> 162.0fps, 9%
The Radeon VII and 2080 Ti get bottlenecked again here (the Radeon VII just has low fps for some reason).
https://www.techpowerup.com/review/evga-geforce-rtx-2080-super-black/10.html

Assassin's Creed Odyssey
2070 72.1 -> 51.7fps, 40% drop
5700XT 72.4 -> 52.1fps, 39%
2070 Super 74.8 -> 57.9fps, 30%
2080 77.7 -> 61.7fps, 26%
Radeon VII 71.2 -> 56.5fps, 26%
2080Ti 95.4 - > 78.4fps, 22%
The 2080, Radeon VII and 2080 Ti have a bit of a bottleneck here; ACO is famous for its double DRM after all.
https://www.techpowerup.com/review/evga-geforce-rtx-2080-super-black/8.html

So 5 games out of 22 show a clear bottleneck for the 2080 Ti; in the rest it's not as obvious, but there are a few percent here and there. That's 5/22, up to roughly 22% of the results holding the 2080 Ti's average down.

The 2080 Ti is a monster card with 48% higher theoretical performance than the 2070 Super here (13.45 vs 9.062 TFLOPS). I would say even at 1440p it still gets bottlenecked; it is a $1,200 card after all, and Intel hasn't exactly made any IPC improvements since Skylake. So how is it fair when you basically handicap the 2080 Ti to 1080p and compare it to much slower cards? It's like taking an F1 car to race on a go-kart track.
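As a quick check of the arithmetic above, here is the Divinity: Original Sin 2 example using the FPS pairs quoted from TechPowerUp; the percentage is the 1080p result relative to the 1440p result, which is how the figures in this post were computed.

```python
# Divinity: Original Sin 2 FPS pairs quoted above (1080p, 1440p).
# The "drop" percentages in the post are fps_1080 / fps_1440 - 1.
divinity = {
    "RTX 2060 Super": (149.0, 105.1),
    "RX 5700 XT":     (146.1, 108.5),
    "RTX 2080":       (176.5, 133.0),
    "RTX 2080 Ti":    (181.4, 163.8),
}
for card, (fps_1080, fps_1440) in divinity.items():
    gap = fps_1080 / fps_1440 - 1.0
    print(f"{card}: 1080p is {gap:.1%} higher than 1440p")
```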
 