The RTX 2080 Ti Super is apparently exclusive to the (not always working) GeForce Now RTX service

As I pointed out earlier, there will be no 10-15% extra performance. The Titan RTX is the biggest Turing chip Nvidia has, and it is not that much of an increase over the 2080 Ti.

I used 1080p as a metric because it is the most reliable measure of a GPU's performance. At higher resolutions additional bottlenecks come into play and video memory can become a problem. It makes sense that a mid-range card would fall behind at higher resolutions; after all, the 2080 Ti is equipped with a larger memory bus, more memory, and faster memory. AMD will certainly provide more and faster memory on their higher-end cards, and it's very likely they will increase the bus width as well.

So in effect, the 2080 Ti's additional lead at 1440p and 4K is the result of it being designed to run at those higher resolutions, with improved features (GPU and non-GPU related) to handle them. That would not be hard for AMD to address. I would go as far as to say that comparing 4K results of the two cards is misleading if your point is to show GPU efficiency, because the RX 5700 XT isn't really designed to play at that resolution and doesn't have the expensive RAM that the 2080 Ti has (among other things).

This is nothing new, though; even the 2070 Super loses significant performance at 4K compared to the 2080 Ti, just the same as the RX 5700 XT. The reason is obvious: mid-range cards perform best at the resolutions they were designed for.

It is useless to bench high-end GPUs at 1080p since you run into a CPU bottleneck; it is basic knowledge, just like you don't benchmark CPUs at 4K since you will run into a GPU bottleneck.
Take this from the 3900X review:
[image: FC_Ultra_1080p.png]

See how close the 2080 Ti and 2070 Super are? How preposterous is that.
Now at 1440p, a 2fps difference at 1080p becomes 15fps (9900K). You don't even want to couple a 2080 Ti with a 3900X at 1440p lol (heavily bottlenecked).
[image: FC_Ultra_1440p.png]


The Titan RTX is already 10% faster than the 2080 Ti at the same TDP; coupled with 15.5Gbps VRAM, 10-15% is not that far-fetched.
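That bottleneck argument can be sketched with a toy model (every FPS number below is hypothetical, chosen purely to show the mechanism): delivered FPS is roughly the minimum of what the CPU and the GPU can each sustain, so at 1080p two very different GPUs can land on the same CPU-limited number.

```python
# Toy model: delivered FPS ~= min(CPU-limited FPS, GPU-limited FPS).
# Every number here is hypothetical, chosen only to show the mechanism.

def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

CPU_CAP = 140  # frames/s the CPU can feed, roughly resolution-independent

# GPU-limited FPS drops as resolution rises (hypothetical figures)
GPU_CAPS = {
    "2070 Super": {"1080p": 180, "1440p": 125, "4K": 60},
    "2080 Ti":    {"1080p": 230, "1440p": 160, "4K": 85},
}

for res in ("1080p", "1440p", "4K"):
    a = delivered_fps(CPU_CAP, GPU_CAPS["2070 Super"][res])
    b = delivered_fps(CPU_CAP, GPU_CAPS["2080 Ti"][res])
    print(f"{res}: 2070 Super {a}fps vs 2080 Ti {b}fps")
# At 1080p both cards hit the 140fps CPU cap, hiding the GPU gap;
# at 4K the gap is fully visible.
```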
 
Where do you live where their prices are almost the same?

In the US the RX 5700 XT retails for around $400 - 450. The RTX 2070 Super retails for around $500 - 900 (with the cheaper models out of stock). I don't know what's going on with 2070 Super stock here, but many models have inflated pricing because of it. Even on eBay you are looking at $580 for a new card, and that's from a third-party vendor (a non-authorized reseller). I'd say a $500 RTX 2070 Super vs a $450 RX 5700 XT (aftermarket) would be a tough choice. If you can get it at $500 new then it might be worth it. Any more than that, though, and the value of the RX 5700 XT is hard to ignore. I should add that if you do consider AMD, you should take a look at Sapphire. Their 5700 XT Pulse was reviewed by GamersNexus with an MSRP of $410. Not only is that price only $10 over MSRP, the included cooler works well. It leaves enough room to OC if you want to.


Thanks for pointing out the Pulse XT. It is about $466 converted. The cheapest I can find is the non-Pulse Sapphire at $437, but that comes with the default blower cooler.

I forgot to tie my similarity statement to the Asus ROG Strix version of the 5700 XT. Sapphire is $90 cheaper indeed.

I thought the Strix version might run even cooler and quieter. But then, instead of getting that, I might go for the (slightly cheaper) 2070 Super because the price is similar to the Strix. (The Strix XT is about $30 (converted) more expensive than the cheapest 2070 Supers, by Zotac (twin fan) and Galaxy Super EX, in the Malaysia and Singapore region.)

But the Sapphire Pulse is quite tempting coming in almost $100 cheaper.
 
It is useless to bench high-end GPUs at 1080p since you run into a CPU bottleneck; it is basic knowledge, just like you don't benchmark CPUs at 4K since you will run into a GPU bottleneck.
Take this from the 3900X review:
[image: FC_Ultra_1080p.png]

See how close the 2080 Ti and 2070 Super are? How preposterous is that.

The Titan RTX is already 10% faster than the 2080 Ti at the same TDP; coupled with 15.5Gbps VRAM, 10-15% is not that far-fetched.

1. You're assuming there is a CPU bottleneck at 1080p, which for the majority of games there is not with the high-end CPUs used in testing. But yes, it's "basic knowledge", yet every major tech outlet benches their GPUs at 1080p. Hmmm, I wonder who's right, you or the tech outlets.

2. Stop cherry-picking information out of the review, especially single-game results.

[image: relative-performance_3840-2160.png]


Would you look at that, on average the RTX 2070 Super is 41% slower at 4K than the RTX 2080 Ti. On the other hand, it was only 25% at 1080p.

Who would have thought that testing a mid-range card at 4K could skew the results equally for both vendors /s. :facepalm:
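As a side note on the arithmetic here: "X% faster" and "Y% slower" are not the same number, because the baseline flips. A quick sketch with hypothetical average-FPS figures (not taken from the review):

```python
# The same performance gap reads differently depending on the baseline.
# Hypothetical 4K average FPS, for illustration only.
fps_2080ti = 100.0
fps_2070s = 71.0

faster = (fps_2080ti / fps_2070s - 1) * 100  # 2080 Ti relative to 2070 Super
slower = (1 - fps_2070s / fps_2080ti) * 100  # 2070 Super relative to 2080 Ti

print(f"2080 Ti is {faster:.0f}% faster")    # ~41% faster
print(f"2070 Super is {slower:.0f}% slower") # ~29% slower
```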

Also, mind pointing out where you got that 10% Titan RTX number? From what I've seen in Tom's review, it certainly isn't 10%. That's even considering that Tom's Hardware is known for being pro-Nvidia with both its test suite and its "Just buy it" article.

https://www.tomshardware.com/reviews/nvidia-titan-rtx-deep-learning-gaming-tensor,5971.html
 
Thanks for pointing out the Pulse XT. It is about $466 converted. The cheapest I can find is the non-Pulse Sapphire at $437, but that comes with the default blower cooler.

I forgot to tie my similarity statement to the Asus ROG Strix version of the 5700 XT. Sapphire is $90 cheaper indeed.

I thought the Strix version might run even cooler and quieter. But then, instead of getting that, I might go for the (slightly cheaper) 2070 Super because the price is similar to the Strix. (The Strix XT is about $30 (converted) more expensive than the cheapest 2070 Supers, by Zotac (twin fan) and Galaxy Super EX, in the Malaysia and Singapore region.)

But the Sapphire Pulse is quite tempting coming in almost $100 cheaper.

$90 cheaper is definitely worth it. Sapphire is a good brand as well. Here's a review on it

https://www.gamersnexus.net/hwreviews/3498-sapphire-rx-5700-xt-pulse-review

It includes two BIOSes, one normal and one quiet. Which one you use depends on how quiet you want it.

It should be noted that for AMD cards, junction temp isn't like GPU core temp. Edge temp is what's comparable to core temp on Nvidia cards. Junction temp is the hottest spot on the GPU; this feature was added starting with Vega, where AMD added additional temp sensors. It is regularly above edge temp, and that is completely normal. As far as I'm aware, current Nvidia cards do not report this yet, and it does cause some people to worry unnecessarily about junction temp. GPUs have always had hotspots; they just weren't being reported before.
 
1. You're assuming there is a CPU bottleneck at 1080p, which for the majority of games there is not with the high-end CPUs used in testing. But yes, it's "basic knowledge", yet every major tech outlet benches their GPUs at 1080p. Hmmm, I wonder who's right, you or the tech outlets.

2. Stop cherry-picking information out of the review, especially single-game results.

[image: relative-performance_3840-2160.png]


Would you look at that, on average the RTX 2070 Super is 41% slower at 4K than the RTX 2080 Ti. On the other hand, it was only 25% at 1080p.

Who would have thought that testing a mid-range card at 4K could skew the results equally for both vendors /s. :facepalm:

Also, mind pointing out where you got that 10% Titan RTX number? From what I've seen in Tom's review, it certainly isn't 10%. That's even considering that Tom's Hardware is known for being pro-Nvidia with both its test suite and its "Just buy it" article.

https://www.tomshardware.com/reviews/nvidia-titan-rtx-deep-learning-gaming-tensor,5971.html

Stop proving me right and then saying otherwise, dude; it makes you look "not very bright".
When you look at the 1080p results:
[images: Ace Combat 7, Anno 1800, Civilization VI, Divinity: Original Sin 2 and Far Cry 5 benchmarks at 1920x1080]

Do these results explain to you why a 25% gap at 1080p becomes 41% at 4K? Or do I have to make a 72-page post to prove why comparing high-end GPUs at 1080p is a bad idea.

Sure, you can quote 1440p results, but using 1080p benchmarks to prove the 2080 Ti is just "32% faster than the 5700 XT" is downright ridiculous.
 
Nvidia won't bring a 2080 Ti Super to market until it needs to. Otherwise $1,200 per card is far too lucrative for them to pass up if there are people still willing to pay it. I will say this, if AMD's Navi top end is anything like the mid range, expect steep price cuts from Nvidia and a return to sanity in the high end market.


Navi won't match up. The 2080Ti will continue to dominate as the King of the mountain.

If by some miracle AMD catches up, the 2080 Ti super will drop, re-kill the game and the whole cycle will begin again next year.
It doesn't need to match up to the 2080 Ti. They just need to bring 2080 Super performance (or close to it) at a lower price, but they need to undercut Nvidia by around $75-100.
 
$90 cheaper is definitely worth it. Sapphire is a good brand as well. Here's a review on it

https://www.gamersnexus.net/hwreviews/3498-sapphire-rx-5700-xt-pulse-review

It includes two BIOSes, one normal and one quiet. Which one you use depends on how quiet you want it.

It should be noted that for AMD cards, junction temp isn't like GPU core temp. Edge temp is what's comparable to core temp on Nvidia cards. Junction temp is the hottest spot on the GPU; this feature was added starting with Vega, where AMD added additional temp sensors. It is regularly above edge temp, and that is completely normal. As far as I'm aware, current Nvidia cards do not report this yet, and it does cause some people to worry unnecessarily about junction temp. GPUs have always had hotspots; they just weren't being reported before.
Thanks for the suggestion. Will go for it. After a long time back to Radeon.
 
It’s easily solved: just do what I did and buy the best.

Once you have the 2080Ti there’s no need to worry.

Yup... just buy the xx80 Ti; usually they are the most interesting products Nvidia has to offer. Bought a 1080 Ti for $700, then sold it at $600 18 months later; gonna sell my 2080 Ti when the Super comes out.
 
I find it incredible that someone can pay 1,200 dollars for a graphics card. Like Apple fans paying $1,200 for an iPhone. This is why they keep milking us; these prices are truly outrageous. I won't be riding that crazy train.

I paid for two, do I get double the 'outrage'?

:)
 
Yup... just buy the xx80 Ti; usually they are the most interesting products Nvidia has to offer. Bought a 1080 Ti for $700, then sold it at $600 18 months later; gonna sell my 2080 Ti when the Super comes out.
Exactly

I treat video cards just like I treat my iPhones.

Year-long disposable pleasures.

I always buy the best so the residual is highest.

I traded up from Titan X to Titan Xp and then to 2080Ti.

These benchmarks serve only as reminders that I bought the best.
 
Ummmm: they can release the 3080 Ti, the 3080, the 3070 and the 3060.

We are still a decent way off from the launch of those cards. Most likely AMD will launch big Navi and then, six months later, Nvidia will launch the 3000 series. Given that those cards are rumored to be 7nm, and given their massive die sizes, the node is going to need to be pretty mature for decent yields.

As an example:

https://www.techpowerup.com/gpu-specs/geforce-rtx-2070-super.c3440
https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt.c3339

The RTX 2070 Super is a bit faster, but the die size is more than double. As you can imagine, this massive difference in die size is what allows AMD to release its GPUs on 7nm far before Nvidia. Yields are exponential: as die size increases, yield decreases dramatically.
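That yield point can be illustrated with the standard Poisson defect-density model; the defect rate below is a made-up illustrative value, not any foundry's real figure:

```python
import math

def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_mm2 * area_mm2)

D = 0.002  # defects per mm^2 -- illustrative only

for name, area in [("RX 5700 XT", 251), ("RTX 2070 Super", 545)]:
    print(f"{name} ({area}mm^2): {poisson_yield(area, D):.0%} estimated yield")
# Roughly doubling the die area cuts the estimated yield sharply in this model.
```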

Given that the 2080 Ti is only 32% ahead of the RX 5700 XT, AMD doesn't even need to match Nvidia's mid-range die size to beat the 2080 Ti.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/28.html

Looking at the performance numbers and die sizes, a doubling of die size for 9% extra performance (RX 5700 XT vs 2070 Super) makes it very clear that AMD's chip does far more with far less die space. It will be interesting to see exactly what die size big Navi will be and how well it scales. All I have to say is that a Navi chip with the same die size as the RTX 2070 Super (more than double the current RX 5700 XT die size) would have to scale extremely badly to not match the 2080 Ti. In fact, they would only need a measly 32% scaling. That's the worst-case scenario, and of course assuming they are even able to use that die size on the 7nm node at that time. The best-case scenario is scaling in the high 90% range, which would have big Navi competing with the 3080 Ti. Like I said though, no idea if 7nm can take an approximately 500mm^2 die or if the uArch can scale that well, although AMD does bill it as a scalable uArch.
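To make that die-size arithmetic concrete, here is a quick sketch using TechPowerUp's published die sizes and the ~9% gap from the linked review; the big-Navi die size and the scaling scenarios are pure speculation:

```python
# Perf-per-area comparison plus speculative big-Navi scaling scenarios.
NAVI10_MM2 = 251   # RX 5700 XT die size (TechPowerUp)
TU104_MM2 = 545    # RTX 2070 Super die size (TechPowerUp)
PERF_GAP = 1.09    # 2070 Super ~9% faster than the 5700 XT

# Performance per mm^2, Navi relative to Turing:
perf_density = TU104_MM2 / (PERF_GAP * NAVI10_MM2)
print(f"Navi perf/area advantage: {perf_density:.2f}x")  # roughly 2x

BIG_NAVI_MM2 = 505  # speculated: roughly 2070 Super sized
for scaling in (0.32, 0.60, 0.90):  # worst case .. best case
    perf = 1 + scaling * (BIG_NAVI_MM2 / NAVI10_MM2 - 1)
    print(f"{scaling:.0%} scaling -> {perf:.2f}x the RX 5700 XT")
# Even 32% scaling already lands near the 2080 Ti's ~1.32x lead cited above.
```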

Kudos for this post...
Navi10 (5700 series) is only 251mm^2, and the outgoing Vega20 (Radeon VII/MI50), which is now EOL for gaming, is 331mm^2.

40-50% more CUs = 56 and 64 CU chips, similar to Vega 64 but using this new RDNA stuff. If little Navi is playing with the Radeon VII now, and they are both at 7nm, then how hard will big Navi punch at Radeon VII's size?

Navi10 +37% = big navi
 
It is useless to bench high-end GPUs at 1080p since you run into a CPU bottleneck; it is basic knowledge, just like you don't benchmark CPUs at 4K since you will run into a GPU bottleneck.
Take this from the 3900X review:

See how close the 2080 Ti and 2070 Super are? How preposterous is that.
Now at 1440p, a 2fps difference at 1080p becomes 15fps (9900K). You don't even want to couple a 2080 Ti with a 3900X at 1440p lol (heavily bottlenecked).

The Titan RTX is already 10% faster than the 2080 Ti at the same TDP; coupled with 15.5Gbps VRAM, 10-15% is not that far-fetched.

The Titan is going to be irrelevant soon, and that is what most people are talking about. You are clinging to the past and reminiscing about something that is EOL soon. (SUPER FLOPS)

AMD's big-navi is coming to crush games at $599+
 
Part of me is glad they aren't releasing the 2080Ti Super. It gives my venerable 1080Ti even more value for even more time. I cannot believe this card has stayed at the top of the list for so damn long now. What a great investment.
 
The Titan is going to be irrelevant soon, and that is what most people are talking about. You are clinging to the past and reminiscing about something that is EOL soon. (SUPER FLOPS)

AMD's big-navi is coming to crush games at $599+

Yeah, sure, whatever... I'm not interested in a bargain product that came out late and has nothing to offer but "slightly cheaper than the competing product". Just gonna enjoy the upcoming RTX games myself; Control has good reviews, might be time to sell myself to the Epic Games Store.
 
Stop proving me right and then saying otherwise, dude; it makes you look "not very bright".
When you look at the 1080p results:
[images: Ace Combat 7, Anno 1800, Civilization VI, Divinity: Original Sin 2 and Far Cry 5 benchmarks at 1920x1080]


Do these results explain to you why a 25% gap at 1080p becomes 41% at 4K? Or do I have to make a 72-page post to prove why comparing high-end GPUs at 1080p is a bad idea.

Sure, you can quote 1440p results, but using 1080p benchmarks to prove the 2080 Ti is just "32% faster than the 5700 XT" is downright ridiculous.

You don't seem to realize the difference between me linking averages of all the games and you linking single cherry-picked examples. If you don't know the difference, that's on you. I'm going to once again link the averages, which tell the whole story:

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/28.html
 
Nvidia won't bring a 2080 Ti Super to market until it needs to. Otherwise $1,200 per card is far too lucrative for them to pass up if there are people still willing to pay it. I will say this, if AMD's Navi top end is anything like the mid range, expect steep price cuts from Nvidia and a return to sanity in the high end market.
It won't happen. Even if, somehow, AMD made a 2080 Ti-class Navi card, it won't come out until 2020, and people forget Nvidia is STILL on 14nm. All the benefits AMD is getting from 7nm can be applied to the RTX 3000 series too. If Navi were to compete, we'd just get the 3000 series and AMD would take bargain placement for the fourth generation in a row.
 
It won't happen. Even if, somehow, AMD made a 2080 Ti-class Navi card, it won't come out until 2020, and people forget Nvidia is STILL on 14nm. All the benefits AMD is getting from 7nm can be applied to the RTX 3000 series too. If Navi were to compete, we'd just get the 3000 series and AMD would take bargain placement for the fourth generation in a row.

I've pointed this out in other comments multiple times now, but AMD's die size (5700 XT) is less than half of Nvidia's mainstream offerings. If Nvidia is able to do 7nm with its massive chips, AMD is likely able to do 7nm+ with its own. It doesn't make sense historically either; Nvidia has never been one to rush to a new node. That goes double right now, as Nvidia's die sizes have only gotten bigger. Nvidia has had enough trouble making their current cards on their current node even this late in the cycle, never mind 7nm. Go and look at the stock of Super cards. It's the only refresh I know of with such poor stock.
 
You don't seem to realize the difference between me linking averages of all the games and you linking single cherry-picked examples. If you don't know the difference, that's on you. I'm going to once again link the averages, which tell the whole story:

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/28.html

lol, it's you who cherry-picked the 1080p avg, buddy; I just called you out for that. If 1080p is what you are gonna play at, then high-end Navi or a 2080 Ti Super is not meant for you; just buy a 5700 XT or 5700.
 
The only way a "Ti Super" is going to appear is if Nvidia throws out a larger TU chip while Ampere still goes through development (in a similar manner to Intel with their Coffee Lake refresh). They would also have to release a new Titan RTX too, to offset the fact that there's little difference between a 2080 Ti and Titan RTX as things currently stand; a Ti Super would just displace the old Titan, making the product redundant.

The Titan RTX does have 24GB of VRAM, which could be taken down to 12GB for a 2080 Ti Super model. That, combined with high-precision FLOP nerfs, could make a viable SKU between the 2080 Ti and the Titan RTX.
 
Looking at the performance numbers and die sizes, a doubling of die size for 9% extra performance (RX 5700 XT vs 2070 Super) makes it very clear that AMD's chip does far more with far less die space.


You know how you know which product is superior?

When it takes proponents several paragraphs to explain why its opposition is better.

You're pulling out tech specs (on paper) and all this esoteric info... when you should be able to simply say: _____ runs games better at maximum settings and resolution.

But you can't.

Nvidia RTX dominates absolutely.

So glad I bought the 2080Ti rather than take chances on inferior products.
I find it incredible that someone can pay 1,200 dollars for a graphics card. Like Apple fans paying $1,200 for an iPhone. This is why they keep milking us; these prices are truly outrageous. I won't be riding that crazy train.

Apple? Yeah, sure. Now have you seen Samsung Android prices lately (the last 3+ years)? They go for $1,000 - $2,000. Wake up and smell the money!

Funny how Android fans only see Apple prices as something to complain about!
 
lol, it's you who cherry-picked the 1080p avg, buddy; I just called you out for that. If 1080p is what you are gonna play at, then high-end Navi or a 2080 Ti Super is not meant for you; just buy a 5700 XT or 5700.

:facepalm:

One cannot cherry-pick an average. By definition, cherry-picking is taking one or more points of data out of a set, oftentimes favorable to one's opinion or argument. An average covers a plurality of data, the complete set. In addition, I provided the link to the entire review in my comment to add context; you did not.
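A tiny numeric illustration of that distinction, with made-up per-game deltas (one card's FPS lead over another across a review suite): a single favorable game can tell a very different story than the mean of the whole set.

```python
# Made-up FPS deltas for one card over another across ten games.
deltas = [2, 5, 7, 9, 11, 12, 15, 18, 22, 30]

cherry_pick = max(deltas)            # quoting only the most favorable game
average = sum(deltas) / len(deltas)  # the whole set

print(f"cherry-picked gap: {cherry_pick}fps")  # 30fps
print(f"average gap: {average}fps")            # 13.1fps
```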

FYI the argument you are making with this line:

"If 1080p is what you are gonna play at, then high-end Navi or a 2080 Ti Super is not meant for you; just buy a 5700 XT or 5700."

Is the same that could be levied against the 9900K with 1080p reviews. You don't seem to understand that 1080p results are included in TechSpot's GPU reviews for a reason. I've already discussed this in previous comments, though, and you never directly replied to the points raised. At this point you are willfully ignorant, and that's on you.
 
:facepalm:

One cannot cherry-pick an average. By definition, cherry-picking is taking one or more points of data out of a set, oftentimes favorable to one's opinion or argument. An average covers a plurality of data, the complete set. In addition, I provided the link to the entire review in my comment to add context; you did not.

FYI the argument you are making with this line:

"If 1080p is what you are gonna play at, then high-end Navi or a 2080 Ti Super is not meant for you; just buy a 5700 XT or 5700."

Is the same that could be levied against the 9900K with 1080p reviews. You don't seem to understand that 1080p results are included in TechSpot's GPU reviews for a reason. I've already discussed this in previous comments, though, and you never directly replied to the points raised. At this point you are willfully ignorant, and that's on you.

Well, out of the three averages (1080p, 1440p and 4K) you picked 1080p, where the 2080 Ti is only "32% faster than the 5700 XT" lol, and against an AIB 5700 XT nonetheless. You made it the centerpiece of your rant that Navi can easily be made 30% faster.

Show me where TechSpot uses 1080p for GPU reviews, hm?
https://www.techspot.com/review/1896-msi-radeon-5700-xt-evoke/
https://www.techspot.com/review/1870-amd-radeon-rx-5700
https://www.techspot.com/review/1791-amd-radeon-vii-mega-benchmark/
https://www.techspot.com/review/1865-geforce-rtx-super/
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/

New 5700XT vs 2070 Super benchmark

In none of them is there any 1080p benchmark. So who is being ignorant? You just don't compare high-end GPUs by their 1080p performance. You just don't.
 
I find it incredible how can someone pay 1200 dollars on a graphics card. Like apple fans paying 1200 for an iPhone. This is why they keep milking us, these prices are truly outrageous. I won't be riding that crazy train.


"Affordability" means different things to different people.

You see those computers in my profile photo? I have two Area 51 towers and 15" and 17" laptops. Anyone who spends over $10,000 on computers for productivity isn't worried about spending $1,200+ for the best graphics card on the market. My FTW3 2080 Ti was just shy of the cost of my iPhone XS Max 512GB ($1,567).

But they earn their worth in making my work and fun easier.
You assume that just because people have money, they spend it foolishly? Everyone knows that the RTX cards are overpriced flops for gaming. Who cares about the best GPU on the planet; the price doesn't scale with performance, and it's a niche market. (Nvidia customers didn't get much with an RTX 2080 over their 1080 Ti, three years later...)

Mainstream is where it's at... and AMD is killing Nvidia in the mainstream market with their new gaming-only architecture. RDNA's whitepapers dropped and Jensen has nothing to combat RDNA with. Go read them; we have not even seen the start of what is to come for RDNA.


Secondly, you can dream all you want about owning the full TU102 at $1,400, but a $600 AMD 5800 series card is going to wipe the floor in games with that overpriced server chip. Nobody is going to care if it's 5-10% slower than a $1,400 card when it costs half as much.

Nvidia cannot compete. Even if Nvidia cuts their TU102 chip by $400, the 5800 series will still be $300 cheaper. Nobody is going to spend $1k on a full Titan when they can get a card at $600 that's a few % slower. And we all know that Nvidia cannot sell the full TU102 chip for $1k... so win/win for AMD.

Nvidia is dead. Anyone seen Jensen?
 