
RTX 2080 Ti Super is apparently exclusive to the (not always working) GeForce Now RTX

By mongeese · 64 replies
Aug 25, 2019
  1. Nvidia recently announced that their beta game streaming service, GeForce Now, would get the upgrade from Pascal to Turing and with it, ray tracing – only it didn’t. Not only were users reporting that ray tracing options were absent, but the performance was up to 50% worse, leading the more inquisitive to investigate and discover their games were running on a new GeForce RTX T10-8 GPU -- along with a variety of new and inadequate CPU and memory configurations.

    Ten days ago we wouldn’t have been able to say what the T10-8 is, but last week’s AIDA64 update included a reference to it as a variant of Nvidia’s flagship TU102 die, the powerhouse behind the RTX 2080 Ti. Ears perked up and many concluded a 2080 Ti Super was coming, yet it now appears that the T10-8 will exclusively power GeForce Now RTX.

    However, it took Nvidia less than a day to pull the plug and return users to the older Tesla P40 GPUs, with no ray tracing capabilities. Quite the whirlwind indeed. So, what could have happened?

    Shortly after Nvidia’s announcement, users noticed the switch to the T10-8, either by playing games that list the hardware, noticing severe performance drops, or both. One user had Ghost Recon Wildlands detect a dual-core system with 7 GB of system memory and an RTX T10-8 GPU with 8 GB of memory, all running on Windows Server 2012. That last part may explain the issues with ray tracing: Windows Server 2012 doesn’t support Microsoft’s DXR ray tracing implementation, and Nvidia’s initial workaround may not have been as good as it needed to be.

    A few other things didn't add up, like the GPU averaging 81% usage in the Ghost Recon Wildlands 1080p benchmark despite being paired with a dual-core CPU and just 8 GB of memory, so it’s possible (and this is just speculation) that each user was given half a T10-8 to game on. There’s some evidence for this: one user found a 48% drop in Fire Strike score between the P40 (roughly equivalent to the GTX 1080) with a regular CPU and RAM configuration, and their T10-8 system with slightly worse CPU and RAM numbers. A few users who apparently didn’t witness any performance drop report their T10-8 systems as having 16 GB of memory.

    What may have occurred is that Nvidia’s marketing team outdid their engineering team, building up enough hype about ray tracing on GeForce Now that every user decided to log on and test it out simultaneously, exceeding the number of supported systems. As an automatic fix, the software could have divided each system into two virtual ones, doubling the number of possible players but halving the performance.

    Since the initial release several days ago, however, the reports of poor performance and broken ray tracing have ceased, so it seems likely things are working as they should for most players.

    The bottom line is there’s no RTX 2080 Ti Super coming to the mass market for now. This isn't a huge surprise considering Nvidia can’t squeeze another configuration between the 2080 Ti (4,352 cores) and the Titan RTX (4,608 cores) without cannibalizing one or the other. On the flip side, GeForce Now users should get a nice performance boost with the T10-8 once the bugs are ironed out.


  2. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    Nvidia won't bring a 2080 Ti Super to market until it needs to. Otherwise, $1,200 per card is far too lucrative for them to pass up as long as there are people still willing to pay it. I will say this: if AMD's top-end Navi is anything like the mid-range, expect steep price cuts from Nvidia and a return to sanity in the high-end market.
  3. QuantumPhysics

    QuantumPhysics TS Evangelist Posts: 1,639   +1,233

    Navi won't match up. The 2080Ti will continue to dominate as the King of the mountain.

    If by some miracle AMD catches up, the 2080 Ti super will drop, re-kill the game and the whole cycle will begin again next year.
  4. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,584   +5,140

    That is the way I see it. Why drop a new king, if your old king still has it?
  5. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    Given that there isn't much room between the 2080 Ti and RTX Titan (the full chip), there isn't much room performance-wise for the 2080 Ti Super to gain. So IF AMD does manage to beat the 2080 Ti, there isn't much Nvidia can do besides lowering prices and offering a sub-6% performance increase.
  6. neeyik

    neeyik TS Guru Posts: 286   +246

    The only way a "Ti Super" is going to appear is if Nvidia throw out a larger TU chip while Ampere still goes through development (in a similar manner to Intel with their Coffee Lake refresh). They'd also have to release a new Titan RTX to offset the fact that there's little difference between the 2080 Ti and Titan RTX as things currently stand; a Ti Super would just displace the old Titan, making that product redundant.

    Theoretically, they could release an 8 GPC version of the TU102 - it would be a huge chip, though, at around 1,000 mm², not to mention a big power draw. The reference coolers that Nvidia use aren't great at coping with 250W, so it would need a new cooler too, or a notable drop in voltage and frequency to keep it to something like 275W.
    Charles Olson and Evernessince like this.
  7. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    That's sort of what Nvidia did last gen. They released the Titan, then the faster 1080 Ti, and then an even faster Titan. Although it should be said that the original Pascal Titan was never the full chip to begin with, so we knew Nvidia had a faster chip. This time, the RTX Titan is the full 4,608 shader units, so in order to get more performance than that they would have to make an even bigger chip, which is extremely unlikely.
    Charles Olson and TempleOrion like this.
  8. neeyik

    neeyik TS Guru Posts: 286   +246

    I'd say it's unlikely for as long as Nvidia have the 2080 Ti as the 'fastest gaming graphics card' banner to wave about. We might not see a big Navi chip until next year, once TSMC have a 7nm+ process line fully running, by which time Nvidia & Samsung may well be ready with Ampere.
    Charles Olson likes this.
  9. QuantumPhysics

    QuantumPhysics TS Evangelist Posts: 1,639   +1,233

    Ummmm: They can release the 3080Ti, the 3080, the 3070 and the 3060.
  10. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,584   +5,140

    Yep, but until their sales are threatened, what would be the point? And with the recently released "Super" cards, I'd wager the 30xx cards are a good ways away.
  11. neeyik

    neeyik TS Guru Posts: 286   +246

    Given that it was a 2 year gap between Pascal and Turing (April 2016 to September 2018), and not far off 2 years between Maxwell (February 2014) and Pascal, we're probably still a year off seeing anything at all relating to the GeForce 30 series.
  12. pcnthuziast

    pcnthuziast TS Evangelist Posts: 614   +210

    RTX 2080 ti super duper ultra deluxe when?
    Charles Olson and m3tavision like this.
  13. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    We are still a decent ways off from the launch of those cards. Most likely AMD will launch big Navi and then, 6 months later, Nvidia will launch the 3000 series. Given that they are rumored to be 7nm with massive die sizes, the node is going to need to be pretty mature for decent yields.

    As an example:


    The RTX 2070 Super is a bit faster, but its die size is more than double. As you can imagine, this massive difference in die size is what allows AMD to release its GPUs on 7nm far before Nvidia. Yield loss is exponential: as die size increases, yield decreases dramatically.
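    To see why yield falls off exponentially with die area, here's a minimal sketch using the simple Poisson yield model that's commonly used for this kind of back-of-envelope math. The defect density is an assumed, illustrative number, not a published TSMC or Nvidia figure; the die areas are the commonly cited ones for Navi 10 (~251 mm²) and TU104 (~545 mm²).

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_mm2=0.001):
    """Estimate the fraction of good dies with the Poisson yield model:
    Y = exp(-A * D), where A is die area and D is average defect density.
    The defect density here is illustrative, not a real fab number."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

# Commonly cited die sizes: RX 5700 XT (Navi 10) ~251 mm^2,
# RTX 2070 Super (TU104) ~545 mm^2.
small = poisson_yield(251)
large = poisson_yield(545)
print(f"Estimated good dies: ~{small:.1%} (small) vs ~{large:.1%} (large)")
```

Even with the same defect density, roughly doubling the die area compounds the chance of catching a killer defect, which is why a big chip on an immature node is so costly.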

    Given that the 2080 Ti is only 32% ahead of the RX 5700 XT, AMD doesn't even need to match Nvidia's mid-range die size to beat the 2080 Ti.


    Looking at the performance numbers and die sizes, a doubling of die size for 9% extra performance (RX 5700 XT vs 2070 Super) makes it very clear that AMD's chip does far more with far less die space. It will be interesting to see exactly what die size big Navi will be and how well it scales. All I have to say is that a Navi chip with the same die size as the RTX 2070 Super (more than double the current RX 5700 XT die size) would have to scale extremely badly to not match the 2080 Ti. In fact, it would only need a measly 32% scaling. That's the worst-case scenario, and of course assuming they are even able to use that die size on the 7nm node at that time. The best-case scenario is scaling in the high 90s, which would have big Navi competing with the 3080 Ti. Like I said though, I have no idea if 7nm can handle an approximately 500 mm² die, or if the uArch can scale that well, although AMD does bill it as a scalable uArch.
  14. QuantumPhysics

    QuantumPhysics TS Evangelist Posts: 1,639   +1,233

    You know how you know which product is superior?

    When it takes proponents several paragraphs to explain why its opposition is better.

    You're pulling out tech specs (on paper) and all this esoteric info... when you should be able to simply say: _____ runs games better at maximum settings and resolution.

    But you can't.

    Nvidia RTX dominates absolutely.

    So glad I bought the 2080Ti rather than take chances on inferior products.
    Last edited: Aug 25, 2019
  15. BigBoomBoom

    BigBoomBoom TS Booster Posts: 66   +62

    Yeah, but by then Turing will be close to 2 years old, and you know NVIDIA's next gen will just drop, right? If they move to 7nm, expect it to leave Navi in the dust. NVIDIA will just keep pushing prices higher and higher again, until AMD can actually fight back against its current gen and not last gen.

    It's like people keep forgetting Turing was released last year. 2018.
  16. Badelhas

    Badelhas TS Addict Posts: 104   +52

    I find it incredible how someone can pay 1,200 dollars for a graphics card. Like Apple fans paying 1,200 for an iPhone. This is why they keep milking us; these prices are truly outrageous. I won't be riding that crazy train.
  17. QuantumPhysics

    QuantumPhysics TS Evangelist Posts: 1,639   +1,233

    "Affordability" means different things to different people.

    You see those computers in my profile photo? I have two Area 51 towers and a 15" and a 17" laptop. Anyone who spends over $10,000 on computers for productivity isn't worried about spending $1,200+ for the best graphics card on the market. My FTW3 2080 Ti was just shy of the cost of my iPhone XS Max 512GB ($1,567).

    But they earn their worth in making my work and fun easier.
    Knot Schure likes this.
  18. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    This is less a rebuttal and more "I can't rebut or comprehend the data presented, so I'll divert the topic". The world has no obligation to make sense to you. Perhaps next comment you will actually present something of an argument? How about this: I'll summarize it to something you'll understand (or, more likely, intentionally misunderstand).

    - The RX 5700 XT provides almost twice the performance per square mm as Nvidia's RTX 2070 Super
    - Big Navi may provide up to twice the performance at twice the size

    It is possible for big Navi to compete with Nvidia's next gen chips if you look at the performance and die size of current Navi chips. Please go back and read that comment for how AMD might compete with Nvidia's next gen products. That's with zero improvements over the current stuff. AMD already has Navi+ planned 6 months after big Navi drops as well.

    Looking at AMD's previous flagships, Vega 64 for example had a die size of 495 mm². That's a 97% increase over the RX 5700 XT. So let's assume scaling is about 90%; that's 90% added onto the RX 5700 XT, which puts it far above the RTX 2080 Ti. Mind you, don't quote me on the scaling; I don't have any math to back that number up, nor am I aware of how well the architecture scales. I might have to look into that; seems like it would be a good article for TechSpot.
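    For what it's worth, the back-of-envelope math above can be written out explicitly. All the inputs are the thread's own assumptions (the 90% scaling efficiency is a guess, the 32% figure is the 1080p lead cited earlier), not measured data for any real product:

```python
# Back-of-envelope die-size scaling, using the thread's assumed figures.
rx5700xt_area = 251      # mm^2, Navi 10 die (commonly cited)
big_navi_area = 495      # mm^2, assumed (a Vega 64-sized flagship)
area_increase = big_navi_area / rx5700xt_area - 1   # ~0.97, i.e. +97% area

scaling_efficiency = 0.90   # assumed: 90% of added area becomes performance
perf_gain = area_increase * scaling_efficiency      # projected uplift

# The 2080 Ti is taken as ~32% faster than the RX 5700 XT at 1080p,
# so any projected gain above 32% would put this hypothetical chip ahead.
print(f"Projected gain: +{perf_gain:.0%} (needs +32% to pass a 2080 Ti)")
```

Under these assumptions the projected uplift lands well above the 32% threshold, which is the whole argument: even mediocre scaling on a Vega-sized die would clear the 2080 Ti.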
    Last edited: Aug 25, 2019
    Charles Olson and TempleOrion like this.
  19. billywilly

    billywilly TS Rookie

    Does it though? It costs almost as much as the rest of the computer for like a 5% reach over the 5700 XT. Unless you're playing something relying on Nvidia-specific acceleration (e.g. Witcher 3), or crypto mining, I cannot see how someone could justify spending $1,200 on a gaming GPU. You claim to be doing workstation stuff with it, but wouldn't you want a Quadro for that? I can see the justification when ray tracing finally takes off (probably won't be till the next Xbox/PS consoles drop) but for right now... not so much.

    I also think it's silly that anyone is claiming allegiance to a GPU manufacturer. They're not your friends; they're businesses.
    TempleOrion likes this.
  20. amghwk

    amghwk TS Guru Posts: 571   +353

    I have been using Nvidia cards ever since I upgraded from my once-flagship Radeon X1950 XTX. From that time onwards I have owned Nvidia flagships - the 8800 GTX, 9800 GTX, and, up to now, the 980 Ti. I have been thinking of upgrading to the 20xx series for some time now, but the prices of Nvidia cards have definitely been blown out of proportion.

    For instance, yes, the 2080 Ti commands an absolute lead, but even it struggles in some games at 4K with everything maxed out. At that absurd asking price, I would have wanted it to cruise like butter at 4K with every f-king setting maxed out. But nope. It's just super expensive, and I'm not going to waste hundreds of dollars for a few frames' lead. Even for a 20 or 30 fps lead, the extra premium is not worth it.

    Affordability is one thing, but wise purchasing is another - a better way to both enjoy oneself and uphold the balance of the economy. Whatever we do today will come back to us later, just like how Nvidia and Apple and Samsung are now trying to push the absurd prices of their products onto consumers.

    Then comes the 2080, but the 2070 Super currently makes it an unnecessary purchase. And the 2080 Super makes purchasing a 2080 even more pointless, since the price is almost the same. But the difference between the 2070 Super and 2080 Super is substantial enough to feel the pinch on the wallet.

    The only thing I am eyeing now is the Radeon RX 5700 XT. I always hated the single-fan blower design, but third-party boards are out now, and I'm eyeing the Asus Strix version of the 5700 XT.

    Now the price is almost the same as a 2070 Super from other makers like Galax and Gigabyte.

    Wondering whether I should jump ship to the 5700XT after all this time, or proceed with the 2070 Super.

    I'm still using an Intel CPU, the 8700K, and see no reason to upgrade. And if Intel keeps up the high-price trend in a few years when I'm about to upgrade, I wouldn't think twice about getting the latest Ryzen in the next round, if AMD continues its competitive pricing.
    Last edited: Aug 25, 2019
  21. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,619   +945

    LMAO!!! What!?
    These price drop predictions every time someone thinks AMD has a sure thing. You're embarrassing yourselves. Please stop.
    QuantumPhysics likes this.
  22. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    Never said it was a sure thing, and I put multiple disclaimers throughout my comments. Even the sentence you quoted of mine has an "if" in it. At no point do I indicate it's a sure thing, and I elaborate on my reasoning in my other comments.

    Dodging the topic and making hyperbolic statements backed up by nothing is your forte.
    TempleOrion likes this.
  23. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    Where do you live where their prices are almost the same?

    In the US the RX 5700 XT retails for around $400 - 450. The RTX 2070 Super retails for around $500 - 900 (with the cheaper models out of stock). I don't know what's going on with 2070 Super stock here, but many models have inflated pricing because of it. Even on eBay you are looking at $580 for a new card, and that's from a third-party vendor (a non-authorized reseller). I'd say a $500 RTX 2070 Super vs a $450 RX 5700 XT (aftermarket) would be a tough choice. If you can get it at $500 new, then it might be worth it. Any more than that, though, and the value of the RX 5700 XT is hard to ignore. I should add that if you do consider AMD, you should take a look at Sapphire. Their 5700 XT Pulse was reviewed by GamersNexus at $410 - only $10 over the reference MSRP - and the included cooler works well. It leaves enough room to OC if you want to.
    TempleOrion likes this.
  24. krizby

    krizby TS Enthusiast Posts: 87   +42

    Kind of nice of you to quote 1080p performance when the 2080 Ti is 46% and 56% faster than the reference 5700 XT at 1440p and 4K respectively. Are you planning on buying a high-end GPU to play at 1080p?

    Just hoping that if they ever release the 2080 Ti Super it will use the same PCB, so I can reuse my 2080 Ti waterblock. Selling my old 2080 Ti at a $200 loss and then getting the Super for an extra 10-15% performance would be nice; gonna need all the fps I can get to enjoy Cyberpunk 2077 when it releases in April 2020 :D
    QuantumPhysics likes this.
  25. Evernessince

    Evernessince TS Evangelist Posts: 4,183   +3,793

    Like I pointed out earlier, there will be no 10-15% extra performance. The RTX Titan is the biggest Turing chip Nvidia has, and it is not that much of an increase over the 2080 Ti.

    I used 1080p as a metric because it is the most reliable measure of a GPU's raw performance. At higher resolutions additional bottlenecks come into play, and video memory can be a problem. It makes sense that a mid-range card would fall behind at higher resolutions; after all, the 2080 Ti is equipped with a larger memory bus, more memory, and faster memory. There's a 100% chance AMD will provide more and faster memory on their higher-end cards. It's also very likely they increase the bus size as well. So in effect, the 2080 Ti's additional lead at 1440p and 4K is the result of it being designed to run at those higher resolutions and having improved features, GPU and non-GPU related, to handle them. That would not be hard for AMD to address. I would go as far as to say that comparing 4K results of the two cards is misleading if your point is to show GPU efficiency, because the RX 5700 XT isn't really designed to play at that resolution and doesn't have the expensive RAM that the 2080 Ti has (among other things).

    This is nothing new though; even the 2070 Super loses significant performance at 4K compared to the 2080 Ti, just the same as the RX 5700 XT. The reason is obvious: midrange cards perform best at the resolutions they were designed for.
    TempleOrion likes this.
