Bottom Line: We'd Only Pay This Much

Rehashing our conclusion from the original RTX 2080 review would make no sense; instead we want to expand upon it. In summary though, the vanilla RTX 2080 is too expensive at launch to make sense. Even assuming a best case scenario where it sells at the $700 MSRP, a price you may never see in the short term, it doesn't currently offer anything new in terms of regular gaming. We've already had this level of performance and value from the GTX 1080 Ti.

Right now, it's very much looking like the RTX 2080 Ti will be the only 20 series model worth purchasing, unless we see a massive price correction or a new technology like DLSS becomes widely supported and works really well. At this point though, we know very little regarding how well DLSS really works. We'll be digging into this soon and following its development, but right now I wouldn't purchase an RTX graphics card in the hope that DLSS is amazing.

Now, I said the 2080 Ti is the only 20 series model worth purchasing but I didn’t say that because it’s good value and I’m not actually even recommending you buy it. Personally I’d still just get the GTX 1080 Ti and compromise on visual quality settings at 4K. But if you’ve got bottomless pockets and you don’t want to compromise, then the experience offered by the 2080 Ti really is something else. So the gamble that ray tracing and DLSS end up being worthwhile features for this generation is less of an issue for the 2080 Ti. We already know you’ll be receiving never-seen-before 4K performance out of a single graphics card, so at least there is that.

Getting back to the RTX 2080, why you would pay a 20% premium over the 1080 Ti at present is beyond me. It'll be interesting to see how many gamers do, and whether Nvidia will be forced to lower the MSRP in the near future due to a slowdown in sales. But if gamers continue to snap them up at alarming rates, well, we're all doomed.

Next month the RTX 2070 is set to arrive and will likely be available for no less than $600, probably a tad more than that. I'm expecting GTX 1080-like performance, maybe a little better overall, but a similar experience to what we're getting now for $470. We're working under the assumption that ray tracing is going to be a complete bust for the 2070, and it's hard to say how well DLSS will work since we don't know how well DLSS works, period.

By the time the RTX 2070 is released next month we may very well still be in the dark regarding these features, so that's another big gamble there. But looking at the list of games that are advertised to support either ray tracing or DLSS, it's already looking like it won't be worth the gamble. It really is all about the pricing: the AIB MSRPs for these new Turing GPUs are the absolute maximum you should even entertain paying for one of the new cards.

If we ignore ray tracing and DLSS for a moment, and we really ought to given they are still an unknown quantity at this point, that leaves us with the raw gaming performance we just looked at...

Power consumption is a little better but overall much the same, and overclocking headroom is no better. The GTX 1080 Ti has been doing its thing for a year and a half now and the 1080 has been around for a little over two years. To justify the existence of these new parts, the RTX 2080 Ti needs to cost no more than $800 and the RTX 2080, $600. At those prices they would be pretty exciting and worth buying in my opinion. Short of that, I would sit this generation out.

Back when Nvidia announced the specs and pricing of these new RTX GPUs a month ago, I said the following…

In any case just don’t pre-order, it’s not a smart idea. For now ray tracing looks to be more hype than anything and Nvidia are using it to hide the fact that they don’t actually have anything new to sell you, at least just yet. The real performance gains will come when they move to the 7nm process next year, so I don’t see the GeForce 20 series having a long lifespan, certainly nothing like what we saw with Pascal.

Nvidia simply isn’t going to accept giving up the performance crown to AMD and I expect that this is exactly what they will be doing with the GeForce 20 series next year, if they don’t act and push out 7nm GPUs as soon as possible.

So the GeForce 20 series looks to be a money grab in my opinion, a stop gap to 7nm if you will. Cash in now on the RTX hype and then deliver the real goods next year. After all, the RTX name change is designed to make you think your Pascal GTX 1080 Ti, GTX 1080 or GTX 1070 are now heavily outdated, you gotta dump that GTX dinosaur and get in on the RTX action.

I think this is why Nvidia's pushing the RTX 2080 Ti out of the gate right away, rather than waiting a year like they did with the 1080 Ti. They aren't in a position to milk this series over a two-year period. In fact, in my opinion they don't even have an entire year, so the plan is to ravage gamers' wallets from day one.

I still believe all that to be true, with the exception of the 7nm part. Since writing that article, news has surfaced that Turing was originally intended to be fabricated on a 10nm process from Samsung, but due to delays Nvidia was forced back to TSMC's 12nm process.

This means it's possible Nvidia will refresh Turing next year on Samsung's 8nm process, an extension of their 10nm node with a reduction in transistor size. If this is true, then it would seem AMD has received priority on TSMC's 7nm process, somehow forcing Nvidia to look elsewhere. It seems unlikely to me that Nvidia would look elsewhere if they weren't forced to.

It will be interesting to see how this plays out, as Samsung has never fabricated such a large chip. The biggest thing they ever made on their 14nm process was 150 mm², so a little bigger than the GP107 die used by the GTX 1050 Ti. Paying a big premium for a product that's probably going to have the lifespan of a mayfly won't be the wisest investment you'll ever make.

Shopping Shortcuts: