Nvidia GeForce RTX 2080 Super Review

I hope AMD's 5800 XT (or whatever they may call it) is able to at least match this for £100 less. Then hopefully, Nvidia will churn out something which can convince me to move on from the 1080 Ti, just in time for "Next Gen" games to start arriving.
 
I bet GTX 1080 Ti owners are feeling really happy about their decision if even the newly released, same-price $700 2080 Super is only slightly faster... almost 2.5 years after the GTX 1080 Ti's release.
 
Very scientific and reasonable tests and comments. Thank you again, Steven. I just bought a Vega 56 three weeks ago. It cost me $297 then and runs all my games perfectly at 1080p. Plus, it's much more effective for mining Monero and ETH than AMD's new 5700-series younger brothers and Nvidia's Turing 20x0 series.
 
It's better than nothing; will Big Navi save us? It's pretty dispiriting that Bitcoin went down but GPU prices stayed the same... The Radeon VII's compute power is still pretty amazing, and it's selling used for more than the 2080, at $650+. If you bought it for gaming you can get your money back, or hope that Bitcoin booms.
 
That 1080 Ti must be some regret for Nvidia, I bet you we won't see that kind of card for a while :p:p
 
Most of the Pascal cards from the GTX 1060 upwards still look pretty decent today if you bought them at launch. You got your money's worth after all this time.

They were a good step above the GTX 900 series because they went to a new process. When they arrived I saw the step to 16 nm and realised it would likely be a long time before they were replaced with anything much better; process improvements were already slowing. So I bought one, and it's still difficult to justify replacing most of them without paying a premium.

The GTX 1080 is still comparable to brand-new $350 cards well over three years after launch, and the GTX 1080 Ti is still comparable to $700 cards nearly two and a half years after launch.
 
I really wish AMD could create something more competitive than the Radeon VII at the $600+ range. It is great to see AMD and Nvidia competing in the $350 to $500 range (although the RX series looks more tempting to me unless/until ray tracing takes off), but as it currently stands, Nvidia only stands to cannibalize its own dominance at the high end of the market.

We are going to need some beefy new cards over the next few years as 4K becomes more popular and VR headsets continue to improve their specs.
 
"Without a doubt, GTX 1080 Ti owners can rejoice at the news."

I guess... Thing is, as one of those "lucky" people, I'd actually prefer to have another jump in performance like the ones we previously got: Kepler -> Maxwell, Maxwell -> Pascal. Meaningful advancements are exciting. Look how sad and boring the CPU market had been before Ryzen arrived on the scene. I don't think tech enthusiasts are ever happy about such scenarios. That's just me though.

I do have one question about your testing methodology though: is there a particular reason you do not test/show frame times in your gaming tests? Average fps is an okay-ish tool, at best, to gauge how good a particular piece of tech is. 1% lows are nice and provide a more detailed view, but they are still not as good as frame time graphs.
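Just to make concrete what I'm getting at (numbers invented purely for illustration), here's a rough sketch of how the two usual summary figures are boiled down from the full frame-time trace that never actually gets shown:

```python
# Toy example: reducing a frame-time trace (milliseconds per frame)
# to the two summary numbers most reviews report.
# The data below is invented purely for illustration.
frame_times_ms = [16.7] * 95 + [40.0] * 5   # mostly smooth, plus a few slow frames

def average_fps(trace_ms):
    # Average fps = total frames rendered / total time in seconds.
    return len(trace_ms) / (sum(trace_ms) / 1000.0)

def one_percent_low_fps(trace_ms):
    # One common definition of 1% lows: the fps equivalent of the
    # average of the slowest 1% of frames in the run.
    slowest = sorted(trace_ms, reverse=True)[: max(1, len(trace_ms) // 100)]
    return 1000.0 / (sum(slowest) / len(slowest))

print(f"Average fps: {average_fps(frame_times_ms):.1f}")          # ~56 fps
print(f"1% low fps:  {one_percent_low_fps(frame_times_ms):.1f}")  # ~25 fps

# A frame time graph would instead plot every value in frame_times_ms,
# so the handful of 40 ms hitches show up as individual spikes rather
# than being folded into one or two summary numbers.
```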

I am a huge fan of your 30+ game tests. That's one of the main reasons I choose to visit your site. Thought I would throw that in just so it wouldn't look like I'm only here to criticise your work, which I actually appreciate and respect :)
 
"....Coming in at the same $700 price point as its predecessor, "

Typical nGreedia. They hope some of their fanboys won't notice it's just a marginally better clone for the same amount of money... pathetic.
 
Super... flop!
The 2080 SUPER offers a 2~8% increase in performance... and that's from a hardware refresh, not a software one? lol...


Thank god for AMD's drivers then, because my Radeons got the same kind of improvement in games within a year... and I didn't even have to buy a new card or pull the old one.
 
I really wish AMD could create something more competitive than the Radeon VII at the $600+ range. It is great to see AMD and Nvidia competing in the $350 to $500 range (although the RX series looks more tempting to me unless/until ray tracing takes off), but as it currently stands, Nvidia only stands to cannibalize its own dominance at the high end of the market.

We are going to need some beefy new cards over the next few years as 4K becomes more popular and VR headsets continue to improve their specs.

The Radeon VII is EOL (no longer being sold), because in many games the 5700 XT comes close and in some instances beats it.

Even if ray tracing takes off in games next year, my RTX 2080 won't be able to play those games any better than it does now. Turing's ray tracing is broken... and Nvidia needs a new GPU before they can try to sell the "RTX On" bullcrap to real gamers.

Yes, AMD is coming out with the Radeon 5800 series soon for the 4K market.
 
That 1080 Ti must be some regret for Nvidia, I bet you we won't see that kind of card for a while :p:p
Not really. AMD doesn't compete in the top end, so Nvidia can get away with releasing the 1080 Ti for the third time.
 
Not really. AMD doesn't compete in the top end, so Nvidia can get away with releasing the 1080 Ti for the third time.

But the RTX 2080 is not a GTX 1080 Ti, that was my point. Nvidia can't seem to really beat it for the same price, and for a $700 card it was surprisingly good value for money when it came out. It still performs well, so people won't upgrade from it. It's the Windows XP of video cards and RTX is Vista before SP1 ;-)
 
I do have one question about your testing methodology though: is there a particular reason you do not test/show frame times in your gaming tests? Average fps is an okay-ish tool, at best, to gauge how good a particular piece of tech is. 1% lows are nice and provide a more detailed view, but they are still not as good as frame time graphs.
Probably because frame time graphs can get very messy, especially if one is trying to compare multiple graphics cards; as standalone items, I'm not convinced they actually give any additional information that's worth knowing about. Unless one knows exactly what is being asked of the driver and GPU at any given point in a benchmark, frame times are no more useful than knowing the average frame rate. Personally, I like to see just two statistics displayed in benchmarking (one measure of central tendency and one measure of dispersion), and the TechSpot graphs meet that requirement.
 
So this is NVIDIA's lame-a$$ excuse for users to upgrade. Anybody interested in RTX is better off going with a 2070 Super, nuff said. I, on the other hand, will still comfortably rely on my 1080 Ti for my graphics-intensive tasks ;-).
 
But the RTX 2080 is not a GTX 1080 Ti, that was my point. Nvidia can't seem to really beat it for the same price, and for a $700 card it was surprisingly good value for money when it came out. It still performs well, so people won't upgrade from it. It's the Windows XP of video cards and RTX is Vista before SP1 ;-)
Sure, I just think they can do better but they have no reason to. If your Windows analogy is true then we're gonna get some crazy good GPU in the future (Windows 7) :)
 
I hope AMD's 5800 XT (or whatever they may call it) is able to at least match this for £100 less. Then hopefully, Nvidia will churn out something which can convince me to move on from the 1080 Ti, just in time for "Next Gen" games to start arriving.
That right there is what I think brought about this situation in the first place: performance stagnation and less value at any given tier due to a monopoly. People want competition only to justify buying from the same company that already dominates the market, even when the competitor is equal or better. That's why people bought Nvidia when AMD had similar or better products, and now look at what we got: $1,200 GPUs. People still think the same way and make the same decisions, so the cycle repeats.
 
I do have one question about your testing methodology though: is there a particular reason you do not test/show frame times in your gaming tests? Average fps is an okay-ish tool, at best, to gauge how good a particular piece of tech is. 1% lows are nice and provide a more detailed view, but they are still not as good as frame time graphs.
You're on the wrong site to be making such out-of-touch demands; you're better off checking out Gamers Nexus for that type of analysis, lol ;)
 
Probably because frame time graphs can get very messy, especially if one is trying to compare multiple graphics cards; as standalone items, I'm not convinced they actually give any additional information that's worth knowing about. Unless one knows exactly what is being asked of the driver and GPU at any given point in a benchmark, frame times are no more useful than knowing the average frame rate. Personally, I like to see just two statistics displayed in benchmarking (one measure of central tendency and one measure of dispersion), and the TechSpot graphs meet that requirement.

Other sites seem to do a good job of making frame time graphs readable, so I don't really see that part as an issue.

I have to disagree with the statement that average fps is just as useful as frame time graphs. Intermittent spikes in frame delivery are going to be obfuscated by an avg FPS measurement, whereas on a frame time graph they will be visible for anyone to see.

Frame time testing showed that multi-GPU setups had high avg FPS numbers, sure, but they were also having serious frame delivery problems causing "hitches" or jumps during gameplay. You can't show such problems using only avg FPS.
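To illustrate (made-up numbers again): two setups can post an identical average while only one of them delivers frames evenly, and only the per-frame data reveals the difference.

```python
# Two hypothetical frame-time traces (ms per frame) that render 100 frames
# in exactly the same total time. Values are invented for illustration.
smooth  = [20.0] * 100            # every frame delivered in 20 ms
stutter = [10.0, 30.0] * 50       # alternating fast/slow frames (microstutter)

def average_fps(trace_ms):
    # Average fps = total frames rendered / total time in seconds.
    return len(trace_ms) / (sum(trace_ms) / 1000.0)

print(average_fps(smooth), average_fps(stutter))   # 50.0 and 50.0 -> identical average
print(max(smooth), max(stutter))                   # 20.0 vs 30.0 ms worst frame

# Both traces average 50 fps, but the second one swings between 10 ms and
# 30 ms every other frame. That shows up as judder in play and as spikes on
# a frame time graph, yet the average fps figure alone cannot tell them apart.
```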
 