GeForce RTX 2080 Ti & 2080 Mega Benchmark: 35 Game Benchmark Test

Looking at the tests, I have only one question: what is the price difference between the 1080 and the 2080?
The performance increase over the 1080, compared to what the 1080 offered at its own launch, is too small. This year it is simply better to spend the money on something else that's exciting.
So long, Huang!
 
What I want to know, before I order one, is whether the RTX 2080 or 2080 Ti provides enough performance to guarantee a smooth experience of web browsing and video watching on an 8K monitor (YouTube + VLC).

Could you please add at least some basic 8K benchmarks? I don't care much about gaming as such, because I know it is not realistic at 8K anyhow.
 
This is the first time I'll skip the flagship GPU of a series since I bought the GTX 680. The increase in performance over the 1080ti is simply not worth the massive increase in price. Sure it offers 60+ fps in most titles at 4k but with a G-sync monitor the 1080ti is still pretty solid. At 1440p the difference in performance is even smaller due to CPU bottleneck. If you own a 1440p 144Hz monitor like I do and hoped that the 2080ti would finally allow you to enjoy gaming at around 144fps, well you're out of luck. Besides you're going to need a very, very fast CPU to not bottleneck it at this resolution (I'm surprised even the 8700K, which is very fast, bottlenecks it at 1440p).
 
What I want to know, before I order one, is whether the RTX 2080 or 2080 Ti provides enough performance to guarantee a smooth experience of web browsing and video watching on an 8K monitor.

Could you please add at least some basic 8K benchmarks? I don't care much about gaming.

Considering that a GTX 1050 has the necessary DisplayPort 1.4 output to handle 8K@60Hz video playback (https://www.nvidia.com/en-sg/geforce/products/10series/compare/), you don't need either of these GPUs for that.
 
Considering that a GTX 1050 has the necessary DisplayPort 1.4 output to handle 8K@60Hz video playback (https://www.nvidia.com/en-sg/geforce/products/10series/compare/), you don't need either of these GPUs for that.
He might mean the GPU grunt needed to smoothly play back 8K video.
All I can say is, I tested 8K 60fps video from YouTube (in Chrome) and hardware acceleration kicked in; GPU usage on a 1080 Ti was about 90%. Thing is, as far as I'm aware, the video decoding engine in the 1080 Ti is the same as in the rest of the lineup, so a 1050 will perform exactly the same.
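For anyone who wants to verify this themselves, here's a minimal sketch, assuming the pynvml Python bindings are installed (e.g. via nvidia-ml-py) and the card in question is GPU index 0, that logs overall GPU load against the dedicated video decoder (NVDEC) load while an 8K clip plays. If the decoder column is the one doing the work, the exact GPU model matters far less than the decode block it ships with:

```python
# Minimal sketch: poll overall GPU load vs. video-decoder (NVDEC) load
# once per second while an 8K clip plays in the browser or VLC.
# Assumes the pynvml bindings are installed and GPU index 0 is the card in question.
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetUtilizationRates, nvmlDeviceGetDecoderUtilization,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    print("sec   gpu%   decoder%")
    for sec in range(30):                                    # sample for ~30 seconds
        util = nvmlDeviceGetUtilizationRates(handle)         # .gpu = 3D/compute engine load
        dec, _period = nvmlDeviceGetDecoderUtilization(handle)  # NVDEC load
        print(f"{sec:>3}   {util.gpu:>4}   {dec:>8}")
        time.sleep(1)
finally:
    nvmlShutdown()
```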
 
Reality is these cards are not going to sell in vast numbers (although they carry a high margin), and the lower down the range you go, the more cards sell and the more people that performance matters to.

I await the RTX 2070. I said before that, just looking at the specs on paper, I see it struggling in games against a GTX 1080, and that seems extremely likely now. In fact I can see an overclocked GTX 1070 Ti being within reach of it. It's very possible that at 1440p a $399 GTX 1070 Ti will be within just a few percent of a $550+ RTX 2070.

Pricing will be a serious problem for these cards for most people.
 
Unless you have more money than sense, the GTX 1080ti is still the way to go.

The whole ray tracing thing reminds me of the 3D TVs that were crammed down our throats but that people rarely used. I'm sure it will be the norm in 3-4 years. But for now, it's not worth shelling out double the money only to get the same frame rate as a 1080ti.

Props to Steve for putting together another excellent analysis.
 
What I want to know, before I order one, is whether the RTX 2080 or 2080 Ti provides enough performance to guarantee a smooth experience of web browsing and video watching on an 8K monitor.

Could you please add at least some basic 8K benchmarks? I don't care much about gaming.

Considering that a GTX 1050 has the necessary DisplayPort 1.4 output to handle 8K@60Hz video playback (https://www.nvidia.com/en-sg/geforce/products/10series/compare/), you don't need either of these GPUs for that.


That is correct; playback is not the same as gaming, which calculates and renders frames on the fly.
 
This whole overpriced GPU and "scarcity" scheme appears to be one big scam.

I was hoping for something better from NVIDIA. The ray tracing is nice, but the new cards are way overpriced and not worth the cost. I am seriously considering buying a couple of used 1080s off eBay and waiting until Intel gets back in the game in 2020. The competition should help bring these prices back down.
 
1080 for me in the next few months unless the 2070 somehow beats it at a reasonable price (which seems unrealistic at this time). If you want to be nice to your PSU, then the 2080 may be preferable to the 1080 Ti if you're building new or upgrading.

If buying new, IMO a 20 series instead of a similar price/performance 10 series is a better option *for the future* if games adopt DLSS and get a performance boost from it. If they don't, you've still paid the same for the same performance. You're not out anything. If DLSS does add performance, then you get a benefit for no increase in cost.
 
The whole ray tracing thing reminds me of the 3D TVs that were crammed down our throats but that people rarely used. I'm sure it will be the norm in 3-4 years.

In which universe are 3D TVs the norm??

On subject though, I've been around long enough to know that any new (r)evolutionary graphics tech (hardware T&L, programmable shaders) takes a few generations before the performance penalty becomes acceptable enough for widespread adoption in software.

The big question is: when will consoles implement it? Will AMD have affordable hybrid ray tracing hardware ready for the PS6/Xbox Next Next? Or will Nvidia become the new preferred GPU supplier for mass-produced low-power boxes?

Because once consoles are ray tracing capable, we will begin to see the rendering tech implemented in the majority of games as standard.
 
I just can't justify paying $900 for a GPU that's already 1.5 years old, has had its day and has mature drivers, versus a $1,000 GPU that was benchmarked with beta drivers and unoptimized games. I suspect the 20 series will get faster in the coming months as the drivers are finalized and have even the slightest time to mature.

I just can't accept that Nvidia took two years and this is the best they can do; if it is, well then I can always keep waiting for something better. At the same time, though, I've been holding out with my 970 way too long and have been itching to use my 4K monitor, and now a 4K TV, to their potential, something the 2080 seems quite capable of doing. I can always use the EVGA Step-Up program I'm currently eligible for, which would cost $10 less than a 1080 Ti, or roughly $740 CAD, which seems like a pretty good deal for an RTX 2080, no?
 
Great article. Thanks for the thorough coverage benchmark wise.

Agree with the assessment. This helped me decide on my upgrade path.
Currently using a 760 and a 660 in two computers, but I was able to nail down a couple of 1060s for a great price on eBay. This should hold me over until 4K gets more standardized at a reasonable price.
 
Ultimately these cards aren't going to show a huge performance boost in traditional tests (the 2080 is better than the 1080 Ti by all accounts). Where you will actually see a net benefit is in stuff that makes use of the tensor cores. Testing them against traditional cards shows about the same jumps we usually see when a new card is released and doesn't really test what these cards are ultimately all about.

I'd be curious to see if people can make use of the tensor cores to help with things other than raytracing.
 
This is like the CPU evolution... eventually silicon limitations are reached and each new generation is just a bit faster than the previous one. They should get rid of silicon and switch to germanium.
 
Is there any indication of how much power will be consumed once something, a game or benchmark, is utilizing the additional cores on these cards? Will all three 'sections' of the card be able to hum along full blast without hitting power/heat limits? Or was this not even a concern for Nvidia because the useful life of the cards will be up by the time games and coding catch up to what they can do?
 
If you own a 1440p 144Hz monitor like I do and hoped that the 2080ti would finally allow you to enjoy gaming at around 144fps, well you're out of luck.

This is not true.
As long as the monitor supports G-Sync, it will work fine and be much smoother.
The GPU doesn't have to hit 120/144/165 FPS at all times to justify the monitor's high refresh rate.

Besides you're going to need a very, very fast CPU to not bottleneck it at this resolution (I'm surprised even the 8700K, which is very fast, bottlenecks it at 1440p).

I don't think you're understanding what you are seeing.
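If you want to sanity-check the bottleneck claim rather than argue about it, a rough test, just a sketch assuming nvidia-smi is on the PATH, is to log GPU utilization while the game runs. A GPU sitting well below ~95% busy while the frame rate falls short of your target usually points at a CPU limit; a pegged GPU means the card itself is the ceiling:

```python
# Rough CPU-vs-GPU bottleneck check: poll GPU utilization via nvidia-smi
# once per second while the game is running. Assumes nvidia-smi is on the PATH.
import subprocess
import time

samples = []
for _ in range(60):                                    # ~60 seconds of gameplay
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    samples.append(int(out.splitlines()[0]))           # first GPU only
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"average GPU utilization: {avg:.0f}%")
print("likely CPU-limited" if avg < 90 else "the GPU itself is the limiter")
```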
 
I’m getting fed up with hearing complaints about the price. Nvidia have stated that they aren’t discontinuing the 10 series. These cards are designed to sit above the 10 series and give those who have more money more things to choose from, like cards with ray tracing and deep learning capability, something I personally, as a tech enthusiast, think is genuinely amazing! Real-time ray tracing, for crying out loud! That is obviously no gimmick. In 5-10 years’ time it will be a normal feature and our games will look amazing for it.

Can someone please explain to me why Nvidia should sell their products for less? They have no competition and they aren’t a charity.

No one has to buy these cards; if you don’t like the price, don’t buy one.
 
This is the first time I'll skip the flagship GPU of a series since I bought the GTX 680. The increase in performance over the 1080ti is simply not worth the massive increase in price. Sure it offers 60+ fps in most titles at 4k but with a G-sync monitor the 1080ti is still pretty solid. At 1440p the difference in performance is even smaller due to CPU bottleneck. If you own a 1440p 144Hz monitor like I do and hoped that the 2080ti would finally allow you to enjoy gaming at around 144fps, well you're out of luck. Besides you're going to need a very, very fast CPU to not bottleneck it at this resolution (I'm surprised even the 8700K, which is very fast, bottlenecks it at 1440p).
No way in hell does your CPU bottleneck anything. What are you on about?

I have a 1080 with a 4770K and play games at 1440p smooth as hell, no issues at all. What nonsense.
 