Nvidia GeForce GTX 1080 Ti Review: Think Titan XP, just a lot more affordable

It's too expensive for a graphics card, but the price reduction of the 1080 is a good thing; it should drop the price of the 1070 and maybe the 1060 as well. Those are the sweet spot cards for performance/price.
 
The 4K performance is impressive.
Now you can get a single GPU that's 'somewhat affordable' for 4K gaming, or 2K @ 100+ Hz.

Nice overclock on your 7700K BTW, although if you can get 4.9GHz you can get 5.0GHz!
God I envy your job... now send me your sample 1080Ti as an early birthday present and tell the supplier your dog ate it. They will understand.
 
Whelp, now I just have to wait for the board partners to release their versions and pick up two for SLI....
Should be quite a nice upgrade from my SLI 780 Ti setup.
 
Nice, I think my power supply can still handle that... now I get to wait and see how the partner cards turn out... If, as mentioned and wished for earlier, Gigabyte takes its G1 cooler and slaps it on the 1080 Ti, I'll probably get it ha ha ha.
 
It's too expensive for a graphics card, but the price reduction of the 1080 is a good thing; it should drop the price of the 1070 and maybe the 1060 as well. Those are the sweet spot cards for performance/price.
I think it did - I picked up a great deal in the USA on an MSI GTX Gaming X 6G for $229.99 with a $10 MIR, bringing it down to $219.99. My state's usage tax is 4%, so if I remember to claim it next April I will owe $8.80 in taxes, covered by the MIR.
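For anyone double-checking that math, a quick sketch (the prices and the 4% usage-tax rate come straight from the comment above; nothing else is assumed):

```python
# Sanity check of the rebate and usage-tax figures quoted above.
price = 229.99               # MSI card purchase price
mir = 10.00                  # mail-in rebate
net = price - mir            # effective price after the rebate
tax = round(net * 0.04, 2)   # 4% state usage tax on the net price

print(f"net = ${net:.2f}, tax owed = ${tax:.2f}")  # net = $219.99, tax owed = $8.80
```

So the $10 rebate does indeed cover the $8.80 of usage tax, with change to spare.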

I am a happy boy upgrading from a GTX 760 and checking the tracking number a couple times a day. Friday before 8PM can't come soon enough.
 
I don't understand how AMD can leave the high end to Nvidia for well over a year. Are resources that tight over at AMD?
 
Why no reviews comparing to the 980 ti? I need to know if I should upgrade or wait a gen.
Here's a shout-out to Gamers Nexus for a few titles:
http://www.gamersnexus.net/hwreviews/2830-nvidia-gtx-1080-ti-fe-review-and-game-benchmarks/page-5
And the conclusion:
For owners of a GTX 980 Ti, this is a sizeable upgrade, but you’ve still got mileage left on your card. It’s the 780 Ti that would be interesting to look at, and unfortunately that’s also a victim of time limitations. Still, based on just extrapolating performance, that’s probably where an upgrade makes the most immediate sense. It just comes down to what resolution you’re trying to play on.
 
Why no reviews comparing to the 980 ti? I need to know if I should upgrade or wait a gen.

I was thinking the same thing. The 980 Ti was the hottest card going for a long time and still competes well against the 10xx series cards.
 
@Steven Walton

Steve.

I did not notice any comments to the contrary, so I can only assume that you tested the 1080 Ti with Asynchronous Compute DISABLED.

The GTX 1080 actually slowed down when running DX12 benchmarks, as Nvidia does not support multi-core gaming with Async Shader Pipelines or Asynchronous Compute Engines, both of which are protected by AMD patents and are a major performance multiplier when playing DX12 games that use 3D engines.

I am also going to assume that you did not DISABLE that feature on the AMD dGPU cards tested?

Please confirm that you did NOT DISABLE Async Compute!! Your silence on this will be taken as the answer that you did indeed DISABLE Async Compute.

Another great feature of DX12 that you did not seem to mention is benchmark scores using two dGPU cards: not in SLI or CrossFire, but rather Explicit Multi-Adapter (EMA), which is NATIVE to DX12. Of course DX11 supports SLI and CrossFire, but not as well, and the game needs to be heavily coded for multi-GPU; DX11 also only allows serial execution of multi-core gaming, if you can even get past two cores actually doing any work. Again, DX12 allows multi-core gaming natively.

EMA of course allows DX12 to use all GPU resources available to it, and offers massive scalability.

It would appear you also did not compare multi-gpu scalability either.

You also did not run the Star Swarm benchmark. Why? Nor did you run the 3DMark API Overhead feature test. Again, WHY?

Not really a comprehensive review now, is it?
 
I don't understand how AMD can leave the high end to Nvidia for well over a year. Are resources that tight over at AMD?

Actually, two RX 480s will outperform a GTX 1080 for far less money when running DX12 3D game engines.

TechSpot's handlers, Intel and Nvidia, really do not want you to know this, or they would have run not only SLI and CrossFire in DX11 but also Explicit Multi-Adapter tests in DX12.

DX12 EMA is awesome for the consumer. But TechSpot downplays that, as Nvidia is really only optimized for the obsolete DX11 API, hence the heavily weighted DX11 scores and manipulated DX12 scores. Many DX12 features were DISABLED.
 
Actually, two RX 480s will outperform a GTX 1080 for far less money when running DX12 3D game engines.

TechSpot's handlers, Intel and Nvidia, really do not want you to know this, or they would have run not only SLI and CrossFire in DX11 but also Explicit Multi-Adapter tests in DX12.

DX12 EMA is awesome for the consumer. But TechSpot downplays that, as Nvidia is really only optimized for the obsolete DX11 API, hence the heavily weighted DX11 scores and manipulated DX12 scores. Many DX12 features were DISABLED.

So will a pair of 1060s in SLI, but, you know, they'll suck less at legacy DX11 games... so how about you respond to my comment instead of going on conspiratorial rants...
 
It would appear you also did not compare multi-gpu scalability either.
How dare they.

Not really a comprehensive review now is it?
It's too bad TechSpot's staff aren't as skilled or mature as you.
I am looking forward to reading your comprehensive review and learning more about the DX12 performance and features those crappy GTX cards just can't handle.

Actually, two RX 480s will outperform a GTX 1080 for far less money when running DX12 3D game engines.
Holy crap really?!
 
It's too expensive for a graphics card, but the price reduction of the 1080 is a good thing; it should drop the price of the 1070 and maybe the 1060 as well. Those are the sweet spot cards for performance/price.

The 1070 and the 1080 will drive 1440p monitors at ultra-high graphics settings fast enough to please any but the most extreme gamers.

Even the Titan XP and the 1080 Ti can't drive 4K monitors much past 60 FPS - if you can even find 4K monitors capable of more than 60 Hz (I think they're still a bit rare). Worse, 4K on 27" or 28" diagonal monitors is just too hard to read unless your nose is pressed against the glass. Too much resolution for the size.

So the sweet spot for most gamers really is to be found at 1440p with the 1070 or 1080 GPUs. It's for that reason that the most significant recent announcement from Nvidia wasn't the 1080 Ti, but the big price drop on the 1080. In this generation of gaming, the 1080 is all the card you could need for gorgeous 1440p gaming, while none of the GPUs really show off well at 4K. And we need bigger, faster 4K monitors if we're going to game on them, anyway.

4K gaming isn't quite here yet, in my view. When you can get 42" 4K monitors at 120 Hz and next-gen GPUs to drive them, 4K will be hard to resist, even at the exorbitant prices we'll be likely to see. Meantime, 1440p gaming is great. And it just got more affordable.
 
The 4K performance is impressive.
Now you can get a single GPU that's 'somewhat affordable' for 4K gaming, or 2K @ 100+ Hz.

1440p widescreen curved monitors, typically around 34" diagonal, demand roughly half the GPU power of 4K. That's about as close to 2K gaming as you can get in today's market, I think. You can drive one of those monitors at ultra graphical quality quite well with a GTX 1080.
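A rough pixel-count check of that "about half" estimate (assuming the common 3440x1440 resolution for a 34" ultrawide, which the comment doesn't state explicitly):

```python
# Compare the pixel counts a GPU has to push at 4K vs. 1440p ultrawide.
pixels_4k = 3840 * 2160         # 8,294,400 pixels
pixels_ultrawide = 3440 * 1440  # 4,953,600 pixels

ratio = pixels_ultrawide / pixels_4k
print(f"ultrawide pushes {ratio:.0%} of the pixels of 4K")  # ~60%
```

It works out to roughly 60% of the pixels, so "about half the GPU power" is a fair ballpark, though real frame rates don't scale perfectly with pixel count.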

Problem: there aren't any G-Sync-compatible 1440p widescreen curved monitors on the market yet. They're all FreeSync-compatible. AMD doesn't yet offer any GPUs that can drive them well enough, but that should change when Vega gets here.

Rumor has it that Samsung's quantum dot monitors, including their 34" widescreen 1440p monitor, will be available with G-Sync later this year. But that's only a rumor. I don't think Samsung has made any official announcements about it yet.

Gamers wanting to get into 1440p widescreen today are kinda screwed. You can drive one with an Nvidia Pascal card, but you won't get any display sync. You can get display sync with an AMD GPU, but the frame rates will be terrible. Or you can wait, either for Vega to reach the market, or for monitor vendors to offer G-Sync in 1440p widescreen.

Be sure to pair G-Sync monitors with an Nvidia card, or FreeSync monitors with an AMD card. If you neglect the proper pairing, you'll lose out on display sync, and that's a really nice technology for gamers to have.
 