Nvidia GeForce GTX 1080 Ti Review: Think Titan XP, just a lot more affordable

@Steven Walton

Steve.

I did not notice any comments to the contrary so I can only assume that you tested the 1080 Ti with Asynchronous Compute DISABLED.

The GTX 1080 actually slowed down when running DX12 benchmarks, as Nvidia does not support multi-core gaming with Async Shader Pipelines or Asynchronous Compute Engines, both of which are protected by AMD patents and are a major performance multiplier while playing DX12 games that use 3D engines.
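For context, "async compute" at the DX12 API level simply means the application creates a separate compute command queue alongside its graphics queue, and the GPU may execute work from both concurrently. A minimal sketch of that setup (the helper function and variable names here are illustrative, not taken from the review or any particular engine):

#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Assumes 'device' is an already-created ID3D12Device.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Graphics (direct) queue: draw calls and general work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Dedicated compute queue: work submitted here can overlap with
    // graphics work if the GPU and driver schedule it concurrently.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));
}

Whether that second queue actually runs concurrently, and how much it helps, comes down to the GPU architecture and driver, which is exactly what the DX12 benchmark debate is about.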

I am also going to assume that you did not DISABLE that feature with the AMD dGPU cards tested?

Please confirm that you did NOT DISABLE Async Compute!! Your silence on this will be the answer that you did indeed DISABLE Async Compute.

Another great feature of DX12 that you did not seem to mention was bench scores using 2 dGPU cards: not in SLI or Crossfire, but rather Explicit Multi-Adapter (EMA), which is NATIVE to DX12. Of course DX11 supports SLI and Crossfire, though not as well, and the game needs to be heavily coded for multi-GPU; DX11 also only allows serial execution of multi-core gaming, if you can get past 2 cores actually doing any work. Again, DX12 allows multi-core gaming natively.

EMA of course allows DX12 to use all GPU resources available to it and offers massive scalability.
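For reference, explicit multi-adapter at the API level just means the application enumerates every GPU itself, creates a device per adapter, and divides the rendering work explicitly; there is no driver-managed SLI/Crossfire profile. A minimal sketch of that enumeration step (error handling omitted; the function name is illustrative):

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

// Create a D3D12 device on every hardware adapter in the system.
// Under explicit multi-adapter the application decides how to split
// work across these devices and how to share resources between them.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}

The flip side is that all of the work-splitting logic lives in the game itself, which is part of why relatively few titles shipped with EMA support and why reviewers rarely have anything to benchmark it with.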

It would appear you also did not compare multi-gpu scalability either.

You also did not run the Star Swarm benchmark. Why? Nor did you run the 3DMark API Overhead feature test. Again, WHY?

Not really a comprehensive review now is it?
Sounds like you should create your own site and do your own review. Then we can all stop by and tell you how to do your job! Deal?

PS: I couldn't give less of a sh*t about the missing tests you're whining about, so I won't be joining you, but good luck! :-D
 
@Steven Walton

Steve.

Please confirm that you did NOT DISABLE Async Compute!! Your silence on this will be the answer that you did indeed DISABLE Async Compute. [...] Not really a comprehensive review now is it?

No, no... no, please assume my silence is due to the fact that I don't engage with halfwits. If you had attempted at all to ask your question in a polite and respectful manner, I would have happily addressed it in detail. Instead, I'm going to give you the time you deserve and move on to someone else's question.

Wow. Gee...just $500? That's still more expensive than most modern full computers.

It is, but we are obviously playing on another level with this hardware.

I was thinking the same thing. The 980 Ti was the hottest card going for a long time and still competes well against the 10xx series cards.

The GTX 1080 Ti release caught me in the middle of updating the list of games I test with, so I didn't get to the previous-generation stuff yet. I will add more GPUs soon. That said, the GTX 980 Ti performs very similarly to the GTX 1070.
 
Now if Oculus would just release a version that actually looks good, and graphics cards get better AND cheaper... I'll open my wallet then. For now, these baby-step improvements are laughable.
 
The 4K performance is impressive.
Now you can get a single GPU that's 'somewhat affordable' for 4K gaming, or 2K @ 100+ Hz.

1440p widescreen curved monitors, typically 34" diagonal, demand roughly half the GPU power of 4K. This is about as close to 2K gaming as you can get in today's market, I think. You can drive one of those monitors at ultra graphical quality quite well with a GTX 1080.
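As a rough sanity check on that "about half" figure, comparing raw pixel counts only (per-pixel cost differences aside): 3440 x 1440 = 4,953,600 pixels for a 34" ultrawide versus 3840 x 2160 = 8,294,400 pixels at 4K, which works out to about 60%, so "roughly half" is in the right ballpark.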

Problem: there aren't any G-sync compatible 1440p widescreen curved monitors on the market, yet. They're all Freesync compatible. AMD doesn't offer any GPUs yet that can drive them well enough, but that should change when Vega gets here.

Rumor has it that Samsung's quantum dot monitors, including their 34" widescreen 1440p monitor, will be available for G-sync later this year. But that's only a rumor. I don't think Samsung has made any official announcements for it yet.

Gamers wanting to get into 1440p widescreen today are kinda screwed. You can drive one with an nVidia Pascal card, but you won't get any display sync. You can get display sync with an AMD GPU, but the frame rates will be terrible. Or you can wait, either for Vega to reach the market or for monitor vendors to offer G-sync in 1440p widescreen.

Be sure to pair G-sync monitors with an nVidia card, or Freesync monitors with an AMD card. If you neglect the proper pairing, you'll lose out on display sync, and that's a really nice technology for gamers to have.

There are 3440x1440 IPS curved G-Sync monitors @ 100Hz available from Asus and Acer. Acer also has a 2560x1080 curved ultrawide TN panel @ 200Hz. Really, the next steps are HDR, panel tech (quantum dot, IPS, etc.) and refresh speeds.
 
Problem: there aren't any G-sync compatible 1440p widescreen curved monitors on the market, yet. They're all Freesync compatible.

You did enough research to decide to post about 1440p ultrawides, but you somehow overlooked the existence of the Predator X34 and the ROG PG348Q?
 
Compared to the GeForce GTX 1080 Founders Edition board, the GTX 1080 Ti’s power subsystem has been substantially enhanced. If you recall, all GeForce GTX 10 Series GPUs were equipped with a dual-FET power supply on both the GPU and memory, which provided cleaner power to these components compared to prior GPUs. This improved power efficiency, reliability, and overclocking.

For the GTX 1080 Ti Founders Edition we’ve incorporated a 7-phase 2x dual-FET power design that’s capable of supplying up to 250 amps of power to the GPU.

That's not how things work, mate: you don't get "clean power" just because you have a "dual-FET" configuration. "Clean power" usually means either reduced reactive power in an AC system (not the case here, since this is a DC supply if I'm not mistaken) or power supplied by a renewable source such as solar panels. Efficiency has nothing to do with clean power. Also, you cannot have 250 amps of power; the ampere is a unit of current, and 250 A is a pretty big value, so the power bus would certainly look interestingly large. Maybe you meant 250 W (watts)?
 
That's not how things work, mate: you don't get "clean power" just because you have a "dual-FET" configuration. [...] Maybe you meant 250 W (watts)?

These words come from Nvidia's engineers; they know more than I do, so I took them at their word.
 
Also, you cannot have 250 amps of power; the ampere is a unit of current. [...] Maybe you meant 250 W (watts)?

These words come from Nvidia's engineers; they know more than I do, so I took them at their word.

Consider this: the rated Vcore on the GPU is in the 1.0 V to 1.25 V range and the power consumption is in the 250 W range. Divide the wattage by the low-end voltage and you get that 250 amp figure, which isn't as impressive given the card is rated at 250 watts in the first place, but at least it means there is headroom available if you want to overclock and push the card by increasing Vcore.
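Working through the arithmetic (P = V x I, so I = P / V): 250 W / 1.0 V = 250 A at the bottom of the Vcore range, while 250 W / 1.25 V = 200 A at the top of it, which is where the "up to 250 amps" figure comes from.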
 
Why no reviews comparing to the 980 Ti? I need to know if I should upgrade or wait a gen.
The answer is simple: nobody cares about the GTX 980 Ti anymore; it's at least a century old in tech terms. As for your decision to upgrade, only you can make that... possibly with a little help from your bank manager.
 
Doom is obviously super-optimized and Deus Ex seems not to be. Either way, the card looks great, but at 1440p my 1070 doesn't look too shabby.
 
Great review, thanks for including the RX cards and their GTX counterparts. I have an RX 480 4GB and after 6 months I can see that I made the right choice over the GTX 1060 3GB. It's weird to see the RX 470 beating the GTX 1060 3GB in some of the games tested.
 
"Problem: there aren't any G-sync compatible 1440p widescreen curved monitors on the market, yet."

Ahem, took me about five seconds to find one via Google on Newegg:

Acer Predator X34 Curved IPS NVIDIA G-sync Gaming Monitor 21:9 WQHD Display 100Hz Refresh Rate
https://www.newegg.com/Product/Prod...SmxYP_0S9P8Aoo-d5CozcaAih78P8HAQ&gclsrc=aw.ds
 
These price points always reek of an engineer's view of prices, not a marketing manager's. I bought video cards yearly for 12 years straight, but when cards that cost $8/unit to develop and design and $6 to manufacture are offered at over $122.50, I had to draw the line. I estimate that by not buying a video card for the past 12 years I have personally cost AMD well over $1000 in net profit, and probably closer to $1500. Couple that with no need to upgrade a CPU when I am not going to overpay for a graphics card, and they are out a few thousand... for me... a single customer. Clear, though anecdotal, evidence shows that at least 1 out of 5 people who play games stopped buying PCs because of the video card prices; some of them bought consoles (not me), and some just said mere engineers should not have anything to do with management decisions whatsoever.
 
And what would I use this card for? Oh! Must be gaming, 'cause there are all the gaming benchmarks. For the same money, I could have a sparkly new iPhone or Samsung Galaxy Android phone. Or I could buy groceries for my family for almost 2 months. Or gasoline to drive my car for a couple of months. Or... Or... Or... Frivolous! Or maybe I never learned to appreciate the value of gaming.

How about rendering of solid objects to be fabricated, another task that likes the best graphics and the fastest CPU? Or other relevant non-trivial non-gamer benchmarks?
 
Problem: there aren't any G-sync compatible 1440p widescreen curved monitors on the market, yet. They're all Freesync compatible. [...]

Say what! There have been heaps of 34" G-Sync monitors on the market for years now. Sure, more of them come with FreeSync, I think. Acer has its 34" Predator, and Asus has got one also. LG has a few models with FreeSync and, I think, one or two with G-Sync.
 
And what would I use this card for? Oh! Must be gaming, 'cause there are all the gaming benchmarks.
The GTX line from Nvidia is aimed at budget, mainstream, and enthusiast PC gamers; hence the gaming benchmarks.
For the same money, I could have a sparkly new iPhone or Samsung Galaxy Android phone. Or I could buy groceries for my family for almost 2 months. Or gasoline to drive my car for a couple of months. Or... Or... Or... Frivolous! Or maybe I never learned to appreciate the value of gaming.
A Hyundai Sonata starts at $21,950, a BMW 5-series starts at $51,200, and a Porsche Panamera starts at $85,400. All 3 vehicles have 4 doors, wheels, an engine of some sort, and all provide transportation. They differ in their performance.

While one can argue all one needs is the Sonata, those who want the BMW/Porsche do so for varying reasons.
How about rendering of solid objects to be fabricated, another task that likes the best graphics and the fastest CPU? Or other relevant non-trivial non-gamer benchmarks?
If you're interested in those benchmarks, you should likely be looking into the Quadro and FirePro lines. If you're a "hobbyist" looking to save money with the GTX line, there are likely forums with users like yourself where you can get this information once the cards have been in the wild for a while.
 
And what would I use this card for? Oh! Must be gaming, 'cause there are all the gaming benchmarks. [...]

I have to imagine you aren't our target audience....
 