Nvidia GeForce GTX Titan X Review: Bloody fast, surprisingly efficient

Steve



A year after the original Titan's release, Nvidia followed up with a full 2880-core version known as the Titan Black, which boosted the card's double-precision performance from 1.3 to 1.7 teraflops. A month later, the GTX Titan Z put two Titan Blacks on one PCB for 2.7 teraflops of compute power. Unfortunately, this card never made sense at $3,000 -- triple the Titan Black's price.
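Those double-precision figures can be sanity-checked straight from the core counts. A minimal sketch, assuming GK110's 1/3-rate FP64 and approximate reference clocks (889MHz for the Titan Black, 705MHz per GPU for the Titan Z) rather than anything stated in the article:

```python
# Theoretical FP64 throughput: cores * 2 FLOPs/cycle (FMA), scaled by the DP rate.
# Clock values below are approximate reference clocks, not taken from the article.

def dp_tflops(cuda_cores, clock_ghz, dp_ratio):
    """Peak double-precision TFLOPS for a GPU running FP64 at 1/dp_ratio speed."""
    return cuda_cores * 2 * clock_ghz / dp_ratio / 1000

titan_black = dp_tflops(2880, 0.889, 3)   # ~1.7 TFLOPS
titan_z = 2 * dp_tflops(2880, 0.705, 3)   # two GPUs at lower clocks, ~2.7 TFLOPS

print(round(titan_black, 2), round(titan_z, 2))
```

The lower per-GPU clocks are why two Titan Black-class GPUs on one PCB land at 2.7 rather than 3.4 teraflops.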

Since then, the GeForce 900 series has arrived, with the GTX 980's unbeatable performance-per-watt leading the charge as today's undisputed single-GPU king. We knew there was more to come from Maxwell, and six months after the GTX 980's release Nvidia is back with the Titan X, a card that's bigger and more complex than any other.

There's plenty to be psyched about here, with headline features including 3,072 CUDA cores, 12GB of GDDR5 memory running at 7Gbps, and a whopping 8 billion transistors. The Titan X is a processing powerhouse.
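As a rough sanity check on that 7Gbps memory figure: peak bandwidth is just the per-pin data rate times the bus width. A quick sketch, assuming GM200's published 384-bit bus (an assumption here, not a figure from the excerpt above):

```python
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin rate * bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# 7Gbps GDDR5 on a 384-bit bus
print(bandwidth_gbs(7, 384))  # 336.0
```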

Read the complete review.

 
As someone who tends to bleed green after being turned away from ATI/AMD by all the headaches I had with the HD 5790 back in the day (hands-down the worst card I've ever owned), I can definitely see myself going back to AMD if Nvidia doesn't release something more reasonable in the next quarter.

Seriously Nvidia, what were you thinking, pricing this thing at $1,000 (probably closer to $1,300 with store markup, shipping, and taxes)? I could easily get close to the same performance as this card with SLI'd 970s or a single 295X2 for hundreds less.

Very disappointing launch, especially given that the main advantage, double-precision power, has been taken away, effectively neutering this card and killing part of its market segment. I'm not even sure who this is marketed to. If they're going for insane performance, then I would think a $700 dual-GPU card with integrated water cooling would take that spot.

Looking forward to seeing how the 390X performs at launch; it will be interesting to see what Nvidia responds with. This competition is healthy for growth. I just wish Nvidia would stop trying to milk us with these $1,000+ cards.
 
Some of the results are worrying.

Maxwell doesn't seem to like higher resolutions, and it definitely shows on these higher-end cards. What's the point of spending a grand if your card starts dropping frames like crazy above 1080p? They should have upped the memory bus; right now it just seems like a huge waste.

The temperature range is very large and points to either poor cooling of the GPU or improper fan control. Maybe Nvidia will get the issue sorted out, but 30-84°C is pretty bad, especially in controlled conditions like these. It could easily reach dangerous levels for people without AC or proper cooling, or for overclockers. For a card this efficient, I can't see why the temperatures are so out of whack.

On top of that, it can't even beat the 295X2 in most cases. Along with the other critical flaws above, they also failed to grab the performance crown.

It's possible that AMD's R9 300 series single-card flagship is going to beat the Titan X in performance, especially above 1080p. Given that the usual price of that card is almost always around $450, Nvidia is only going to be able to charge this much until AMD's next gen comes out. Nvidia already knows of the impending release, and that's why it's offering The Witcher 3 for free with certain cards, to entice as many buyers as possible before AMD's new cards drop.

To me, this card fails to deliver. It HAS to provide good performance at high resolutions (you're going to be keeping it for a while, and 4K monitors abound), it HAS to be the fastest thing on the market if it's priced like the fastest thing on the market, and it HAS to have good temps and overclocking headroom.
 
As someone who tends to bleed green after being turned away from ATI/AMD by all the headaches I had with the HD 5790 back in the day (hands-down the worst card I've ever owned), I can definitely see myself going back to AMD if Nvidia doesn't release something more reasonable in the next quarter.

Seriously Nvidia, what were you thinking, pricing this thing at $1,000 (probably closer to $1,300 with store markup, shipping, and taxes)? I could easily get close to the same performance as this card with SLI'd 970s or a single 295X2 for hundreds less.

Very disappointing launch, especially given that the main advantage, double-precision power, has been taken away, effectively neutering this card and killing part of its market segment. I'm not even sure who this is marketed to. If they're going for insane performance, then I would think a $700 dual-GPU card with integrated water cooling would take that spot.

Looking forward to seeing how the 390X performs at launch; it will be interesting to see what Nvidia responds with. This competition is healthy for growth. I just wish Nvidia would stop trying to milk us with these $1,000+ cards.

I think you are the target market. This card is priced like the fastest thing on the market, but it doesn't perform that way. They want their fans to shell out big.

The 4K performance on this card is the most disappointing part for me. The 295X2 can achieve decent frame rates in many games at 4K, but the Titan X just doesn't seem to like it. For a card that you're going to be keeping for some time, 4K performance should at least be respectable.
 
Something is wrong with these benchmarks if they put the GTX 970 at lower performance than a GTX 780.
Hell, the GTX 980 had a 66% performance lead over the 970 in the Dying Light benchmark posted here.
I know that's wrong because I get much better results on my own GTX 970 in Dying Light at the same settings and resolution.
 
Something is wrong with these benchmarks if they put the GTX 970 at lower performance than a GTX 780.
Hell, the GTX 980 had a 66% performance lead over the 970 in the Dying Light benchmark posted here.
I know that's wrong because I get much better results on my own GTX 970 in Dying Light at the same settings and resolution.

It looks like the Dying Light results are the only ones that show the GTX 970 much slower than the GTX 980. This is the first time we have used this game for benchmarking a GPU, so I will double-check those results today to find out what was going on.

It also looks like the GTX 980 was at most 50% faster than the GTX 970 in the Dying Light test. That still seems like too much, so again I will look into this.
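For anyone checking these numbers at home, the "% faster" figures being argued over are just simple FPS ratios. A minimal sketch (the FPS values here are made up for illustration, not taken from the review):

```python
def percent_lead(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

# A card doing 60 FPS against one doing 40 FPS has a 50% lead.
print(percent_lead(60, 40))  # 50.0
```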
 
The temperature range is very large and points to either poor cooling of the GPU or improper fan control. Maybe Nvidia will get the issue sorted out, but 30-84°C is pretty bad, especially in controlled conditions like these. It could easily reach dangerous levels for people without AC or proper cooling, or for overclockers.
84C is the default thermal limit - it will go no higher unless the user chooses the +10% option (91C).
On top of that, it can't even beat the 295X2 in most cases. Along with the other critical flaws above, they also failed to grab the performance crown.
1. You're worried about the Titan X overheating, yet the 295X2 has a noted issue with excessive VRM temperature. 84C is cause for concern for Titan X, but 107+C for the 295X2 isn't?
2. Hardly surprising the Titan X isn't in the same league as a dual-GPU card made on the same process node. The Titan X is heavily constrained (the power consumption is an indicator), and the card loses out on every other metric: 50% less silicon, lower bandwidth, fewer GPU resources.
It's possible that AMD's R9 300 series single-card flagship is going to beat the Titan X in performance, especially above 1080p.
Very possible, but the 390X won't be pitted against the Titan X. Nvidia will drop 6GB of VRAM (freeing up ~30-40W of the power budget) and launch a GTX 980 Ti with AIB designs straight off the bat, if history is any indicator. The 390X probably will shade the Titan X - but how it fares against a custom 980 Ti Classified/HoF/Strix with higher clocks, more input power, better cooling, and better power delivery remains to be seen.
This card is for the early adopters, the OCD benchmarking crowd, and anyone planning on 3D rendering in 4K.
 
Nvidia should have learned from AMD on this one - to get the performance, you have to water cool. There's no other way. Room-temperature air blowing over a very hot card generating heat at crazy levels will never keep it cool enough to maintain proper performance.

Like others have said, $1K for a card that isn't even the best is a waste of money. However, once people start posting water-cooled benchmarks, some differences might start showing up.

Heck, even my CrossFired, water-cooled 290Xs would beat or tie this, and I only paid $1K for them and the water blocks nine months ago.

Also, I'm not sure what the author was referring to when he said "though with the headaches that often arise from multi-GPU gaming". Unless it's a console port or old drivers, you shouldn't be having problems. While I almost exclusively play Blizzard games (so not a wide variety of developers to test with), I've never had any issues.
 
I hope anyone running a GeForce card has MFAA enabled in the Nvidia Control Panel. I know that for the sake of GPU reviews all cards need to be on the same playing field, but at home you will get better performance with MFAA than with MSAA. Pretty much every game supports it now as of the last couple of drivers.
 
Also, I'm not sure what the author was referring to when he said "though with the headaches that often arise from multi-GPU gaming". Unless it's a console port or old drivers, you shouldn't be having problems.
Probably the same thing that has plagued multi-GPU since its inception - driver support for new titles. Assassin's Creed: Unity, Dead Rising 3, and Borderlands: The Pre-Sequel aren't exactly CrossfireX-friendly at the moment, and new games are heavily reliant upon timely driver updates.
Nvidia should have learned from AMD on this one - to get the performance, you have to water cool. There's no other way. Room-temperature air blowing over a very hot card generating heat at crazy levels will never keep it cool enough to maintain proper performance.
Water cooling the GPU while neglecting the power delivery circuits only halfway alleviates the issue (as the 295X2 thermograph video I posted earlier indicates). Nvidia has instituted a BIOS-level lock on the upper thermal limit. Anyone wanting to fully explore the limits of the architecture isn't going to d*ck around with a hybrid cooler - a full-cover block (whether aftermarket or EVGA's HydroCopper) and a custom BIOS are really the only serious options.
 
Excellent review @Steve, seems to be one heck of a card, though in the end I am a bit surprised by the overclocking results more than anything. It seems each card varies a bit more than I would have thought, and I had even hoped it would go a bit further, personally. Still a beast of a card, and with that 12GB of VRAM no one can complain about memory being the limit :p.
More like, water was the only way they could sell a dual Hawaii GPU card, because of how hot they were.
Beg to differ
 
Disappointed in the 4K benchmarks for sure. I've been looking into the new Asus Swift 4K monitor that's coming out soon as a long-term solution, and was hoping to see a single GPU deliver decent performance (45 FPS minimum, 50+ average in Crysis 3), because G-Sync will smooth out the sub-60FPS dips. Now I know that's asking a lot, as it's the most demanding PC title, but it would show the card to be pretty future-proof. I suppose there's one more architecture to go before that's a thing.

I'd also like to see 4K benchmarks run WITHOUT MSAA, or any AA at all. At 4K resolutions there's no need for AA. Every website runs games at max settings with AA; I'd like to see what detail settings are playable for each card at 4K (50+ FPS average counting as playable). Just a thought to buck the trend of pure resolution scaling and show us what the card is actually capable of in real-world situations.
 
Excellent review @Steve, seems to be one heck of a card, though in the end I am a bit surprised by the overclocking results more than anything. It seems each card varies a bit more than I would have thought, and I had even hoped it would go a bit further, personally. Still a beast of a card, and with that 12GB of VRAM no one can complain about memory being the limit :p.

Beg to differ

That's a custom vendor card, not AMD reference design. Thanks for playing!

lol@that 1000w minimum PSU requirement....
 
This is a card for fools who love to impulse buy. Above-1080p performance is disappointing, to say the least. The price for a purely gaming card is ridiculous. In, what, three months' time, Nvidia will most likely release a 980 Ti that outperforms the Titan X at a lower cost.
 
I buy whatever happens to be the best at the time when I need it (and isn't priced stupidly). This time it looks like Nvidia will be coming up short in both categories.

Still, it is encouraging to see that the Titan X's 46.3% performance advantage over the 290X often translates directly to frames per second, though it seems to depend on how poorly optimized the 3D engine is (Metro Redux). That gives the 390X's 49.2% advantage some heft. I just hope AMD doesn't think people like to pay over $600 for cards; personally, that's the tipping point between "I might buy this" and "it may as well be $1000".
 
84C is the default thermal limit - it will go no higher unless the user chooses the +10% option (91C).

1. You're worried about the Titan X overheating, yet the 295X2 has a noted issue with excessive VRM temperature. 84C is cause for concern for Titan X, but 107+C for the 295X2 isn't?
2. Hardly surprising the Titan X isn't in the same league as a dual-GPU card made on the same process node. The Titan X is heavily constrained (the power consumption is an indicator), and the card loses out on every other metric: 50% less silicon, lower bandwidth, fewer GPU resources.

Very possible, but the 390X won't be pitted against the Titan X. Nvidia will drop 6GB of VRAM (freeing up ~30-40W of the power budget) and launch a GTX 980 Ti with AIB designs straight off the bat, if history is any indicator. The 390X probably will shade the Titan X - but how it fares against a custom 980 Ti Classified/HoF/Strix with higher clocks, more input power, better cooling, and better power delivery remains to be seen.
This card is for the early adopters, the OCD benchmarking crowd, and anyone planning on 3D rendering in 4K.

I'm well aware that the whole GCN 2.0 line has a heat issue; I was just going off the data presented in this article. Under the same setup, the Titan X has a wider temperature range according to this article.

If the Titan X is too constrained, Nvidia should have pushed it further. It's supposed to be king of the hill. Not being able to match AMD's top card and costing more is pretty bad in my eyes. Power consumption is not really a concern for people with top-end systems, so I don't know why they would release a top-end card that's so restrained.
 
If the Titan X is too constrained, Nvidia should have pushed it further. It's supposed to be king of the hill. Not being able to match AMD's top card and costing more is pretty bad in my eyes.
An incoming single-GPU card dethroning the incumbent top-tier dual-GPU card - do you realize how rare that is? The last time it happened was seven years ago, when the GTX 280 barely dethroned the 9800 GX2 - you have to go back almost a decade, to the arrival of the 8800 GTX, to see a decisive result.
The GTX 480/580 didn't outperform the HD 5970
The HD 7970 didn't outperform the HD 6990 / GTX 590
The GTX Titan didn't outperform the HD 7990 / GTX 690....and nor did the GTX Titan Black...and nor did the R9 290X

But you expect the Titan X to outperform the incumbent 295X2, a card made on the same process node that possesses 90% more bandwidth, 46% more die area, 83% more texture address units, 33% more raster ops, and water cooling?
Power consumption is really a concern for people with top end systems so I don't know why they would release a top end card that's so restrained.
I'm going to assume you meant the opposite of what you actually said, in that power consumption isn't that much of a concern.
Now, having said that, the people buying the card - whether benchmarker, OCD shopaholic, or 4K Octane enthusiast - won't give a flying fig for benchmarks showing the card doing X or Y at stock configuration. They'll either be eyeing up EKWB's newsletter for the arrival of a FC block, studying the PCB layout for hard-modding the GPU voltage, or buying the card for the 12GB framebuffer.
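The spec deltas quoted above are easy to verify by doubling the per-GPU Hawaii figures. A quick sketch - the bandwidth, die sizes, and unit counts below are pulled from public spec sheets (GM200 at ~601mm² with 192 TMUs and 96 ROPs; Hawaii at ~438mm² with 176 TMUs, 64 ROPs, and 320GB/s per GPU), so treat them as my assumptions rather than figures from this thread:

```python
# Titan X (GM200) vs. 295X2 (two Hawaii GPUs, so per-GPU figures are doubled).
titan_x = {"bandwidth_gbs": 336.5, "die_mm2": 601, "tmus": 192, "rops": 96}
r295x2 = {"bandwidth_gbs": 2 * 320, "die_mm2": 2 * 438, "tmus": 2 * 176, "rops": 2 * 64}

for key in titan_x:
    extra = (r295x2[key] / titan_x[key] - 1) * 100
    print(f"295X2 has {extra:.0f}% more {key}")
```

Running this reproduces the ~90% / 46% / 83% / 33% deltas quoted in the post.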
 
An incoming single-GPU card dethroning the incumbent top-tier dual-GPU card - do you realize how rare that is? The last time it happened was seven years ago, when the GTX 280 barely dethroned the 9800 GX2 - you have to go back almost a decade, to the arrival of the 8800 GTX, to see a decisive result.
The GTX 480/580 didn't outperform the HD 5970
The HD 7970 didn't outperform the HD 6990 / GTX 590
The GTX Titan didn't outperform the HD 7990 / GTX 690....and nor did the GTX Titan Black...and nor did the R9 290X

But you expect the Titan X to outperform the incumbent 295X2, a card made on the same process node that possesses 90% more bandwidth, 46% more die area, 83% more texture address units, 33% more raster ops, and water cooling?

Is it too much to expect the best Nvidia can come out with to compete with the best AMD has to offer? Even more so when AMD's solution is cheaper? Why would anyone even bother with the overpriced Nvidia card when AMD's product is so much cheaper and much faster?

Even if you could overclock the card like crazy, according to you, it wouldn't matter because it has a huge statistical hurdle. I would much rather see what I could get out of the 295x2, as that card has much more power to begin with.

the people buying the card - whether benchmarker, OCD shopaholic, or 4K Octane enthusiast - won't give a flying fig for benchmarks showing the card doing X or Y at stock configuration. They'll either be eyeing up EKWB's newsletter for the arrival of a FC block, studying the PCB layout for hard-modding the GPU voltage, or buying the card for the 12GB framebuffer.

Most enthusiasts have no need for a 12GB video card when it can't even fill that entire space without memory bandwidth being an issue.

I don't know about you, but when I'm looking for a card, I'm going to get the beefiest one I can find and then worry about modding it. What's the point of spending time on a sub-par card when a cheaper and more powerful one already exists? Even if they mod the Titan X to hell and back, it will never be as good as a modded 295x2. It just doesn't have the resources to compete.

I can see the Titan X appealing to a very small portion of people who do rendering or modeling, but otherwise it's nothing amazing for gamers, modders, or cooling specialists.
 
Is it too much to expect the best Nvidia can come out with to compete with the best AMD has to offer?
Probably, and you'll be doubly disappointed if AMD's 390X falls short of the same 295X2, as appears to be the case.
Even more so when AMD's solution is cheaper? Why would anyone even bother with the overpriced Nvidia card when AMD's product is so much cheaper and much faster?
Ever heard of a "top of mind" brand? It's why Intel sells at a premium at the high end.
Even if you could overclock the card like crazy, according to you, it wouldn't matter because it has a huge statistical hurdle.
Um, no. I think you are now making your own interpretations. Overclocking is done for performance gains, fun, and the achievement, not necessarily to beat any one particular board.
The 295X2 is broadly equal to GTX 980s in SLI, so why would anyone in their right mind think that 1.5 GTX 980s (in essence what the Titan X is) would do any better?
I would much rather see what I could get out of the 295x2
Happy shopping....assuming you're legit in your quest.
Most enthusiasts have no need for a 12GB video card when it can't even fill that entire space without memory bandwidth being an issue.
Just as well most enthusiasts don't buy 12GB cards then, isn't it? For gaming it is overkill; for content creation - especially at 4K and 5K - it's a godsend. That's also primarily the reason AMD shoehorned 16GB onto the W9100.
I can see the Titan X appealing to a very small portion of people who do rendering or modeling, but otherwise it's nothing amazing for gamers, modders, or cooling specialists.
It won't stop HWBot's Titan X submissions from piling up as benchmarkers race to break personal bests and climb the rankings. The cards - like their numerous predecessors, including previous Titans - will then be sold on to fund the next best thing. Rinse. Repeat.
 
Ooh... ahh... maybe too much to put in my dream PC-05S? Granted, I'd probably be better off getting a GTX 980 and use the extra money to actually pay for the case ha ha ha...

That black shroud with accent lighting to go with it? That would look sharp.
 
I buy whatever happens to be the best at the time when I need it (and isn't priced stupidly). This time it looks like Nvidia will be coming up short in both categories.

Still, it is encouraging to see that the Titan X's 46.3% performance advantage over the 290X often translates directly to frames per second, though it seems to depend on how poorly optimized the 3D engine is (Metro Redux). That gives the 390X's 49.2% advantage some heft. I just hope AMD doesn't think people like to pay over $600 for cards; personally, that's the tipping point between "I might buy this" and "it may as well be $1000".
The 4GB 390X is slated to be $700 and the 8GB version $900-$1,000. It comes with HBM memory and liquid cooling, and will likely beat the Titan X by 10-20%. In my mind AMD's pricing will be more than justified. You can thank Nvidia for the price hikes.

If you want value, the 380X should be about 10% stronger than the 980 for $400.
 