Testing Nvidia's $1,000 Graphics Card: GeForce GTX Titan Review

March 7, 2013, 1:44 AM

Nvidia's Kepler architecture debuted a year ago with the GeForce GTX 680, which has sat somewhat comfortably as the market's top single-GPU graphics card, forcing AMD to reduce prices and launch a special HD 7970 GHz Edition card to help close the value gap. Nonetheless, we all knew the GK110 existed and we were eager to see how Nvidia brought it to the consumer market -- assuming it even decided to. Fortunately, that wait is now over.

After wearing the single-GPU performance crown for 12 months, the GTX 680 has been dethroned by the new GeForce GTX Titan. Announced on February 21, the Titan carries a GK110 GPU with a transistor count that has more than doubled, from the GTX 680's 3.5 billion to a staggering 7.1 billion. The part has roughly 50% to 75% more resources at its disposal than Nvidia's previous flagship, including 2688 stream processors (up 75%), 224 texture units (also up 75%) and 48 raster operation units (a healthy 50% boost).

In case you're curious, the performance gain is "only" estimated at 25% to 50% because the Titan is clocked lower than the GTX 680. Given those expectations, it would be fair to assume the Titan would be priced at roughly a 50% premium, which would be about $700. But there's nothing fair about the Titan's pricing -- and there doesn't have to be. Nvidia is marketing the card as a hyper-fast solution for extreme gamers with deep pockets, setting the MSRP at a whopping $1,000.

Read the complete review.




User Comments: 104

Alpha Gamer Alpha Gamer said:

"...if you have $1,000 burning a hole in your pocket..." funniest thing I've ever read in a computer hardware review.

I'm still laughing!

Skidmarksdeluxe Skidmarksdeluxe said:

"...if you have $1,000 burning a hole in your pocket..." funniest thing I've ever read in a computer hardware review.

I'm still laughing!

If I had $1000 burning a hole in my pocket I certainly wouldn't throw it around on a graphics card. That said, it is a very nice card and I would love it, but its price renders it redundant imo. Somehow I don't think Nvidia will have much of a problem shifting it, though. There's always gonna be people willing to shell out for something as frivolous as this card. If it cost $500, I myself may be tempted. I guess it's not for me. Too rich for my blood.

dividebyzero dividebyzero, trainee n00b, said:

If I had $1000 burning a hole in my pocket I certainly wouldn't throw it around on a graphics card. That said, it is a very nice card and I would love it, but its price renders it redundant imo. Somehow I don't think Nvidia will have much of a problem shifting it, though.

The card seems to be selling fairly well if the Titan owners threads are any indication

There's always gonna be people willing to shell out for something as frivolous as this card.

Might pay to remember that the Titan is essentially a higher clocked Tesla K20X without ECC memory and MPI, so gaming, OCD, and competitive benchmarking aren't the only markets. As Anandtech noted:

Titan, its compute performance, and the possibilities it unlocks is a very big deal for researchers and other professionals that need every last drop of compute performance that they can get, for as cheap as they can get it. This is why on the compute front Titan stands alone; in NVIDIA's consumer product lineup there's nothing like it, and even AMD's Tahiti based cards (7970, etc), while potent, are very different from GK110/Kepler in a number of ways. Titan essentially writes its own ticket here.

At $1,000 it still represents a substantial saving over shelling out $4,500 on a K20X.

If it cost $500, I myself may be tempted. I guess it's not for me. Too rich for my blood.

At $500 it would pretty much mean dropping the GTX 680 to $299 considering relative performance, added vRAM and bill of materials in general... and of course a realignment of the whole product stack. Nice idea, but I can't see Nvidia's (or AMD's, for that matter) shareholders being overly thrilled at that prospect.

@Steve

Thanks for another nicely executed review. Good to see the frame latency benching additions

Guest said:

This card was meant to be the GTX 680. But when Nvidia found their lower cards were faster than AMD's flagship GPU, they renamed the whole range. Nvidia are laughing all the way to the bank.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

This card was meant to be the GTX 680. But when Nvidia found their lower cards were faster than AMD's flagship GPU, they renamed the whole range. Nvidia are laughing all the way to the bank.

I'm not buying into this story. Sounds an awful lot like rumors being passed off as fact.

For nVidia to drop their numbering scheme, they must have planned the card's name from the beginning. Especially one such as the Titan, named after a supercomputer; you would never get me to believe it was originally meant to be the GTX 680.

Cueto_99 said:

While I applaud that you're now using the 99th percentile test to check latency, it would be nicer if you made a table of what the numbers mean. For example with FPS, most of us literate in games and graphics cards already know that 15 FPS is intolerable, 35 is the bare minimum, and 60 and up is desirable... But with latency I just know X milliseconds is better than Y milliseconds, not whether the difference is something I should mind when making my buying decision...

About the Titan, well, I think it's just ahead of its time... I can't see AMD showing something equal or better soon...

cliffordcooley cliffordcooley, TechSpot Paladin, said:

But with latency I just know X milliseconds is better than Y milliseconds
Same here, I'm clueless as to where one would draw an intolerable line.

Staff
Steve Steve said:

But with latency I just know X milliseconds is better than Y milliseconds
Same here, I'm clueless as to where one would draw an intolerable line.

Guys, milliseconds is still a time measurement. 1 second is made up of 1,000 ms, so 16 ms ≈ 60 fps.

16 ms = 60 FPS

33 ms = 30 FPS

40 ms = 25 FPS

50 ms = 20 FPS
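The table above is just the reciprocal relationship between frame time and frame rate; here is a quick illustrative sketch in Python (the helper names are mine, not from the review):

```python
# Frame time and frame rate are reciprocals: fps = 1000 / frame_time_ms.
def ms_to_fps(frame_time_ms: float) -> float:
    """Convert a per-frame render time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

def fps_to_ms(fps: float) -> float:
    """Convert a frame rate to its per-frame time budget in milliseconds."""
    return 1000.0 / fps

# Reproduce the rounded figures from the table above.
for ms in (16.7, 33.3, 40.0, 50.0):
    print(f"{ms} ms -> {ms_to_fps(ms):.0f} FPS")
```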

"...if you have $1,000 burning a hole in your pocket..." funniest thing I've ever read in a computer hardware review.

I'm still laughing!

Glad you enjoyed that.

slh28 slh28, TechSpot Paladin, said:

Guys, milliseconds is still a time measurement. 1 second is made up of 1,000 ms, so 16 ms ≈ 60 fps.

16 ms = 60 FPS

33 ms = 30 FPS

40 ms = 25 FPS

50 ms = 20 FPS

Come on guys, simple maths...

Great work on the frame latencies, because a pure average fps is definitely not the whole story.

As for the Titan itself it's insanely priced compared to GTX 660Ti SLI or 7950 CF, and the sad thing is neither Nvidia nor AMD seem to have anything up their sleeves for a good while yet.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Come on guys, simple maths....
Yes it was, and it didn't answer the question of where the line for intolerable would be drawn. In fact, the answers given were frame times for different FPS. That was not an answer to a latency question or to how much can be tolerated.

Here is the question;

What's tolerable: 1 FPS of latency at 60 FPS, or can you go as high as 10 FPS of latency at 60 FPS?

TS-56336 TS-56336 said:

If I didn't already have three (only use two for gaming) 7970's, I too would most likely be looking at buying this card as a single card solution for gaming at 2560x1600. It would be nice not to have to worry about SLI/Crossfire issues.

aboynamedmatt aboynamedmatt said:

I was pretty impressed that the 660Ti SLI system was able to produce such low frame latencies by comparison. Sleeping Dogs and Medal of Honor were still pretty bad by frame latency, but the SLI system was right on par with the other games.

slh28 slh28, TechSpot Paladin, said:

Here is the question;

What's tolerable: 1 FPS of latency at 60 FPS, or can you go as high as 10 FPS of latency at 60 FPS?

Latency is measured in ms and lower is better. The review methodology is explained in the article, i.e. the latency figure displayed is a 99th percentile. E.g. for BF3 @ 1080p the 99th percentile is 10.6 ms, so 99% of the frames were generated in less than 10.6 ms; to put it another way, 99% of the time you're getting above 94 fps (1000/10.6).

So a 99th percentile frame time of under 16.7 ms (i.e. 60 fps+) would result in very smooth gameplay.
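For anyone wanting to reproduce the metric, here is a minimal nearest-rank sketch of the 99th-percentile frame time described above (the sample numbers are made up for illustration):

```python
def percentile_frame_time(frame_times_ms, pct=99):
    """Return the frame time (ms) that `pct` percent of frames come in under,
    using the simple nearest-rank definition of a percentile."""
    ordered = sorted(frame_times_ms)
    # Nearest-rank index for the requested percentile.
    idx = max(0, round(pct / 100 * len(ordered)) - 1)
    return ordered[idx]

# 99 smooth frames at 10 ms plus one 42 ms spike: the average barely moves,
# and the 99th percentile still reports a smooth 10 ms frame time.
times = [10.0] * 99 + [42.0]
print(percentile_frame_time(times))         # 10.0
print(1000 / percentile_frame_time(times))  # 100.0 -> 100 fps 99% of the time
```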

wiyosaya said:

Personally, though nVidia may be touting this card as a gamer's card, I think the market for it is really going to be the HPC market. Its DP compute performance is what the 680 should have had; its price still makes it a very attractive and inexpensive alternative to a Tesla, yet it's no less capable. I am willing to bet that the Titan will fly off the shelves into supercomputer builds.

That said, I think none but the most deep-pocketed, got-to-have-the-latest-and-greatest gamers will be attracted to this card, especially when cheaper gaming alternatives exist.

LNCPapa LNCPapa said:

I was caught a bit off guard by the pricing of this card. I had been saving up for a while for three of whatever the next single gpu flagship was going to be to replace my 680s... but I'm just not willing to part with $3k just for the video. Because of that I'll be passing on this line altogether. It upset me a bit just as AMD's pricing of the 7970 upset me at launch. I'll be spending that money on a new pair of headphones and a DAC/AMP instead.

Awesome review though!

spydercanopus spydercanopus said:

Do you think you could benchmark Tomb Raider with this card? TressFX combined with ultimate settings on my GTX 480 runs at 1 FPS and then crashes out after 5 seconds.

johnnydoe johnnydoe said:

Well... here's my take. I've been in this business for easily a decade and have seen all sorts of ridiculous cards, and this one costs the MOST while delivering the LEAST... this thing is just absurd. There were X1950 XTX Toxics, but the Tide Water cooler that came with them, their excellent OC'ing potential, or lowered prices made them well worthwhile. There were waterblocked 8800 Ultras, which again became somewhat worth it after prices settled down. And there were TEC water-chilled 8800 GTXs, yet they STILL had something to make up for their price tag.

But this one? This one is just one giant, pointless piece of shit. All that magnesium shroud does is take the card's price to an insane grand, and the actual build quality is NOTHING special, with Taiwan-made TRIO chokes and typical high-K MOSFETs. $1000 and no Volterra VRM or CPL chokes or anything of that sort... sigh. Just a 680 with 16 SMX clusters rather than 15 active, on a roided PCB/cooler.

Once you start pushing volts into a Galaxy 680 White beyond 1350 MHz, it can nip at the heels of this thing. And I bought mine for $520... hell, even a good-chipped $400 reference 7970 can get up to that level of performance with an insane OC on it... sigh.

Guest said:

I thought 100 ms made up 1 second, not 1000 ms.

Guest said:

I really think the GTX 690 should have be included within the comparisons. If, for anything, just to see how the TITAN stacks up.

johnnydoe johnnydoe said:

I thought 100 ms made up 1 second, not 1000 ms.

Uhm, no. In the metric system, 'milli' means one thousandth. A kilometer is 1,000 meters; likewise, 1,000 millimeters make a meter and 1,000 milliseconds make a second.

If 100 ms made up one second, then you'd lag terribly and skip entire frames one after the other while playing with 100 ping online... it's extremely simple math.

The U.S. education system really needs a TON of reconsideration in the math department... beginning with changing from the imperial system to the metric system. Feet, miles and yards need to die already. We aren't in the 1400s anymore, where people measured things by how many steps they paced out...

Guest said:

Thanks for putting me down, but OK. I'm surprised that I notice frame latency in BF3, since 1000 ms is 1 sec and I'm only getting 15-20 ms of frame lag. And when reviewers test TVs they say 28 ms is bad and 50 ms is horrible, but that's not even 1/10th of a second of delay.

Guest said:

Where was the GTX 690 in this comparison? You need Nvidia's flagship out there as well to be fair.

johnnydoe johnnydoe said:

Thanks for putting me down, but OK. I'm surprised that I notice frame latency in BF3, since 1000 ms is 1 sec and I'm only getting 15-20 ms of frame lag. And when reviewers test TVs they say 28 ms is bad and 50 ms is horrible, but that's not even 1/10th of a second of delay.

15-20 ms of FRAME time means roughly 60 FPS:

16 ms = 60 FPS

33 ms = 30 FPS

It has absolutely NOTHING to do with 15-20ms NET PING lag.

Guest said:

I didn't say anything about network lag. I'm looking at the CPU and GPU graph in Battlefield 3; it shows you the latency between CPU/GPU using the command:

  • render.perfoverlayvisible 1
TomSEA TomSEA, TechSpot Chancellor, said:

Huh...well I think I'll hang on to my GTX 660ti SLI setup then.

Ranger12 Ranger12 said:

Oohhh a magnesium shroud!? If you're in a pinch it might double as a good fire starter! Might be worth the 1000 bucks!

/sarcasm

Draconian said:

Why is Nvidia obsessed with the $1,000 price point? It's so absurd. They go from $500 for the 680 and then skip all the way to $1,000. Despite review sites gushing over this card, the fact is that it performs worse than a 690 for the same price. This card would be a winner at $800 or even $900. It offers performance comparable to two overclocked 670's, which can be had for $800.

cmbjive said:

I'm not a 1%er but if I had the money I would buy the card. I'm only looking to utilize a single-card GPU in my setup and this card delivers the goods.

PC nerd PC nerd said:

So it performs less than a 690, but costs the same?

What is the point?

LNCPapa LNCPapa said:

It performs much better in certain circumstances - like when you exceed the frame buffer of the 680 or if the title you're playing doesn't have good SLI support.

dividebyzero dividebyzero, trainee n00b, said:

Why is Nvidia obsessed with the $1,000 price point? It's so absurd. They go from $500 for the 680 and then skip all the way to $1,000

Probably because Nvidia don't actually want the Titan flying off the shelves and being permanently on back-order.

The rationale here is, that every graphics review pits the new card against current reference designs. At this point in time the HD 7970GE is top (single GPU) dog, and every published bar chart at every review site becomes a mini-advertisement for the card/vendor at the top of those charts....enter the Titan, reclaim the top spot for the balance of the year. High price ensures constant stock without the need to divert these GPUs from $4500 Tesla K20X, $3200 Tesla K20/K20C, or the likely more astronomically priced Quadro (K6000?) version.

Despite review sites gushing over this card, the fact is that it performs worse than a 690 for the same price

As LNCPapa noted, the 690 is at the mercy of SLI profiles and game-by-game dual-GPU scaling

This card would be a winner at $800 or even $900. It offers performance comparable to two overclocked 670's, which can be had for $800.

The Titan is already deemed a winner. A simple look at the amount of forum threads, discussion, review charts, and the run of benchmark records falling to Titan should be proof enough...even if the owners thread over at OCN I linked to in my previous post isn't.

The kind of people buying Titan aren't interested in performance-per-dollar; they are interested in performance. One Titan will fall to two GTX 670s, but two Titans? Or three? Or four?

Titan is not for anyone using performance-per-dollar as a criterion for purchase. The same people buying the cards now (and many seem to be buying at least two) will likely sell their cards as soon as the more voltage-unlock-friendly MSI Titan Lightning and other non-reference cards make an appearance - at an even higher price point.

amstech amstech, TechSpot Enthusiast, said:

I'm not a 1%er but if I had the money I would buy the card. I'm only looking to utilize a single-card GPU in my setup and this card delivers the goods.

Same here. I bought my GK104/670 at release, expecting to go SLI for my res (1600p) about now, but it has performed very well at this resolution by itself, so I am in no hurry to upgrade.

There is nothing like having a single GPU, and it's surprising to see so many people confused as to why it's priced so high... where have you people been for the last 15 years? The top dog always pulls a premium, just because it's top dog.

And I like that the reviewer included frame latency. AMD has improved greatly here, but their overall driver quality/Vsync performance still has a ways to go.

Phr3d said:

I would appreciate a verbal or chart reference to power consumption and temps when overclocking - with this card well up in the temp range, it -seems- like it would have trouble at a ~30% OC? As there is no mention of it and you fully benched it on several titles, I conclude that there is not, but stating that fact would be welcome, as would a reference power supply size - I would avoid a 1250W power supply unless you believe it is necessary; that's a lot of watts for nothing a large percentage of the time.

If it is as steady at that OC as it seems to be, and it doesn't risk early death from slow meltdown, the price begins to seem almost reasonable given the card's longevity from buying groundbreaking performance out of the gate (and NON-SLI).

dividebyzero dividebyzero, trainee n00b, said:

I would appreciate a verbal or chart reference to power consumption and temps when overclocking - with this card well up in the temp range, it -seems- like it would have trouble at a ~30% OC?

Titan has dynamic boost similar to the other boosted Keplers. The 837 MHz base / 876 MHz boost numbers are basically meaningless, since the card's boost is based upon its temperature profile and GPU usage. From Anandtech, here are the measured peak boost frequencies encountered:

As there is no mention of it and you fully benched it on several titles, I conclude that there is not, but stating that fact would be welcome, as would a reference power supply size - I would avoid a 1250W power supply unless you believe it is necessary; that's a lot of watts for nothing a large percentage of the time.

The card doesn't use appreciably more power than the 7970 GE, and since the voltage is locked at 1.2 V in software (and, I believe, 1.21 V in BIOS, good for a ~1150-1180 MHz boost clock), the card stays relatively frugal on power usage. 85C is the GPU thermal throttling limit.

[ Anandtech: [link] ]

Guest said:

AMD: Release the Kraken!

AnilD AnilD said:

Only 25 to 50%? That could be a game changer my friend. Pricing, well...

JC713 JC713 said:

I really want to see what AMD is cooking up. If they're cooking up the rumored 7990 dual-GPU card instead of a single GPU, they're sunk. On the other hand, they're said to be coming out with two 7990 models, one with dual GPUs and another with a single GPU. I won't be surprised if they guzzle a lot of power.

hahahanoobs hahahanoobs said:

Do you think you could benchmark Tomb Raider with this card? TressFX combined with ultimate settings on my GTX 480 runs at 1 FPS and then crashes out after 5 seconds.

Turn off Tessellation.

Staff
Steve Steve said:

Turn off Tessellation.

We have tested Tomb Raider and that review will be online next week. Tessellation accounts for only a very tiny performance hit on Nvidia graphics cards. Right now Nvidia's cards cannot handle depth of field set to ultra; back it off to high and you will see a massive increase in performance. Of course, TressFX also impacts performance greatly, but you will find DOF is the main culprit here.

So it performs less than a 690, but costs the same?

What is the point?

I thought we explained that in the review. The GTX Titan is a much better solution than the GTX 690.

JC713 JC713 said:

Plus the Titan has 2 more GB of GDDR5. Better for higher resolutions.

St1ckM4n St1ckM4n said:

Something is wrong with the 7950. The first two games have the Crossfire fps at more than double the single card's.

JC713 JC713 said:

Something is wrong with the 7950. The first two games have the Crossfire fps at more than double the single card's.

1. Double single card? 2. Isn't it normal for Crossfire to perform better?

St1ckM4n St1ckM4n said:

How can two cards perform better than 2x a single card? :S Crossfire scaling can't beat physics.

hahahanoobs hahahanoobs said:

We have tested Tomb Raider and that review will be online next week. Tessellation accounts for only a very tiny performance hit on Nvidia graphics cards. Right now Nvidia's cards cannot handle depth of field set to ultra; back it off to high and you will see a massive increase in performance. Of course, TressFX also impacts performance greatly, but you will find DOF is the main culprit here.

I wasn't referring to the review; it was for spyder. I've heard turning tessellation off can stop crashes with nVidia cards running Tomb Raider. nVidia claims they didn't get the final code until after the game was released, so that could have something to do with the issues with their cards right now.

[link]

Staff
Steve Steve said:

I wasn't referring to the review; it was for spyder. I've heard turning tessellation off can stop crashes with nVidia cards running Tomb Raider. nVidia claims they didn't get the final code until after the game was released, so that could have something to do with the issues with their cards right now.

[link]

I wasn't referring to this review of the GTX Titan either. I was referring to our upcoming Tomb Raider coverage, answering spyder's question, and trying to help him with the performance issue. I saw no evidence in our testing that disabling tessellation would help with crashing or performance. What I did find, as mentioned previously, is that DOF has a seriously negative impact on performance, so that is where I would start if I were a GeForce owner.

dividebyzero dividebyzero, trainee n00b, said:

How can two cards perform better than 2x a single card? :S Crossfire scaling can't beat physics.

It doesn't really have to beat the laws of physics - just the vagaries of driver optimization. HardOCP's recent SLI and Crossfire comparison threw up the same anomaly (as do other sites).

Notice that the Crossfire solution doesn't have the low-framerate issue in the first 200 seconds of the gameplay graph - hence 105.2% scaling.

[Source: [link] ]
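A scaling figure like the one quoted is just the second card's gain expressed as a percentage of the single-card frame rate; a quick sketch (the FPS numbers below are invented for illustration, not HardOCP's):

```python
def scaling_percent(single_fps: float, dual_fps: float) -> float:
    """Gain from the second GPU as a percentage of the single-card rate.
    100% means a perfect doubling; above 100% reflects anomalies such as
    the single-card low-framerate stretch described above."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Invented example: one card averaging 40 fps, Crossfire averaging 82.08 fps.
print(round(scaling_percent(40.0, 82.08), 1))  # 105.2
```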

St1ckM4n St1ckM4n said:

Ah, of course, because we're only looking at average fps here. My mistake!

Guest said:

Conclusion: the card is ooh and aah.

(I'm hoping to see non-reference Titan cards; sadly nVidia doesn't allow that.)

Scavengers Scavengers said:

I have 2 7870's in crossfire now that cost me $455 total.

So Titan buyers get to keep their milliseconds and I get to keep my $445.

Dave

Scavengers Scavengers said:

I have 2 7870's in crossfire now that cost me $455 total.

So Titan buyers get to keep their milliseconds and I get to keep my $445.

Dave

Sorry. $545
