Claiming the Crown: Gainward GeForce GTX 680 Phantom Review

May 1, 2012, 1:24 AM

On March 22, Nvidia unveiled its first "Kepler"-based graphics card. Branded as the GeForce GTX 680, the card is powered by a 28nm GPU codenamed GK104 that crams 3,540 million transistors into a 294mm2 die. Compare that to the 40nm GTX 580, whose GPU has 540 million fewer transistors yet is almost twice as large, and you can see the overall goal of Kepler: improved efficiency.
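As a back-of-the-envelope check on that density claim (the GK104 figures are from the article; the GTX 580's commonly cited GF110 figures of 3,000 million transistors on a 520mm2 die are assumed here):

```python
# Rough transistor-density comparison between the two GPUs.
# GK104 (GTX 680): 3,540 M transistors / 294 mm^2 (from the article).
# GF110 (GTX 580): 3,000 M transistors / 520 mm^2 (commonly cited figures).

def density(transistors_m: float, die_mm2: float) -> float:
    """Transistor density in millions of transistors per mm^2."""
    return transistors_m / die_mm2

gk104 = density(3540, 294)  # ~12.0 M/mm^2 on 28nm
gf110 = density(3000, 520)  # ~5.8 M/mm^2 on 40nm

print(f"GK104: {gk104:.1f} M/mm^2, GF110: {gf110:.1f} M/mm^2, "
      f"ratio: {gk104 / gf110:.1f}x")
```

The process shrink from 40nm to 28nm roughly doubles the packing density, which is where the efficiency story starts.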

This refinement is visible in every aspect of the card, not least raw performance. On paper, the GTX 680 has almost 200% more shader performance than the GTX 580, roughly 250% more texture performance, 90% more ROP performance and 100% more memory bandwidth. With the Radeon HD 7970 averaging only 9% faster than the GTX 580, AMD had priced itself into a corner with its flagship originally set at $549. The company responded quickly, however, cutting prices across much of its HD 7000 series and putting the HD 7970 at $479, just under the GTX 680's $499 suggested price.

We were surprised AMD acted so fast. Although the GTX 680 is quicker than the HD 7970, Nvidia's card is still only available in limited quantities -- if at all. As of writing, Newegg doesn't have a single card in stock and Amazon shows low-stock warnings on most GTX 680s. While Nvidia struggles to meet demand, board partners have been busy crafting custom cards. Because we didn't get a reference sample last month, our GTX 680 review showcases one of these special edition products instead: meet the Gainward GTX 680 Phantom.

Read the complete review.




User Comments: 19

Guest said:

Phew, need to cut my case in half to make room for a card like this .. or at least throw a few of my disks out ^^

Darth Shiv said:

Pretty amazing performance... wasn't expecting them to trump AMD by this much. Justifies the decision to release three months after AMD, by the look of it.

slh28, TechSpot Paladin, said:

Not a fan of the triple slot design (limits SLI capabilities), and even less of a fan of the price - here a reference GTX 680 costs ~£420 and the cheapest Phantom I could find is over £500.

Ranger12 said:

If I pay that much for a card the fins on the heatsink better not be bent like the one in the pictures. Also not a fan of the triple slot design.

Steve, TechSpot Staff, said:

Not a fan of the triple slot design (limits SLI capabilities), and even less of a fan of the price - here a reference GTX 680 costs ~£420 and the cheapest Phantom I could find is over £500.

The triple slot design is excellent for these high-end cards; the GTX 680 Phantom is amazingly quiet and runs very cool, which is something most high performance cards can't claim. Furthermore, it should have no real impact on SLI capabilities: any motherboard worth its salt will have more than a single slot separating the primary and secondary PCI Express slots. In fact, in my opinion the GTX 680 Phantom is the perfect candidate for SLI for the reasons mentioned above.

Oh, and about the price: as we said in the review, right now any price you can find online is irrelevant since there is little to no stock. Jump online and try to buy one -- go to a massive retailer like newegg.com and you will find they are all out of stock. So until you can actually buy a GTX 680, I wouldn't get too worked up about the pricing.

If I pay that much for a card the fins on the heatsink better not be bent like the one in the pictures. Also not a fan of the triple slot design.

I just gave my thoughts on the triple slot design above, but I have to say I was a little disappointed with the warped fins as well. I am not 100% sure whether I did that when I took the card apart; I didn't notice them until after I put it back together, so it is quite possible that was my fault.

slh28, TechSpot Paladin, said:

Furthermore it should have no real impact on SLI capabilities, any motherboard worth its weight in gold will have more than a single slot separating the primary and secondary PCI Express slots.

My mobo has 3 PCI-E slots, each a double card width apart. So if I wanted to SLI/CF with a triple slot card I'd have to use the 1st and 3rd slots, but they would run at x16/x4 instead of x16/x16.

Oh, and about the price: as we said in the review, right now any price you can find online is irrelevant since there is little to no stock. Jump online and try to buy one -- go to a massive retailer like newegg.com and you will find they are all out of stock. So until you can actually buy a GTX 680, I wouldn't get too worked up about the pricing.

From what I've heard, stock levels are especially poor in the US, but in the UK you can actually buy a reference 680 for £420 right now. I searched for the Phantom and it was around £510-530 on pre-order.

No doubt this is a great card considering the noise level and performance, but the price premium isn't worth it IMO. Same with the Phantom 580.

Alpha Gamer said:

According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?

Burty117, TechSpot Chancellor, said:

According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?

From that chart you're right, and right again -- hell, if you get this particular model you'll even produce less heat!

Wow, Nvidia stepped up their game. The 560 Ti has what, 384 cores? For the same power and less heat they've managed to squeeze in an extra 1152 cores?!

That sounds mad to me in the space of a year...
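The rated figures support that reading: a 500W unit can be sanity-checked against worst-case board TDPs, which Nvidia puts at 170W for the GTX 560 Ti and 195W for the GTX 680. A rough sketch, where the ~200W budget for CPU, board, drives and fans is an assumption to adjust for your own system:

```python
# Rough PSU headroom check: rated GPU TDP plus an assumed platform draw,
# measured against a 500 W supply. TDPs are worst-case board ratings
# (GTX 560 Ti: 170 W, GTX 680: 195 W); real gaming draw is usually lower.

PSU_WATTS = 500
REST_OF_SYSTEM_W = 200  # assumed CPU + board + drives + fans; adjust to taste

def headroom(gpu_tdp_w: int) -> int:
    """Watts left on the PSU after GPU TDP plus estimated platform draw."""
    return PSU_WATTS - (gpu_tdp_w + REST_OF_SYSTEM_W)

print(headroom(170))  # GTX 560 Ti: 130 W spare
print(headroom(195))  # GTX 680:    105 W spare
```

Even at the higher rated TDP, a decent 500W unit keeps comfortable headroom for a single-GPU system -- which matches what the review's measured power chart suggests.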

dividebyzero, trainee n00b, said:

My mobo has 3 PCI-E slots, each a double card width apart. So if I wanted to SLI/CF with a triple slot card I'd have to use the 1st and 3rd slots, but they would run at x16/x4 instead of x16/x16.

Boards for most (if not all) of the latest chipsets can be found with 3 slot spacing -- even the budget ones.

Just so I'm on the same page here... you're talking about dropping £850-1k on cards, but wouldn't consider updating (I presume, judging by the x16/x16 reference) to what is now a budget range of X58 boards? But then, having just bought a 7970, I doubt you'd have any real intention of actually buying a 680... let alone two.

I'm tempted to quote Kyle Bennett at this stage, but it could end up more asterisks than letters -- if you know what I mean... and I'm sure you do.

slh28, TechSpot Paladin, said:

Boards for most (if not all) of the latest chipsets can be found with 3 slot spacing -- even the budget ones.

Yeah, I'll admit the 3 slot spacing is one of the things I overlooked when buying my mobo, but there are a lot of people out there with only double slot spacing. And while a new board might not cost that much in relation to two 680s, it's still a pain to change the mobo.

I'm tempted to quote Kyle Bennett at this stage, but it could end up more asterisks than letters -- if you know what I mean... and I'm sure you do.

Well, it's a good thing TS's forums are censored; I wonder how many people Kyle has scared off over the years...

dividebyzero, trainee n00b, said:

Yeah, I'll admit the 3 slot spacing is one of the things I overlooked when buying my mobo, but there are a lot of people out there with only double slot spacing. And while a new board might not cost that much in relation to two 680s, it's still a pain to change the mobo.

Having had dual (or triple) card setups continually since the advent of nV's 680i, it is something I always check for. A two slot gap is generally of no use even with dual slot cards, since it hinders airflow into the top card and is exacerbated by heat rising from the lower card's PCB/backplate. Still, two slot spacing need not be a deal breaker -- any more than graphics cards blocking front panel/USB/CMOS/reset headers has been... it's just a matter of lateral thinking. The limiting factor becomes how many expansion slots the chassis has, and the tendency towards 8-11 slot chassis adds a lot of options.

Well, it's a good thing TS's forums are censored; I wonder how many people Kyle has scared off over the years...

Well, I left as opposed to being scared off, but my issue was [H]'s blatant shilling. In this particular instance I would tend to agree with his sentiments. Too many people post simply as a form of entertainment for themselves.

Guest said:

Judging by these results vs the 6990, a new computer purchaser may as well buy two second-hand 6970s or 6950s and Crossfire them for better scores in most games than the 680 (and it would be quite a bit cheaper with the flood of second-hand cards at the moment). Although this will of course use up more slots and power.

DanUK said:

Nice review. What a beast of a card that is. Can't wait to see 690 stats..

I think there's a mistake in the 2nd para of the first page though -- you say the GTX 580 has 48 ROPs and the GTX 680 ramps it up to... 32 ROPs... 16 fewer? :S

Sarcasm said:

According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?

From that chart you're right, and right again -- hell, if you get this particular model you'll even produce less heat!

Wow, Nvidia stepped up their game. The 560 Ti has what, 384 cores? For the same power and less heat they've managed to squeeze in an extra 1152 cores?!

That sounds mad to me in the space of a year...

The cores aren't the same. Fermi's shaders ran at double the core clock, so I guess the most non-scientific but somewhat understandable comparison would be saying the 560 Ti effectively has 768 cores compared to the 1536 of the 680.

But again, not apples to apples.
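The point about the clock domains can be made concrete. Fermi shaders ran at a "hot clock" (double the core clock) while Kepler shaders run at the core clock, so peak throughput is a fairer yardstick than core count. A sketch assuming the reference clocks (1645 MHz shader on the 560 Ti, 1006 MHz base on the 680):

```python
# Peak single-precision throughput: cores x shader clock x 2 ops (FMA).
# GTX 560 Ti: 384 cores at a 1645 MHz hot clock (2x the 822 MHz core clock).
# GTX 680:    1536 cores at the 1006 MHz base clock (no hot clock on Kepler).

def gflops(cores: int, shader_mhz: float) -> float:
    """Theoretical peak GFLOPS assuming one fused multiply-add per clock."""
    return cores * shader_mhz * 2 / 1000

gtx560ti = gflops(384, 1645)   # ~1263 GFLOPS
gtx680   = gflops(1536, 1006)  # ~3090 GFLOPS

print(f"{gtx680 / gtx560ti:.2f}x the peak throughput from 4x the cores")
```

So quadrupling the core count buys roughly 2.4x the theoretical throughput once the lost hot clock is accounted for -- which is why the raw "384 vs 1536" comparison overstates the jump.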

ghasmanjr said:

My friend had 2 GTX580s running SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped down as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication about how much more intricate cards are getting. I know the 680 is supposed to be a beast at all of the newest additions to visual settings, like DX11 and ubersampling. I want someone to prove it.

Guest said:

The review stated it tested the AMD cards with 11.13 drivers; newer drivers have improved performance, so that may be restricting the AMD cards somewhat. Why use such old drivers? Second, since Nvidia's boost is active all the time, using OverDrive to raise the Radeons' frequencies -- which is part of the driver set -- would show that the Radeons have a lot of untapped headroom. For example, most HD 7970s can reach the maximum OverDrive settings of 1125/1575, while some run slightly slower than that, without having to use a utility for upping the GPU volts. Anyway, concluding the 680 is X amount faster seems short-sighted without using the full capability of the HD 7970 through options in the drivers. A number of reviews show that when both the 680 and 7970 are maxed out they perform virtually the same at higher resolutions.

The review had a good broad base of games, but it lacked any second pass of actually playing them to check for stuttering, rendering problems and the like, which may not be caught by running a pure benchmark. Also, no GPU compute benchmarks were done for those that go beyond gaming; there the 7970 has a significant lead over the 680.

dividebyzero, trainee n00b, said:

My friend had 2 GTX580s running SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped down as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication about how much more intricate cards are getting. I know the 680 is supposed to be a beast at all of the newest additions to visual settings, like DX11 and ubersampling. I want someone to prove it.

You answered your own question. SLI'ed GTX 580s to get 23 fps? What's the point of ramming the game's image quality up so high it becomes unplayable -- is this indicative of real world gaming?

Does 15-16 fps hold anything more than academic interest for anyone... or are you the kind of person that prefers playing a semi-interactive slideshow?

Second, since Nvidia's boost is active all the time, using OverDrive to raise the Radeons' frequencies

It's a stock vs stock review... performance out of the box.

the Radeons have a lot of untapped headroom. For example, most HD 7970s can reach the maximum OverDrive settings of 1125/1575, while some run slightly slower than that, without having to use a utility for upping the GPU volts.

Cool, but it's still not an OC vs OC review. And as you've noticed, OCs tend not to be guaranteed. Here's a GTX 680 at 1204 core, 1257 boost and 7024 memory [link] -- maybe not representative, but still a valid benchmark candidate, and it certainly would stack up well against the same site's 1015 core / 5600 memory OC'ed HD 7970 [link].

Steve, TechSpot Staff, said:

The driver number was a typo -- it was meant to be 12.3 and I have now fixed that, thank you. Also, most of the tests are run manually, so we would notice any rendering issues.

My friend had 2 GTX580s running SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped down as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication about how much more intricate cards are getting. I know the 680 is supposed to be a beast at all of the newest additions to visual settings, like DX11 and ubersampling. I want someone to prove it.

Percentages are percentages, my friend.

LNCPapa said:

I just tried Witcher 2 with the Ultra preset (ubersampling on) and my fps was between 25-40 during actual gameplay @ 1920. For some reason I thought I'd do better.

