Claiming the Crown: Gainward GeForce GTX 680 Phantom Review

Julio Franco

Read the full article at:
https://www.techspot.com/review/525-gainward-geforce-gtx-680/

Please leave your feedback here.
 
Phew, I'd need to cut my case in half to make room for a card like this... or at least throw a few of my disks out ^^
 
Pretty amazing performance... I wasn't expecting Nvidia to trump AMD by this much. Justifies the decision to release three months after AMD, by the look of it.
 
Not a fan of the triple-slot design (it limits SLI options), and even less of a fan of the price: here a reference GTX 680 costs ~£420 and the cheapest Phantom I could find is over £500.
 
If I pay that much for a card, the fins on the heatsink had better not be bent like the ones in the pictures. Also not a fan of the triple-slot design.
 
Not a fan of the triple-slot design (it limits SLI options), and even less of a fan of the price: here a reference GTX 680 costs ~£420 and the cheapest Phantom I could find is over £500.

The triple-slot design is excellent for these high-end cards; the GTX 680 Phantom is amazingly quiet and runs very cool, which is something most high-performance cards can't claim. Furthermore, it should have no real impact on SLI capabilities: any motherboard worth its salt will have more than a single slot separating the primary and secondary PCI Express slots. In fact, in my opinion the GTX 680 Phantom is the perfect candidate for SLI for the reasons mentioned above.

Oh, and about the price: as we said in the review, right now any price you can find online is irrelevant since there is no stock, or very little. Jump online and try to buy one; go to a massive retailer like newegg.com and you will find they are all out of stock. So until you can actually buy a GTX 680, I wouldn't get too worked up about the pricing ;)

If I pay that much for a card, the fins on the heatsink had better not be bent like the ones in the pictures. Also not a fan of the triple-slot design.

I just gave my thoughts on the triple-slot design above, but I have to say I was a little disappointed with the warped fins as well. I am not 100% sure whether I did that when I took the card apart; I didn't notice them until after I put it back together, so it is quite possible that was my fault.
 
Furthermore, it should have no real impact on SLI capabilities: any motherboard worth its salt will have more than a single slot separating the primary and secondary PCI Express slots.

My mobo has 3 PCIe slots, each a double card-width apart. So if I wanted to SLI/CF with a triple-slot card I'd have to use the 1st and 3rd slots, but they would run at x16/x4 instead of x16/x16.
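
For the curious, here's a rough sketch of what that x16/x4 split means in raw bandwidth terms (a minimal Python example using the standard theoretical per-lane figures; real-world gaming losses are usually far smaller than the raw gap suggests):

[code]
# Theoretical one-direction PCIe bandwidth per lane (MB/s).
# PCIe 2.0: 5 GT/s with 8b/10b encoding    -> 500 MB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
PER_LANE_MB_S = {"2.0": 500, "3.0": 985}

def link_bandwidth_gb_s(gen, lanes):
    """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_MB_S[gen] * lanes / 1000.0

for gen in ("2.0", "3.0"):
    for lanes in (16, 4):
        print("PCIe %s x%-2d: %.1f GB/s" % (gen, lanes, link_bandwidth_gb_s(gen, lanes)))
# PCIe 2.0: x16 = 8.0 GB/s vs x4 = 2.0 GB/s -- a 4x raw gap,
# though games rarely saturate even x4 in practice.
[/code]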


Oh, and about the price: as we said in the review, right now any price you can find online is irrelevant since there is no stock, or very little. Jump online and try to buy one; go to a massive retailer like newegg.com and you will find they are all out of stock. So until you can actually buy a GTX 680, I wouldn't get too worked up about the pricing ;)
From what I've heard, stock levels are especially poor in the US, but in the UK you can actually buy a reference 680 for £420 right now. I searched for the Phantom and it was around £510-530 on pre-order.

No doubt this is a great card considering the noise level and performance, but the price premium isn't worth it IMO. Same with the Phantom 580.
 
According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?
 
According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?

From that chart you're right, and right again; hell, if you get this particular model you'll even produce less heat!

Wow, Nvidia stepped up their game. The 560 Ti has what, 384 cores? For the same power and less heat they've managed to squeeze in an extra 1152 cores?!

That sounds mad to me in the space of a year...
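
To put some rough numbers on the 500W question, here's a minimal headroom sketch; the GPU figures are Nvidia's official TDPs (170W for the GTX 560 Ti, 195W for the GTX 680), while the CPU and "rest of system" wattages are placeholder assumptions you'd swap for your own parts:

[code]
# Crude PSU headroom estimate. GPU numbers are Nvidia's official TDPs;
# everything else is an assumed placeholder -- swap in your own parts.
GPU_TDP = {"GTX 560 Ti": 170, "GTX 680": 195}

def system_draw(gpu, cpu_tdp=95, rest=75):
    """Rough worst-case draw: GPU TDP + CPU TDP + drives/fans/board."""
    return GPU_TDP[gpu] + cpu_tdp + rest

PSU_WATTS = 500
for gpu in sorted(GPU_TDP):
    draw = system_draw(gpu)
    print("%s: ~%d W load, %d W headroom on a %d W PSU"
          % (gpu, draw, PSU_WATTS - draw, PSU_WATTS))
# Both land in the 340-365 W range, so a decent 500 W unit
# has headroom either way -- assuming it delivers its rated wattage.
[/code]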
 
My mobo has 3 PCIe slots, each a double card-width apart. So if I wanted to SLI/CF with a triple-slot card I'd have to use the 1st and 3rd slots, but they would run at x16/x4 instead of x16/x16.
Boards for most (if not all) of the latest chipsets can be found with 3-slot spacing, even the budget ones.
Just so I'm on the same page here... you're talking about dropping £850-1k on cards, but wouldn't consider updating (I presume, judging by the x16/x16 reference) to what is now a budget range of X58 boards? But then, having just bought a 7970, I doubt you'd have any real intention of actually buying a 680... let alone two.

I'm tempted to quote Kyle Bennett at this stage, but it could end up more asterisks than letters, if you know what I mean... and I'm sure you do.
 
Boards for most (if not all) of the latest chipsets can be found with 3-slot spacing, even the budget ones.

Yeah, I'll admit the 3-slot spacing is one of the things I overlooked when buying my mobo, but there are a lot of people out there with only double-slot spacing. And while a new board might not cost that much in relation to two 680s, it's still a pain to change the mobo.

I'm tempted to quote Kyle Bennett at this stage, but it could end up more asterisks than letters, if you know what I mean... and I'm sure you do.
Well, it's a good thing TS's forums are censored. I wonder how many people Kyle has scared off over the years... :mad:
 
Yeah, I'll admit the 3-slot spacing is one of the things I overlooked when buying my mobo, but there are a lot of people out there with only double-slot spacing. And while a new board might not cost that much in relation to two 680s, it's still a pain to change the mobo.
Having had dual (or triple) card setups continually since the advent of nV's 680i, slot spacing is something I always check for. A two-slot gap is generally of no use even with dual-slot cards, since it hinders airflow into the top card and is exacerbated by heat rising from the PCB/backplate of the lower card. Still, two-slot spacing need not be a deal breaker, any more than the lower graphics card blocking front panel/USB/CMOS/Reset headers has been... it's just a matter of lateral thinking. The limiting factor becomes how many expansion slots the chassis has, and the tendency towards 8-11 slot chassis adds a lot of options.
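
As a toy illustration of that lateral thinking, here's a sketch that checks whether a given card/slot/chassis combination physically works; the positions and widths below are made-up examples, not any specific board:

[code]
# Toy fit-check. Slot positions are counted in expansion-slot units
# from the top of the board (hypothetical layout, not a real product).
def fits(slot_positions, card_width, chassis_slots):
    """True if every card clears the next occupied slot and the chassis cutouts."""
    for i, pos in enumerate(slot_positions):
        bottom = pos + card_width                # first slot unit below the card
        if bottom > chassis_slots:               # card overhangs the chassis cutouts
            return False
        if i + 1 < len(slot_positions) and bottom > slot_positions[i + 1]:
            return False                         # card collides with the next one
    return True

# Two triple-slot cards with x16 slots three units apart fit a 7-slot chassis;
# with only two units of spacing the upper card fouls the lower slot.
print(fits([0, 3], card_width=3, chassis_slots=7))  # True
print(fits([0, 2], card_width=3, chassis_slots=7))  # False
[/code]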
Well, it's a good thing TS's forums are censored. I wonder how many people Kyle has scared off over the years... :mad:
Well, I left as opposed to being scared off, but my issue was [H]'s blatant shilling. In this particular instance I would tend to agree with his sentiments: too many people post simply as a form of entertainment for themselves.
 
Judging by these results vs the 6990, a new computer purchaser may as well buy two second-hand 6970s or 6950s and CrossFire them for better scores in most games than the 680 (and it would be quite a bit cheaper with the flood of second-hand ones at the moment), although this will of course use up more slots and power.
 
Nice review. What a beast of a card that is. Can't wait to see 690 stats...

I think there's a mistake in the 2nd paragraph of the first page though: you say the GTX 580 has 48 ROPs and the GTX 680 ramps it up to... 32 ROPs... 16 fewer? :S
 
According to the power consumption chart, it uses less power than my current GTX 560 Ti. Does that mean that if I upgraded to the GTX 680, I could still use my 500W PSU? Or is there something I'm missing?

From that chart you're right, and right again; hell, if you get this particular model you'll even produce less heat!

Wow, Nvidia stepped up their game. The 560 Ti has what, 384 cores? For the same power and less heat they've managed to squeeze in an extra 1152 cores?!

That sounds mad to me in the space of a year...

The cores aren't the same. But I guess the most non-scientific yet somewhat perceivable comparison would be saying the 560 Ti has 768 cores compared to the 1536 of the 680, since Fermi's shaders ran at twice the core clock.

But again, it's not apples to apples.
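
To make that concrete: Fermi's shaders ran at twice the core clock (the "hot clock") while Kepler's run at 1x, so a rough normalization looks like the sketch below. "Kepler-equivalent cores" is just an informal yardstick here, not an official metric:

[code]
# Fermi shaders run at 2x core clock; Kepler shaders at 1x.
# Normalizing by that multiplier gives a fairer on-paper comparison.
cards = {
    "GTX 560 Ti (Fermi)": (384, 2),   # (cores, shader clock multiplier)
    "GTX 680 (Kepler)":   (1536, 1),
}
for name, (cores, hot_clock) in sorted(cards.items()):
    print("%s: %d cores -> ~%d Kepler-equivalent" % (name, cores, cores * hot_clock))
# 384 x 2 = 768 vs 1536: roughly a 2x shader-throughput gap on paper,
# before clock speeds and per-core efficiency enter the picture.
[/code]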
 
My friend had two GTX 580s running in SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication of how much more capable these cards are getting. I know the 680 is supposed to be a beast at all of the newest visual settings, like DX11 and ubersampling. I want someone to prove it.
 
The review stated the AMD cards were tested with the 11.13 drivers; newer drivers have improved performance, so that may be holding the AMD cards back somewhat. Why use such old drivers? Second, since Nvidia uses boost all the time, using OverDrive and upping the frequencies for the Radeons (which is part of the driver set) would show that the Radeons have a lot of untapped overhead. For example, most HD 7970s can overclock to the maximum OverDrive settings of 1125/1575, while some fall slightly short of that, without having to use a utility to raise the GPU voltage. Anyway, concluding that the 680 is X amount faster seems short-sighted without using the full capability of the HD 7970 via options in the drivers. A number of reviews show that when both the 680 and the 7970 are maxed out, they perform virtually the same at higher resolutions.

The review had a good broad base of games, but lacked any second pass of actually playing them to check for stuttering, rendering problems, etc., which may not be caught by running a pure benchmark. Also, no GPU compute benchmarks were done for those that go beyond gaming; there the 7970 has a significant lead over the 680.
 
My friend had two GTX 580s running in SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication of how much more capable these cards are getting. I know the 680 is supposed to be a beast at all of the newest visual settings, like DX11 and ubersampling. I want someone to prove it.
You answered your own question. SLI'd GTX 580s to get 23 fps? What's the point of ramming the game's image quality up so high it becomes unplayable? Is this indicative of real-world gaming?
Does 15-16 fps hold anything more than academic interest for anyone... or are you the kind of person that prefers playing a semi-interactive slideshow?
[image: GTX-680-76.jpg]
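
As a rough illustration of why ubersampling hurts so much: it is essentially in-engine supersampling, so frame rate falls roughly in proportion to the extra samples rendered per displayed pixel. The factors and base fps below are illustrative assumptions, not CD Projekt's documented multipliers:

[code]
# Supersampling cost sketch: assumes frame time scales with the number
# of samples shaded per displayed pixel (the fill-rate/shader-bound case).
def estimated_fps(base_fps, sample_factor):
    """Crude estimate of fps after multiplying per-pixel sample work."""
    return base_fps / float(sample_factor)

base = 60.0  # hypothetical fps with ubersampling off
for factor in (1, 2, 4):
    print("%dx samples: ~%.0f fps" % (factor, estimated_fps(base, factor)))
# At a 4x-style factor even 60 fps collapses to ~15, which is why
# SLI'd 580s dipping into the low 20s isn't surprising.
[/code]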

Second, since Nvidia uses boost all the time, using OverDrive and upping the frequencies for the Radeons
It's a stock vs. stock review: performance out of the box.
the Radeons have a lot of untapped overhead. For example, most HD 7970s can overclock to the maximum OverDrive settings of 1125/1575, while some fall slightly short of that, without having to use a utility to raise the GPU voltage.
Cool. It's still not an OC vs. OC review, and as you've noticed, overclocks tend not to be guaranteed. Here's a GTX 680 at 1204 core, 1257 boost and 7024 memory: http://www.kitguru.net/components/graphic-cards/zardon/asus-gtx680-graphics-card-review/20/ Maybe not representative, but still a valid benchmark candidate... and it certainly would stack up well against the same site's HD 7970 OC'd to 1015 core / 5600 memory: http://www.kitguru.net/components/graphic-cards/zardon/amd-hd7970-graphics-card-review/21/
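
For what it's worth, the percentage headroom those KitGuru samples represent works out as below; the stock clocks used as the baseline are the official reference specs (1006 core / 6008 effective memory for the GTX 680, 925 / 5500 for the HD 7970):

[code]
# Percent overclock implied by the clocks quoted above, vs reference spec.
def uplift_pct(oc, stock):
    return (float(oc) / stock - 1) * 100

samples = [
    ("GTX 680 core",   1204, 1006),
    ("GTX 680 memory", 7024, 6008),
    ("HD 7970 core",   1015, 925),
    ("HD 7970 memory", 5600, 5500),
]
for name, oc, stock in samples:
    print("%s: %d vs %d MHz -> +%.1f%%" % (name, oc, stock, uplift_pct(oc, stock)))
# Roughly +20%/+17% for that 680 sample vs +10%/+2% for that 7970 sample --
# single cards though, so treat it as anecdote rather than data.
[/code]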
 
The driver number was a typo; it was meant to be 12.3 and I have now fixed that, thank you. Also, most of the tests are run manually, so we would notice any rendering issues.

My friend had two GTX 580s running in SLI and he tried to play The Witcher 2 with ubersampling on. The intro video dropped as low as 23 fps. Why don't review sites test The Witcher 2 with ubersampling on? That would give a really good indication of how much more capable these cards are getting. I know the 680 is supposed to be a beast at all of the newest visual settings, like DX11 and ubersampling. I want someone to prove it.

Percentages are percentages, my friend.
 
I just tried The Witcher 2 with the Ultra preset (ubersampling on) and my fps was between 25-40 during actual gameplay at 1920. For some reason I thought I'd do better.
 