Nvidia GeForce GTX 480 SLI vs. ATI Radeon HD 5870 Crossfire

I know the power requirements would be insane, but overclocking these GTX 480s to 850-932MHz would yield 15-20% more performance! These GTX 480s do overclock well and, per clock, show bigger real-world gains than an ATI overclock. My overclock on water cooling is an insane 932MHz core / 1864MHz shader on my GTX 480! I wonder how two GTX 480s in SLI at 932MHz core / 1864MHz shader would perform! I would also have liked to see both the ATI and Nvidia cards at 1920x1080. I think I will start turning off my computer when I'm not home or am in and out all day!
 
dividebyzero said:
BTW: Using the 2GB card also throws off the pricing comparison, and if the review were using non-reference 5870s then surely non-reference GTX 480s should be used as well (pretty exorbitant at $520... but then again so is this). The Zotac card also runs quieter and cooler, and even though it's overclocked it uses virtually the same wattage as the reference model.
I agree that it would have been nice to see the highest-end setups. However, since it's been shown elsewhere that 2GB vs. 1GB can make a real difference in certain games, that's more important than the effect of clock speed, which can be interpolated from existing results, or cooling, which is just a factoid. If the games where the Radeon fails miserably (as a single card) work much better with 2GB, then there'd be a better point of comparison.
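To illustrate what I mean by interpolating from clock speed, here's a rough sketch (the 700MHz figure is the GTX 480's reference core clock; the 0.7 scaling factor is purely an assumption, since real gains are rarely linear):

```python
# Back-of-the-envelope estimate of overclocked performance from a stock
# benchmark result, assuming performance scales at some fraction of the
# core clock increase. Real gains are usually well below linear because
# of memory bandwidth and CPU limits.

def estimate_oc_fps(stock_fps, stock_clock_mhz, oc_clock_mhz, scaling=0.7):
    """Interpolate FPS at an overclocked core clock from a stock result."""
    clock_gain = oc_clock_mhz / stock_clock_mhz - 1.0
    return stock_fps * (1.0 + clock_gain * scaling)

# Example: a GTX 480 scoring 60fps at its 700MHz reference clock, pushed to 850MHz:
print(round(estimate_oc_fps(60, 700, 850), 1))  # ~69 fps, i.e. roughly a 15% gain
```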
 
I know the power requirements would be insane, but overclocking these GTX 480s to 850-932MHz would yield 15-20% more performance! These GTX 480s do overclock well and, per clock, show bigger real-world gains than an ATI overclock. My overclock on water cooling is an insane 932MHz core / 1864MHz shader on my GTX 480! I wonder how two GTX 480s in SLI at 932MHz core / 1864MHz shader would perform!
Running overclocked SLI'ed GTX 480s stacked on top of one another (no one-slot gap) on air is probably asking for trouble, or at the very least throttling.
Not a lot of point running OC'ed cards in any case, since there's no guarantee that any reader/user could reach the same clocks and duplicate the gains; it's unlikely that any two cards will overclock to the same extent.

@ET3D
Whether or not the price premium is worth any extra performance is probably down to the individual and what price they put on any increase.
I linked to Anand's 1GB v 2GB review in an earlier post, which shows negligible (in my opinion) gains. Obviously for some people a couple of extra fps here or there justifies the $100 (25%) price premium over the 1GB card. Strangely enough, I'd wager that a good percentage of those same people would denounce the GTX 480 at the same price point even though it offered a better performance boost. No accounting for the foibles of consumerism or brand allegiance.
Hardware Canucks also had similar results: basically an increase of approximately 3fps over the standard card. Bear in mind that the card is overclocked, which accounts for most of that increase.
The only game I've seen where 2GB makes any tangible difference is Metro 2033... but again, applying AA brings the card back to DNF status.

What remains untested is the possibility that CF'ed 2GB HD 5870s offer a substantial gain where the single card does not. I personally can't see a large turnaround in fortune, although from my recollection there have been no 2GB CF vs. GTX 480 SLI reviews. The closest approximation would be Sapphire's HD 5970 4GB Toxic vs. GTX 480 SLI, which isn't quite an apples-to-apples comparison even with the Toxic's OC. The review is a little dated in any case, given that it predates the 25x.xx nVidia driver release (as well as Cat 10.5/10.6).
 
Observations...
1. What happens if you turn off 10 lights in your house when gaming?

Not sure, what time of day is this? If I'm aged 6-12 the bogeyman might come to get me if it's too late, or maybe the 10 lamps I have off will help give Al Gore more money for his "environmentally friendly" campaign.

Just saying...I lol'd at your number one.
 
Hey, I'm an Nvidia owner of late... an 8800 GT, then I just switched to a GTX 260 OC the other day (late upgrader, as there was no reason to upgrade till BFBC2). Anyway... just wondering how the 5870 x 1 was only capable of 0.0 fps in Metro 2033? Seems a bit weird. Did you guys get the drivers installed and everything working? Was there a specific bug encountered? Surely 0.0 fps means the game crashed as soon as you tried to run it, so wouldn't it be better to say that in the review than post a graph showing fps along the y-axis as a line along the bottom?

Anyway, I'm going to wait till the GTX 460 or whatever comes out, and maybe even consider waiting till Nvidia's second-gen DX11 parts before upgrading. There are no major DX11 games worth playing (yes, I know BFBC2 has DX11), so I don't think there'll be a need till next year.
 
Wow, look who's quick to throw out the term "fanboi"... fanboi.

nVidia loses this time, again. Bankruptcy is just around the corner. It's just that Nvidia fanboys can be quite dull sometimes... indeed, it's hard to swallow the reality pill.
 
While a 700W power supply was not used for the review as the article originally said, I'd just like to point out that Steven measured the power consumption of the PSU on the AC side (i.e. with a Kill A Watt or similar device that you plug the power cord into).

So to get the power consumption of the actual computer parts you need to subtract the (in)efficiency of the PSU; most modern power supplies are around 80% efficient...
So the power draw of the system is roughly:
787W × 0.8 ≈ 630W

So a good quality 700W PSU would have been enough, though it's not recommended, since it would be quite heavily loaded...
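For reference, a minimal sketch of that arithmetic (the 80% efficiency and the 787W wall reading are the figures from the post above; the function names are just illustrative):

```python
# A wall meter reads AC draw, so the DC load on the components is the AC
# figure multiplied by the PSU's efficiency. 80% is the rough efficiency
# assumed above; real units vary (80 PLUS Bronze/Gold/etc. are rated higher).

def dc_load_watts(ac_draw_watts, efficiency=0.80):
    """DC power actually delivered to the components."""
    return ac_draw_watts * efficiency

def psu_load_fraction(ac_draw_watts, psu_rating_watts, efficiency=0.80):
    """How heavily a PSU of a given DC rating would be loaded."""
    return dc_load_watts(ac_draw_watts, efficiency) / psu_rating_watts

print(dc_load_watts(787))                      # ~630W of DC load
print(round(psu_load_fraction(787, 700), 2))   # ~0.9 -> a 700W unit runs at ~90% load
```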
 
I run two GTX 470s in SLI mode. Even at an eight-foot distance, I considered the fan noise to be obtrusive. In regular desktop usage, they would be outright loud. (I am talking about noise levels under load in more demanding games.)

So for anyone who's not hard of hearing, you need to spring another $350 for water cooling, which is what I did. (No, a pair of headphones won't spare you the noise.)

That said, SLI does scale better than CF, though the numbers I've seen before weren't this drastic.

Anyway, all the fanboy vs. fanboy nerd wars are irrelevant. Judging from Fermi availability (plentiful), ATI's prices (still above MSRP), and the most recent market share numbers, Nvidia has lost this round decisively. The chip just doesn't scale well and runs too hot, requiring a loud stock cooling solution.
 
The most ridiculous part of the comparison is that the bloody ATI cards were released over 9 months ago. Wait 9 months and then benchmark the new ATI cards against the old hot-box 480; Nvidia fans will be QQ'ing hard over their overpriced bar heaters.
 
Haha, yeah, the poster above is correct. Nvidia fans are just salivating over technology that can finally outperform a 9-month-old card.
 
Who cares if it is 9-month-old tech? It is their tech that is out today. Do they have anything newer to bench against? That's what I thought.
 
Who cares if it is 9-month-old tech? It is their tech that is out today. Do they have anything newer to bench against? That's what I thought.



...Tonight we have a special guest who will perform a one-man argument.
 
ATI drivers have always been bad. Even if Nvidia had slower cards I would go Nvidia, because I know it will work in both new and old games. ATI drivers have problems in new games, and sometimes it takes them over 9 months to correct them; by that time people have moved on to other games, again with new ATI driver problems.
 
ATI drivers have always been bad. Even if Nvidia had slower cards I would go Nvidia, because I know it will work in both new and old games. ATI drivers have problems in new games, and sometimes it takes them over 9 months to correct them; by that time people have moved on to other games, again with new ATI driver problems.


"Danger Will Robinson!...Danger"
 
You should know the perils of drivers by now, red...
True story! I just updated my nVidia driver to 196.75 AND updated my Creative Sound Blaster driver... next thing you know, I'm a widower, the dog has lost his eyesight, a passerby suffered shrapnel wounds, and I'm the subject of an upcoming heart-rending Oprah special... and my house... well, just look at it!
[attached image: nVidia_driver_problem.jpg]
 
^^^^^^:haha::haha::haha: ROFL

Well, at least it was an Nvidia driver you downloaded there, Chef, so the insurance adjuster will be there tomorrow. Had it been an ATI driver, they wouldn't be there for nine months.
 
I love ATI's hardware but I can never buy their cards, as the games I play... my MMO of choice is City of Heroes, which is over 5 years old now, and ATI is still having driver problems with it. I just like my stuff to work, so until ATI steps up the driver support I will always be forced to buy Nvidia, and that sucks, as they charge for that extra stability their drivers give a gamer.
 
Observations...
1. What happens if you turn off 10 lights in your house when gaming?
ii. Don't know about Argentina... but in NZ 1kWh is $0.16-0.21 (I pay $0.17), so on a generous computer/gaming session the differential could be 3hrs idle @ 180W (540Wh) + 5hrs gaming @ 196W (980Wh) = 1.52kWh/day × 30 days × NZ$0.17 = NZ$7.75/mo (US$5.51/month, or 21.67 pesos)... assuming I run the system 8hrs a day, every day.
c. When did ATI enthusiasts suddenly become eco-crusaders? It was certainly after the R600 went EOL... must be because global warming hadn't been invented in 2007!
§. The max power draw for both the 5870 and 480 is usage under FurMark... You spend a lot of time playing FurMark, gr? So, when all's said and done, you're basically saying 800 watts of system usage = bad, while 600 watts of system usage = acceptable?
Here are some tables showing the relative power draw for both cards at idle, normal 3D usage, maximum, and Blu-ray playback. You can breathe easy... the world's demise has been put back two weeks.
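For anyone who wants to sanity-check the arithmetic, here's a minimal sketch of the calculation above (the 180W/196W differentials, the 3h idle / 5h gaming session and the NZ$0.17/kWh rate are the figures from the post; the function name is just illustrative):

```python
# Monthly electricity cost of a daily usage pattern; the watt figures can be
# absolute draw or, as in the post above, the difference between two setups.

def monthly_cost(idle_hours, idle_watts, gaming_hours, gaming_watts,
                 price_per_kwh, days=30):
    """Cost per month for the given daily idle/gaming hours and wattages."""
    daily_kwh = (idle_hours * idle_watts + gaming_hours * gaming_watts) / 1000.0
    return daily_kwh * days * price_per_kwh

# 3h idle + 5h gaming per day at the quoted differentials, NZ$0.17/kWh:
print(round(monthly_cost(3, 180, 5, 196, 0.17), 2))  # ~7.75 NZ$ per month
```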

I never said ATI enthusiasts were eco-crusaders, but some people here say that the difference between one and the other's power consumption isn't relevant, so I made a power usage comparison just to show the effect in a simpler way. I wasn't talking about how expensive your bill would be. Bad? Acceptable? Did I mean that?
 
Bad? Acceptable? Did I mean that?
Who knows, gr, although I'd equate it with "not so environment-friendly", as in...
790W - 590W = 200W; 200W / 20W = 10... ten 20W CFLs... under load... not so environment-friendly...
...and since you are pulling the "environment-friendly" card, I'll tar you with the same brush that seems to pervade the AMD-nVidia fanboy debate; namely, that one side will use whatever argument best fits their narrow-minded, bandwagon-jumping pseudo-ideology... so for nV*****s it's red team... "drivers suck / doesn't have CUDA or PhysX / crappy minimum framerates / bad Linux support", yada yada... and for the Association of Mental Defectives it's green team... "late / bad architecture / too hot to handle / power hog / expensive", etc., etc.
So in this case we have the trite analogy of the power-saving equivalent of ten lightbulbs while gaming, in relation to a graphics system that costs upwards of US$800... you may as well be referencing CO2 emissions to John Force and Tony Pedregon.
That is to say that power savings (or usage) come way, way down the list of priorities for people running multi-GPU enthusiast graphics; likewise, how many people with enthusiast motherboards and graphics cards laden with extra copper and gold give a damn about environmental pollution from the mining of those metals?
 
Think before you troll...

I love ATI's hardware but I can never buy their cards, as the games I play... my MMO of choice is City of Heroes, which is over 5 years old now, and ATI is still having driver problems with it. I just like my stuff to work, so until ATI steps up the driver support I will always be forced to buy Nvidia, and that sucks, as they charge for that extra stability their drivers give a gamer.

This of course begs the following questions:
I love ATI's hardware but I can never buy their cards

Hmmm, if you never buy their cards... how do you know? And if they don't work, why would you "love them"? Don't get me wrong... it's an interesting perspective, 'cuz most people have no desire to purchase items that don't work properly.

and ATI is still having driver problems with it.

Boilerplate alert! It may interest you to know that millions are gaming on ATI cards and not having the problems you are on your GeForce 6200.

I just like my stuff to work

Gosh... so do I, darn it!
I will always be forced to buy Nvidia,

Oh dear! Don't ya hate that?
for that extra stability their drivers give a gamer.

This one was C&P'd right off of the Nvidia web site, was it?
And of course, silly! We all love the creamy goodness of that "extra stability" that they give unto the gamer... yikes :rolleyes:

You realize of course there is an 'off topic' forum, right? You could maybe talk about how much you really want a Ford, but gosh darn it! The wheels fall off.
 
Who knows, gr, although I'd equate it with "not so environment-friendly", as in...

...and since you are pulling the "environment-friendly" card, I'll tar you with the same brush that seems to pervade the AMD-nVidia fanboy debate; namely, that one side will use whatever argument best fits their narrow-minded, bandwagon-jumping pseudo-ideology... so for nV*****s it's red team... "drivers suck / doesn't have CUDA or PhysX / crappy minimum framerates / bad Linux support", yada yada... and for the Association of Mental Defectives it's green team... "late / bad architecture / too hot to handle / power hog / expensive", etc., etc.
So in this case we have the trite analogy of the power-saving equivalent of ten lightbulbs while gaming, in relation to a graphics system that costs upwards of US$800... you may as well be referencing CO2 emissions to John Force and Tony Pedregon.
That is to say that power savings (or usage) come way, way down the list of priorities for people running multi-GPU enthusiast graphics; likewise, how many people with enthusiast motherboards and graphics cards laden with extra copper and gold give a damn about environmental pollution from the mining of those metals?


This is the best post concerning fanboydom I've seen, ever.
 
Wow! 28 fps at max detail @ 8xAA in DX10... that's AWESUM... how about including a (missing) link? ...and since that's NOT going to happen, I'm calling BS.


FYI, crossfired 5870s would have trouble posting 28fps at 4xAA, let alone 8xAA.
 
I don't know what the fuss is about the 0 fps results; that just means the game didn't run properly with those settings, likely due to a driver bug (as described in the article's text). We then subsequently tested with settings that worked fine.

In the case of Crysis Warhead that meant disabling anti-aliasing completely at 2560x1600. See the last test in that section.
 