Nvidia GeForce GTX 480 SLI vs. ATI Radeon HD 5870 Crossfire

June 23, 2010, 7:10 AM
Recently we revisited the battle between the high-end Nvidia GeForce GTX 400 and ATI Radeon HD 5000 graphics card series using updated drivers and a new testing method that saw us abolish all timedemos. That comparison was limited to single GPU testing at what we consider to be the mainstream screen resolution for high-end PC gaming.

But what about those amongst you who like to take things to the next level, gaming at 2560x1600 on 30-inch displays and beyond? For extreme users who do not necessarily care about value, power consumption or even heat... who offers the best gaming solution? This question leads us to a new showdown between ATI and Nvidia, only this time pitting a pair of GeForce GTX 480 graphics cards in SLI against a pair of Radeon HD 5870s in Crossfire.


For many of us the prospect of spending $400 on a Radeon HD 5870 graphics card seems a little crazy, let alone the $500 Nvidia is asking for the GeForce GTX 480. So it goes without saying that those willing and able to purchase two of these mighty graphics cards are in the minority. Still, such configurations do exist and those of you looking to go down that path will be interested in checking out our findings.

Read the complete review.




User Comments: 110

LinkedKube, TechSpot Project Baby, said:

Ohhhh no's, here come the 6-page rants about nvidia power consumption.

Steve, TechSpot Staff, said:

I think the conclusion sums up the power consumption story pretty well. Yes the power consumption is high, but guys looking to purchase two of the world's fastest graphics cards probably don't care.

Guest said:

Would be interested to see a HD 5850 Vs GTX 470 in dual card ...

LinkedKube, TechSpot Project Baby, said:

I think the conclusion sums up the power consumption story pretty well. Yes the power consumption is high, but guys looking to purchase two of the world's fastest graphics cards probably don't care.

95% of the people that go on those rants don't buy the cards anyway. If you're willing to pay 800+ USD for SLI/Crossfire, 2 more dollars a month on your energy bill shouldn't kill you too bad.

Guest said:

While I don't go for the highest end cards anymore, I do spend $300 or so on one. I never cared about the power they took or the heat they put out. If the heat does not hurt my card, and I can get it out of my case so as not to harm my other components, heat is a non-issue.

I run my A/C at 73 degrees in the summer months, 24x7, so my computer room is always cold, and a little more power for the cards doesn't bother me. I do try to turn off my lights when not in use, but I am not so power-conservative around the house as to worry about an extra 300 watts 24x7.

Also, I leave my PC on 24x7, as well as my son's and my wife's, including an HTPC. I assume those that scream about the power it takes and the heat it puts out also shut their PC down after 15 minutes of non-use, leave the A/C in summer on 78 and the heat in winter on 68, and always turn off home theater equipment when not using it, etc. If they don't do all those things, then why complain about the power/heat these cards put out?

If you want the highest end you can get, power, cooling and price are non-issues, PERIOD.

ET3D, TechSpot Paladin, said:

Obviously AMD needs to work on Crossfire. If these are the results after a driver that was supposed to increase Crossfire performance...

BTW, it would have been interesting to see how 2GB Radeon cards fare. I know they're not the norm, but again, people who buy this kind of stuff would go for them if the extra memory makes a difference, and I think it would in some cases.

Ritwik7, TechSpot Chancellor, said:

I had been of the opinion that CrossFire scales better than SLI. This review seems to change that conception.

Guest said:

This is the most biased and flawed pro Nvidia article I have read for a while, thanks for the laugh.

dividebyzero, trainee n00b, said:

The Asus P6T in the picture looks remarkably like a Gigabyte 890FXA-UD7....is this Asus' response to tougher competition?

Guest said:

Not a fanboy, but was on the ATI bandwagon for the single card review. Can't remember my password to login though...

Anyway, not for me anymore, but clearly Nvidia is the way to go if you can/want to fork over an extra $1500 for top performance (including a good PSU). 20% is my personal threshold to jump on improvements, and that appears to be the bottom line here. Hats off to Nvidia for the top fuel crown.

Still, you're talking 2-5% of the total market. Nvidia has to come up with a better mainstream product or the dollars will shrink up soon.

ikenlob

LNCPapa said:

Even though it makes sense that a person willing to spend this kind of money on video cards will likely have a 30" monitor, I'd really love to see these benchies run at 1920 just for the sake of the numbers.

Guest said:

The performance gain is clearly there for the high-resolution extreme gamer who doesn't care about cost, power usage, heat and noise. That's great, no doubt about that. Question is, how much of a market is there for something like that? I'm having a hard time seeing the business sense in that. It must have cost a fortune to develop Fermi and it certainly isn't cheap to produce either, yet it's only really at the extreme that Fermi shines. For the average gamer who is concerned about factors other than pure raw performance, ATI still seems like a better choice right now.

Guest said:

Nvidia Corporation probably isn't happy with the power consumption and heat of the 480 GTX, but their hardware and software engineers deserve credit for the sheer performance of the card. Now they are in a comfortable position to consider their options. Maybe the big, powerful style of GPU design has reached its limit. That's not something to be ashamed of.

Guest said:

SOMETHING IS FISHY ABOUT THIS ARTICLE. First of all, no one would use 2x HD 5870s; they would take an HD 5970, or even two HD 5970s. You could show us the strongest GPUs from ATI and Nvidia in a fight (GTX 480 x2 vs. HD 5970 x2); then it would be interesting.

grvalderrama said:

Guest said:

SOMETHING IS FISHY ABOUT THIS ARTICLE. First of all, no one would use 2x HD 5870s; they would take an HD 5970, or even two HD 5970s. You could show us the strongest GPUs from ATI and Nvidia in a fight (GTX 480 x2 vs. HD 5970 x2); then it would be interesting.

The challenge is between single-processor video cards. It wouldn't be fair to compare a quad configuration vs. an SLI configuration.

Guest said:

A comparison of high-end video cards? I don't think so, as you should've tested the 480 -- or higher -- against the 5970; no one ever seems willing to do so, but they still state it's a battle of the high-end video cards!

Guest said:

Yeah, me too!

Guest said:

What's the difference? You're not buying a GPU, you're buying a video card.

Guest said:

Cypress @ 850 MHz (vec5, TU, ROP)

vs

Fermi-480: GPC domain (+ TU inside) @ 1401 MHz and ROP @ 700 MHz

Think about it...

TomSEA, TechSpot Chancellor, said:

Wow...you don't often see a hands-down winner like this when comparing cards. Very interesting and a bit of a surprise.

So how many cases did you guys melt in doing this comparison?

Guest said:

HURR DURR FASTER CARDS USE MORE POWER

Great article

Guest said:

I hope all the PC guys who keep knocking console graphics as super inferior to PC get these cards in this setup. Otherwise... let only the extreme PC gamers knock us console users; they are the ones who are truly experiencing graphics as they ought to be and frame rates as intended.

The next time any PC gamer other than the "extremist" is gonna rant about console graphics being inferior... I say "STFU"!!! 'Cause your setup is not the best either.

EXTREMIST.. FTW !!!! then consoles...lol

BMfan said:

SOMETHING IS FISHY ABOUT THIS ARTICLE. First of all, no one would use 2x HD 5870s; they would take an HD 5970, or even two HD 5970s. You could show us the strongest GPUs from ATI and Nvidia in a fight (GTX 480 x2 vs. HD 5970 x2); then it would be interesting.

How can you say no one would buy two HD 5870s?

I would buy 2x 5870s over an HD 5970.

Great review.

Thanx

princeton said:

Guest said:

This is the most biased and flawed pro Nvidia article I have read for a while, thanks for the laugh.

This is the most biased and flawed pro ATI comment I have read for a while. Thanks for the laugh.

dividebyzero, trainee n00b, said:

Even though it makes sense that a person willing to spend this kind of money on video cards will likely have a 30" monitor, I'd really love to see these benchies run at 1920 just for the sake of the numbers.

[link]

A reasonable assumption would also be to forgo an expensive IPS 2560x1600/1440 panel and use three cheaper monitors in a 5760x1080 display. Most of the interest in these cards SLI'ed that I've seen is leaning towards surround gaming, which requires two (or more) cards since nVidia cards can only support two monitors each.

Just as a side note... regarding GTX 400s stacked in consecutive PCIe x16 slots on a board: I, like nVidia, wouldn't recommend the practice. Using a bog-standard P6T is a great way to inflate temps - not that the cards need any encouragement in that area. I think I'd be looking at a board that can separate its two primary graphics slots with at least two PCI/PCIe x1/blank slots, such as the

P6T Deluxe/P6TD, P6T6 WS, P6T7 WS, R3E, M3E, P7P55D/P7P55D-E Deluxe/Premium or Sabertooth i55 when it comes to Asus products (amongst others I'm sure).

ebolamonkey3 said:

I have a question: how did you run the tests using that 700W power supply??!

Guest said:

Not a lot of 5870 CF vs. GTX 480 SLI benchmarks around; thank you for taking the time to get one up :)

I just recently built my new rig; I currently have:

Asus Maximus Rampage Extreme III X58

Core i7 930 @ 4.2 GHz on air 24/7, max load 70C with Hyper-Threading, 60C without Hyper-Threading, on a Noctua NH-D cooler

GTX 480's SLI

6 Gig G.Skill Ripjaw triple channel @ 1600 mhz

OCZ agility 120 GB SSD for main Drive

OCZ agility 120 GB for Games

2 x 1TB WD Caviar Black in Raid 0 for Video Capture

Spedo Advanced Case, lot of fans to keep things cool.

Both cards reach about 90C max on a hot day @ 100% load. I have them in PCI Express slots 1 & 3 to give them breathing room; they idle at about 48C for both. (They replaced two 4890s in CF which idled @ 60C and maxed out at 85C under load.)

So I don't see what the big fuss is all about with those temps. Oh, and my 4890s used to crash quite often in the newer games, e.g. Borderlands, Dragon Age, Mass Effect 2.

I agree that the power requirements are much higher, but the performance they deliver makes up for at least some of that cost. I'm using an Ultra X4 1050W power supply, which runs the system solid.

dividebyzero, trainee n00b, said:

I have a question: how did you run the tests using that 700W power supply??!

Timonius said:

ebolamonkey3 said:

I have a question: how did you run the tests using that 700W power supply??!

Ah yes, perhaps you can answer the good old power supply mystery.

Guest said:

3GB SLI vs. 2GB Crossfire: Which is King?

There is something seriously wrong here. Picking a 1GB card for a test of 2560x1600 with anti-aliasing in Crossfire? Sorry dear, but that's just silly, and that makes this review pretty pointless. You should at least have picked one (two) of the 2GB 5870 versions, when going for the higher res with AA.

All this review really tells me is that 3GB is better than 2GB in dual configurations at 2560x1600 with AA. You could have done that with a one-liner and a single graph.

Nice design and colours in the graphs.

Julio Franco, TechSpot Editor, said:

I have a question: how did you run the tests using that 700W power supply??!

Good catch. The PSU info was incorrect indeed; we will update the specs as soon as Steve comes back online (he lives on the other side of the globe).

dividebyzero, trainee n00b, said:

3GB SLI vs. 2GB Crossfire: Which is King?

That's probably been established

There is something seriously wrong here.

Fanboy using the Guest account....check

Imminent whinging over Red Team card not "winning" the fps battle...check

No, I'd say we're right on track so far

Picking a 1GB card for a test of 2560x1600 with anti-aliasing in Crossfire?

So, while CF'ed reference 5870s CAN run games at 2560x1600, as shown in the review (and a few others available online), your quibble is that CF doesn't WIN.

Question: Wasn't one of the main promotional tools used to sell this card the Eyefinity technology (i.e. 5040x1050 +) ?

You should at least have picked one (two) of the 2GB 5870 versions, when going for the higher res with AA.

So....the review should use the non-reference AMD cards (Eyefinity6/Toxic, overclocked)

Well guess what....

[link]

[link]

[link]

[link]

[link]

Which covers five of the six games tested.

BTW: Using the 2GB card also throws out the pricing comparison, and if the review were using non-reference 5870s then surely non-reference GTX 480s should be used also (pretty exorbitant at $520... but then again so is this). [link] in comparison to the reference model.

All this review really tells me is that 3GB is better than 2GB in dual configurations at 2560x1600 with AA.

Since 4GB of HD 5870 falls to 3GB of GTX 480, it should tell you that:

1. SLI on GTX 480s scales really well

2. The GTX 480 suffers only a minimal performance hit under heavy AA - unlike its nVidia predecessors

3. A 2GB HD 5870 still uses the same 256-bit memory bus as the 1GB card

4. Maybe AMD needs to tweak/redesign Cypress to compete at extreme IQ settings (hellllllooooooo Southern Islands)

You could have done that with a one-liner...,

So could I...but I've given up swearing for Lent

Relic, TechSpot Chancellor, said:

Good read, just wish I were in the top echelon of enthusiasts so I could give something like that a shot =D.

Guest said:

The next time any PC gamer other than the "extremist" is gonna rant about console graphics being inferior... I say "STFU"!!! 'Cause your setup is not the best either.

Mid-range systems can still outdo consoles, so your point is moot. Console gamers are the ones that claim you need a behemoth of a system worth thousands of dollars to properly game on, when in reality you don't.

Guest said:

105 degrees in SLI? WTF. In my opinion nvidia just overclocked it themselves to gain an advantage over AMD; look at the cooler and its 4 heat pipes. They're just pre-overclocking to up performance and power consumption.

The single cards' performance is identical when comparing the 5870 and the 480, and the 5870 is much older now. AMD FTW for a single card. Although I do agree the scaling of Crossfire is poor for AMD compared to Nvidia (not for the 5770 range though, which goes up to 80% consistently), and a 5970 should be looked at instead (probably why they released it in the first place).

summary

- 105 degrees, wtf

- single cards are identical in performance but nvidia just o/c the crap out of it with a massive cooler.

- amd crossfire fails on the 5870

- a 5970 should be considered instead.

red1776, Omnipotent Ruler of the Universe, said:

Guest said:

The next time any PC gamer other than the "extremist" is gonna rant about console graphics being inferior... I say "STFU"!!! 'Cause your setup is not the best either.

you forgot "my dad's bigger than your dad"

This is great! I've often thought we don't hear nearly enough from the 6-12 year old demographic.

LinkedKube, TechSpot Project Baby, said:

you forgot "my dad's bigger than your dad"

This is great! I've often thought we don't hear nearly enough from the 6-12 year old demographic.

I was thinking I must be a pretty smart 12 year old. When I read that comment earlier I thought to myself, "You've already lost the race by the time UPS shows up at your door with your purchase." Just as quickly as we upgrade, there's another announcement.

Anyway, I think ATI needs to get on the ball by the end of this year, although I'm sure they could make their cards pump out more power and more heat. ATI seems a little more practical to most people, while the nvidia fanboys wait in line to get hit by the next big black and green nuclear reactor they can get their hands on.

Guest said:

@ Guest who said the 5970 should be considered instead... are you a bonehead? The Radeon HD 5970 is slower than a pair of Radeon HD 5870s and that is why it is cheaper. Given the option you should always go for a pair of Radeon HD 5870 Crossfire cards when two slots are available.

As for all the guys jumping up and down about them not using 2GB cards, do some research. For the most part you get a frame or two more, hardly worth the massive increase in price.

Guest said:

Yea, it's all a conspiracy that the Nvidia setup ripped the 5870 setup: cherry-picked cards, mass OCs, etc. etc. No, they should not be compared to a 5970. I remember all you ATI fans complaining back in the day when nV had SLI on a stick, and all I heard was "you can't compare, you can't compare!!!!" Now all of a sudden things have changed, I guess. The 5970s are a load of problems anyway in Crossfire, and people are pissed that they do not work as advertised. I think the 5870 Crossfire was the best choice to bench against. You could bench with the 2 gig versions, but then you would lose out on the $ side and the performance side. By the way, they will beat the 2 gig models too. Of course the benchies would become closer, but not as much as you would like to think.

By the way, I have owned ATI cards the last 3 years, so no, I am not a fanboy. You guys just get ridiculous when your card gets trashed. Always some deep antics going on behind the scenes with you guys. Did you not expect that something that uses more power, more heat and more money would smoke the 5870s?

Guest said:

A 5% performance gain with a 20% heat and cost increase isn't smoking anything except your computer box.

^^ single cards

hellokitty[hk], I'm a TechSpot Evangelist, said:

I had been of the opinion that CrossFire scales better than SLI. This review seems to change that conception.

I'm impressed by that 90% scaling.

LinkedKube, TechSpot Project Baby, said:

Just 5%, did you look at the charts at all?

red1776, Omnipotent Ruler of the Universe, said:

A 5% performance gain with a 20% heat and cost increase isn't smoking anything except your computer box.

^^ single cards

....I think you forgot to carry the 1...

Guest said:

OMG, I can't stand people, especially fanbois, who would just say anything to justify their brand preference or purchase.

@ people saying it's a biased, unfair comparison and "they should compare it to the 5970":

Looks like many ATI fanbois are butt hurt at how well Fermi scales in SLI and at the power of the GTX 480, even more now with the new drivers.

OK, to start off, the 5970 is a dual-GPU card and the GTX 480 is a single-GPU card.

Second of all, in the USA the 5970 costs $200 more than the GTX 480, whereas the GTX 480 only costs $100 more than the 5870.

Newegg is one of the best places to buy computer parts here in the USA and has some of the best prices around.

It lists the 5970 at $700 (http://tiny.cc/73be8), the GTX 480 at $500 (http://tiny.cc/38zm5) and the 5870 at $400 (http://tiny.cc/dv5q1).

Point one:

Two GTX 480s in SLI are way faster; they easily beat ONE 5970.

Point two:

The 5970 scales like **** in Crossfire when you add a second 5970; it is almost a waste of money, on average just 10 to 15 percent faster than 2-way SLI GTX 480s. It is also important to note that in some cases 2 GTX 480s match or even outperform quad 5970s, depending on resolution and anti-aliasing, NOT to mention that quad 5970s would cost you $400 MORE than the Nvidia configuration.

Point three:

A 5970 should only be bought if it would be your only graphics card and you DON'T plan to add a second one in the future, since it would be a waste of money considering how badly it scales in quad Crossfire when you add a second 5970.

Point four:

3 way SLI GTX 480 costs $1500

Quad crossfire 5970 costs $1400

So the 3-way SLI Nvidia configuration would blast the **** out of the quad 5970 for just $100 more, considering that even 2-way GTX 480 matches quad 5970 in some games/cases (totals laid out in the quick sketch after the link below).

http://www.maingearforums.com/entry.php?24-So-You-Want-To-Buy-A-GeForce-Part-2
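To lay those configuration totals side by side, here's a throwaway Python sketch using the Newegg prices quoted above (mid-2010 street prices; the configuration labels are just for this example):

# Per-card street prices quoted above (USD, mid-2010)
PRICES = {"GTX 480": 500, "HD 5870": 400, "HD 5970": 700}

# Total cost of each multi-GPU configuration discussed above
configs = {
    "2-way SLI GTX 480": 2 * PRICES["GTX 480"],            # $1000
    "HD 5870 Crossfire": 2 * PRICES["HD 5870"],             # $800
    "3-way SLI GTX 480": 3 * PRICES["GTX 480"],             # $1500
    "Quad Crossfire (2x HD 5970)": 2 * PRICES["HD 5970"],   # $1400
}

for name, cost in configs.items():
    print(f"{name}: ${cost}")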

Note

I wouldn't go as far as 3-way SLI; 2-way SLI is the best choice. Add water blocks to each card ($300 total), overclock the 2-way SLI GTX 480s, and you would get 3-way SLI level performance at cool temperatures that do NOT surpass 60 C, at almost silent noise levels thanks to the water cooling.

IMPORTANT NOTES (that enraged fanbois always forget to mention when they call Fermi a sub-par product)

- The GTX 480 has superb DX11 tessellation support; it does it better than any ATI card, and as games mature and start utilizing tessellation more heavily the GAP is just going to get bigger.

- Fermi scales great in SLI, anywhere from 80 to 90 percent.

- The GTX 480 handles anti-aliasing like no other; it really has superb performance and takes a much lower performance hit when jumping from 4xAA to 8/16xAA (if you enable 8/16xAA the GTX 480 can be 40% faster than the 5870).

- One word: GPGPU, aka CUDA. As of now Fermi has no competition in the GPGPU field; it folds like crazy, and it really is the best card for video editing, Photoshop, etc.

- Better drivers, well.. DUH...

- At 8/16xAA a single GTX 480 is 20% faster than the 5870; add heavy tessellation to the equation and the performance gap gets bigger.

Final note:

It's pathetic when people mention the power consumption of 2x GTX 480 vs. 2x 5870.

One would think that ANYONE planning to spend 800-1000 dollars on graphics cards would have at LEAST a decent 1000-watt PSU. Only enthusiasts would spend that much on graphics cards, and enthusiasts clearly have beefed-up power supplies and don't have a problem with power consumption.

"ZOMGZ my electricity bill"

All I have to say to this is *facepalm*. It would probably cost you an extra $2 a month on your electricity bill compared to the ATI configuration.

grvalderrama said:

"ZOMGZ my electricity bill"

All I have to say to this is *facepalm*. It would probably cost you an extra $2 a month on your electricity bill compared to the ATI configuration.

350 W - 170 W = 180 W; 180 W / 20 W = 9. Nine 20 W CFLs... Imagine 9 more lamps in your house; that would give you a more accurate idea of the energy consumption difference between SLI and Crossfire (when idle).

790 W - 590 W = 200 W; 200 W / 20 W = 10. Ten 20 W CFLs... under load...

Not so environment-friendly...
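To put the same comparison in code form, here's a quick Python sketch; the 20 W-per-CFL figure and the idle/load system wattages are simply the ones quoted above, not measurements of my own:

CFL_WATTS = 20  # assumed rating of one compact fluorescent lamp

def lamp_equivalent(sli_watts, crossfire_watts):
    # How many 20 W CFLs the extra draw of the SLI system is worth
    return (sli_watts - crossfire_watts) / CFL_WATTS

print(lamp_equivalent(350, 170))  # idle:  9.0 lamps
print(lamp_equivalent(790, 590))  # load: 10.0 lamps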

ruzveh said:

For me the ATI 5 series card is KING. I am in favour of a more efficient card: high performance and features with less power consumption and heat. Less heat will also lead to a longer life for the purchase that I make. I don't intend to use my graphics card 24x7 for gaming, but my graphics card will work 24x7 because I keep my PC running all the time. And if the idle power consumption is through the roof, then god save me from the electricity bills and the card's lifespan.

Yea, so with ATI I will game when I want and leave the PC idle without worrying much about the power and heat factor. Win-win for both.

dividebyzero, trainee n00b, said:

"ZOMGZ my electricity bill"

All I have to say to this is *facepalm*. It would probably cost you an extra $2 a month on your electricity bill compared to the ATI configuration.

350 W - 170 W = 180 W; 180 W / 20 W = 9. Nine 20 W CFLs... Imagine 9 more lamps in your house; that would give you a more accurate idea of the energy consumption difference between SLI and Crossfire (when idle).

790 W - 590 W = 200 W; 200 W / 20 W = 10. Ten 20 W CFLs... under load...

Not so environment-friendly...

Observations...

1. What happens if you turn off 10 lights in your house when gaming?

2. Don't know about Argentina... but in NZ 1 kWh is $0.16-0.21 (I pay $0.17), so for a generous computer/gaming session the differential could be 3 hrs idle @ 180 W (540 Wh) + 5 hrs gaming @ 196 W (980 Wh) = 1.52 kWh x 30 days x NZ$0.17 = NZ$7.75/month (US$5.51/month, or 21.67 pesos)... assuming I run the system 8 hrs a day, every day (see the sketch below).

3. When did ATI enthusiasts suddenly become eco-crusaders? It was certainly after the R600 went EOL... must be because global warming hadn't been invented in 2007!

4. The max power draw for both the 5870 and the 480 is usage under FurMark... you spend a lot of time playing FurMark, gr? So, when all's said and done, you're basically saying 800 watts of system usage = bad, while 600 watts of system usage = acceptable?

Here are some tables showing the relative power draw for both cards at idle, normal 3D usage, maximum, and Blu-ray playback. You can breathe easy... the world's demise has been put back two weeks.
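If you want to rerun that back-of-the-envelope maths with your own numbers, here's a minimal Python sketch of the same calculation; the wattage deltas, hours and the NZ$0.17/kWh rate are just the figures quoted above, nothing from the review itself:

IDLE_DELTA_W = 180      # extra watts at idle (SLI minus Crossfire)
GAMING_DELTA_W = 196    # extra watts while gaming
IDLE_HOURS = 3          # hours idling per day
GAMING_HOURS = 5        # hours gaming per day
RATE_PER_KWH = 0.17     # NZ$ per kWh
DAYS = 30

extra_kwh_per_day = (IDLE_DELTA_W * IDLE_HOURS + GAMING_DELTA_W * GAMING_HOURS) / 1000
extra_cost = extra_kwh_per_day * DAYS * RATE_PER_KWH

# 1.52 kWh/day extra -> roughly NZ$7.75 a month, matching the figure above
print(f"{extra_kwh_per_day:.2f} kWh/day extra -> NZ${extra_cost:.2f}/month")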

@ Guest (post #44)

You forgot pics of the nVidia sales brochure. I'm also deducting points for the lack of a PowerPoint presentation.

Guest said:

An unfair resolution for the ATI cards; 1GB of memory will not cut it in most of the benched games at max IQ. You should have benched at 1920x1200 or used the 2GB Radeons.

Guest said:

Yeah, 1GB won't cut it at max rez, as shown by the drop in performance in Crysis with 8xAA versus no AA, where it works fine. Anyway, in most of the games the 5870s were well close enough to the 480s, except for the ones where VRAM is obviously being maxed out, like Metro 2033, which is undoubtedly one of the biggest Nvidia-supported titles out there.

If you game at max rez then yes, 480s all the way, albeit with high temps, noise and massive power consumption; under max rez it's ATI all the way, easily.

RavenXXX2 said:

Great review; I always appreciate TechSpot's unbiased, detailed reviews.

On the review, I feel the rez is a bit too much for the 1GB Radeons, as shown by the dodgy results in some of the tests; a lower rez or 2GB cards would have been a fairer test, or you could have benched at 1920x1200 as well as the tested rez. The 5870s still performed well in the majority of the games, and at anything under the tested rez I would go ATI. The power consumption and heat of Fermi are ridiculous; for Fermi SLI, water cooling should be a requisite.

Guest said:

I know the power requirements would be insane, but overclocking these GTX 480s to 850-932MHz would yield 15-20% more performance! These GTX 480s do overclock well and per clock show more gains in actual performance than an ATI overclock. My overclock on water cooling is an insane 932MHz core / 1864MHz shader on my GTX 480! I wonder how two GTX 480s in SLI overclocked to 932MHz core / 1864MHz shader would perform! I would have liked to see both the ATI and Nvidia cards at 1920x1080 resolution. I think I will start turning off my computer when I'm not home or am in and out all day!

