
Nvidia GeForce GTX 480 SLI vs. ATI Radeon HD 5870 Crossfire

By Julio Franco
Jun 23, 2010
  1. I know the power requirements would be insane, but overclocking these GTX 480s to 850-932 MHz would yield 15-20% more performance! These GTX 480s overclock well and show bigger gains per clock in actual performance than an ATI overclock; my overclock on water cooling is an insane 932 MHz core / 1864 MHz shader on my GTX 480. I wonder how two GTX 480s in SLI at 932 MHz core / 1864 MHz shader would perform! I would also have liked to see both the ATI and Nvidia cards tested at 1920x1080. On another note, I think I will start turning off my computer when I'm not home or when I'm in and out all day!
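    (A rough back-of-the-envelope sketch of that claim, not from the review: the ~700 MHz reference GTX 480 core clock and the 0.6 clock-to-fps scaling factor below are assumptions, since games rarely gain fps 1:1 with core clock.)

```python
# Rough sketch (not from the review): estimating the performance gain the poster
# describes from a GTX 480 core overclock. The stock clock (~700 MHz core) and
# the scaling factor are assumptions; real games rarely scale 1:1 with clock.

STOCK_CORE_MHZ = 700          # reference GTX 480 core clock (assumed)
SCALING_FACTOR = 0.6          # assumed fraction of the clock gain that shows up as fps

def estimated_fps_gain(oc_core_mhz: float,
                       stock_core_mhz: float = STOCK_CORE_MHZ,
                       scaling: float = SCALING_FACTOR) -> float:
    """Return the estimated percentage fps gain for a given core overclock."""
    clock_gain = (oc_core_mhz - stock_core_mhz) / stock_core_mhz
    return clock_gain * scaling * 100

for oc in (850, 932):
    print(f"{oc} MHz core: ~{estimated_fps_gain(oc):.0f}% estimated fps gain")
# 850 MHz -> ~13%, 932 MHz -> ~20%, roughly matching the 15-20% quoted above
```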
     
  2. ET3D

    ET3D TechSpot Paladin Posts: 981   +31

    I agree that it would have been nice to see the highest-end setups. However, since it's been shown elsewhere that 2GB vs. 1GB can make a real difference in certain games, that's more important than the effect of clock speed, which can be interpolated from existing results, or cooling, which is just a factoid. If the games where the Radeon fails miserably (as a single card) work much better with 2GB, that would make for a better point of comparison.
     
  3. dividebyzero

    dividebyzero trainee n00b Posts: 4,908   +716

    Running overclocked SLI'ed GTX 480s stacked on top of one another (no one-slot gap) on air is probably asking for trouble, or at the very least throttling.
    There isn't much point running OC'ed cards in any case, since there's no guarantee that any reader/user could reach the same clocks and duplicate the gains; it's unlikely that any two cards will overclock to the same extent.

    @ET3D
    Whether or not the price premium is worth any extra performance is probably down to the individual and what price they put on any increase.
    I linked to Anand's 1GB vs. 2GB review in an earlier post, which shows negligible (in my opinion) gains. Obviously for some people a couple of extra fps here or there justifies the $100 (25%) price premium over the 1GB card. Strangely enough, I'd wager that a good percentage of those same people would denounce the GTX 480 at the same price point even though it offers a bigger performance boost. No accounting for the foibles of consumerism or brand allegiance.
    Hardware Canucks also had similar results: basically an increase of approximately 3 fps over the standard card. Bear in mind that their card is overclocked, which accounts for most of that increase.
    The only game I've seen where 2GB makes any tangible difference is Metro 2033...but again, applying AA brings the card back to DNF status.

    What remains untested is the possibility that CF'ed 2GB HD 5870s offer a substantial gain where the single card does not. I personally can't see a large turnaround in fortune, although from my recollection there have been no 2GB CF vs. GTX 480 SLI reviews. The closest approximation would be Sapphire's HD 5970 4GB Toxic vs. GTX 480 SLI, which isn't quite an apples-to-apples comparison even with the Toxic's OC. The review is a little dated in any case, given that it predates the 25x.xx nVidia driver release (as well as Cat 10.5/10.6).
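    (A quick sketch of the value argument above, using only the numbers cited in this post rather than anything I've measured: the $400 base price is implied by a $100 premium being 25%, and ~3 fps is the reported average gain.)

```python
# Back-of-the-envelope sketch of the 1GB vs. 2GB value argument. The $400 base
# price is implied by the "$100 (25%) premium" figure; the ~3 fps gain is the
# number cited from the reviews mentioned above, not a measurement of mine.

price_1gb = 400.0        # USD, implied by a $100 premium being 25%
premium_2gb = 100.0      # USD extra for the 2GB model
fps_gain = 3.0           # approximate average fps advantage reported

print(f"Premium: {premium_2gb / price_1gb:.0%} more money")
print(f"Cost per extra frame: ${premium_2gb / fps_gain:.0f}/fps")
# -> 25% more money for roughly $33 per additional frame per second
```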
     
  4. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,265   +41

    Not sure, what time of day is this? If I'm between the ages of 6 and 12 the boogeyman might come to get me if it's too late, or maybe the 10 lamps I have off will help give Al Gore more money for his "environmentally friendly" campaign.

    Just saying...I lol'd at your number one.
     
  5. Hey, I'm an Nvidia owner of late... an 8800 GT, then I just switched to a GTX 260 OC the other day (a late upgrader, as there was no reason to upgrade until BFBC2). Anyway... I'm just wondering how a single 5870 was only capable of 0.0 fps in Metro 2033? Seems a bit weird. Did you guys get the drivers installed and everything working? Was there a specific bug encountered? Surely 0.0 fps means the game crashed as soon as you tried to run it, so wouldn't it be better to say that in the review than to post a graph showing fps along the y-axis as a line along the bottom?

    Anyway, I'm going to wait till the GTX 460 or whatever comes out, and may even consider waiting till Nvidia's second-gen DX11 parts arrive before upgrading. There are no major DX11 games worth playing (yes, I know BFBC2 has DX11), so I don't think there'll be a need till next year.
     
  6. Wow, look who's quick to throw out the term "fanboi"... fanboi.

    nVidia loses this time again. Bankruptcy is just around the corner. It's just that Nvidia fanboys can be quite dull sometimes... indeed, it's hard to swallow the reality pill.
     
  7. Per Hansson

    Per Hansson TS Server Guru Posts: 1,932   +126 Staff Member

    While a 700 W power supply was not actually used for the review, contrary to what the article originally said, I'd just like to point out that Steven measured the power consumption of the PSU on the AC side (i.e. with a Kill A Watt or similar device that you plug the power cord into).

    So to get the power consumption of the actual computer parts you need to account for the PSU's (in)efficiency; most modern power supplies are around 80% efficient...
    The power draw of the system is thus:
    787 W x 0.8 = ~630 W

    So a good-quality 700 W PSU would have been enough, though it isn't recommended since it would be quite heavily loaded...
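    (A minimal sketch of that arithmetic, assuming the same round 80% efficiency figure; the 787 W reading is the wall-side number quoted above.)

```python
# Minimal sketch of the calculation above: converting the wall (AC) reading to
# the DC load the PSU actually delivers, and checking headroom against a 700 W
# unit. The 80% efficiency figure is the same round number used in the post.

def dc_load(ac_watts: float, efficiency: float = 0.80) -> float:
    """DC power delivered to the components, given the draw measured at the wall."""
    return ac_watts * efficiency

measured_ac = 787.0                      # W, measured with a Kill A Watt-style meter
load = dc_load(measured_ac)              # ~630 W of actual component draw
headroom = 700.0 - load                  # what a 700 W PSU would have left

print(f"Component load: {load:.0f} W, headroom on a 700 W PSU: {headroom:.0f} W")
print(f"That PSU would be running at {load / 700:.0%} of its rating")
# ~630 W load, ~70 W headroom, i.e. about 90% of a 700 W unit's rating
```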
     
  8. I run two GTX 470s in SLI mode. Even at an eight-foot distance, I considered the fan noise to be obtrusive. In regular desktop usage, they would be outright loud. (I am talking about noise levels under load in more demanding games.)

    So for anyone who's not hard of hearing, you need to spring another $350 for water cooling, which is what I did. (No, a pair of headphones won't spare you the noise.)

    That said, SLI does scale better than CF, though the numbers I've seen before weren't this drastic.

    Anyway, all the fanboy vs. fanboy nerd wars are irrelevant. Judging from Fermi availability (plentiful), ATI's prices (still above MSRP), and the most recent market share numbers, Nvidia has lost this round decisively. The chip just doesn't scale well and runs too hot, requiring a loud stock cooling solution.
     
  9. The most ridiculous part of the comparison is that the bloody ATI cards were released over 9 months ago. Wait 9 months and then benchmark the new ATI cards vs. the old hot-box 480. Nvidia fans will be QQ'ing hard over their overpriced bar heaters.
     
  10. Haha, yeah, the above poster is correct. Nvidia fans are just salivating over a technology that can finally outperform a 9-month-old card.
     
  11. hellokitty[hk]

    hellokitty[hk] Hello, nice to meet you! Posts: 4,370   +125

    How come there are always so many guests?
    Do people sign out to post anonymous/debatable comments?
     
     
  12. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,265   +41

    Because they're afraid someone will quote them and blow them out of the water, so to speak.
     
  13. Who cares if it's 9-month-old tech? It's their tech that is out today. Do they have anything newer to bench against? That's what I thought.
     
  14. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90



    ...Tonight we have a special guest who will perform a one-man argument.
     
  15. ATI drivers have always been bad. Even if Nvidia had slower cards I would go Nvidia, because I know it will work in both new and old games. ATI drivers have problems in new games, and sometimes it takes them over 9 months to fix them; by that time people have moved on to other games, again with new ATI driver problems.
     
  16. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90


    "Danger Will Robinson!...Danger"
     
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,908   +716

    You should know the perils of drivers by now red...
    True story! I just updated my nVidia driver to 196.75 AND updated my Creative Sound Blaster driver... next thing you know, I'm a widower, the dog has lost his eyesight, a passerby suffered shrapnel wounds, I'm the subject of an upcoming heart-rending Oprah special... and my house... well, just look at it!
     
  18. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90

    ^^^^^^ ROFL

    Well, at least it was an Nvidia driver you downloaded there, Chef, so the insurance adjuster will be there tomorrow. Had it been an ATI driver, they wouldn't show up for nine months.
     
  19. Athena

    Athena TS Rookie Posts: 69

    I love ATI's hardware, but I can never buy their cards because of the games I play. My MMO of choice is City of Heroes, which is over 5 years old now, and ATI is still having driver problems with it. I just like my stuff to work, so until ATI steps up its driver support I will always be forced to buy Nvidia, and that sucks, as they charge for the extra stability their drivers give a gamer.
     
  20. grvalderrama

    grvalderrama TS Enthusiast Posts: 196

    I never said ATI enthusiasts were eco-crusaders, but some people here say that the difference between one card's power consumption and the other's isn't relevant, so I made a power usage comparison just to give a simpler way to see the effect. I wasn't talking about how expensive your bill would be. Bad? Acceptable? Did I mean that?
     
  21. dividebyzero

    dividebyzero trainee n00b Posts: 4,908   +716

    Who knows, gr, although I'd equate it with "not so environment-friendly", as in...
    ...and since you are pulling the "environment-friendly" card, I'll tar you with the same brush that seems to pervade the AMD-nVidia fanboy debate; namely that one side will use whatever argument best fits their narrow-minded, bandwagon-jumping pseudo-ideology... so for nV*****s it's "red team... drivers suck/no CUDA or PhysX/crappy minimum framerates/bad Linux support" yada yada... and for the Association of Mental Defectives it's "green team... late/bad architecture/too hot to handle/power hog/expensive" etc., etc.
    So in this case we have the trite analogy of a power saving equivalent to ten lightbulbs while gaming, in relation to a graphics system that costs upwards of US$800... you may as well be quoting CO2 emissions to John Force and Tony Pedregon.
    That is to say, power savings (or usage) come way, way down the list of priorities for people running multi-GPU enthusiast graphics. Likewise, how many people with enthusiast motherboards and graphics cards laden with extra copper and gold give a damn about the environmental pollution from mining those metals?
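    (To put a rough number on that: a hedged sketch of the yearly electricity cost of the power-draw gap, where the 250 W delta, 20 hours/week of gaming, and $0.12/kWh rate are all assumptions, not figures from the review.)

```python
# Hedged sketch of the point above: even a sizable power-draw difference between
# these setups amounts to little money next to an ~$800 graphics purchase. The
# 250 W delta, 20 h/week of gaming, and $0.12/kWh rate are all assumptions.

delta_watts = 250.0            # assumed gaming power-draw difference between setups
hours_per_week = 20.0          # assumed gaming time
price_per_kwh = 0.12           # assumed electricity rate, USD

kwh_per_year = delta_watts / 1000 * hours_per_week * 52
cost_per_year = kwh_per_year * price_per_kwh

print(f"Extra energy: {kwh_per_year:.0f} kWh/year -> about ${cost_per_year:.0f}/year")
# ~260 kWh/year, roughly $31/year -- small change against the cost of the cards
```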
     
  22. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90

    Think before you troll...

    This of course raises the following questions:
    Hmmm, if you never buy their cards... how do you know? And if they don't work, why would you "love them"? Don't get me wrong... it's an interesting perspective, cuz most people have no desire to purchase items that don't work properly.

    Boilerplate alert! It may interest you to know that millions are gaming on ATI cards and not having the problems you are on your GeForce 6200.

    Gosh... so do I. Darn it!
    Oh dear, don't ya hate that?
    This one was C&P'd right off of the Nvidia website, was it?
    And of course, silly! We all love the creamy goodness of that "extra stability" that they give unto the gamer... yikes :rolleyes:

    You realize of course there is an 'off topic' forum, right? You could maybe talk about how much you really want a Ford, but gosh darn it! The wheels fall off.
     
  23. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,265   +41


    This is the best post concerning fanboydom I've seen, ever.
     
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,908   +716

    Wow! 28 fps at max detail @ 8xAA and DX10... that's AWESUM... how about including a (missing) link? ...and since that's NOT going to happen, I'm calling BS.

    FYI, Crossfired 5870s would have trouble posting 28 fps at 4xAA, let alone 8xAA.
     
  25. Julio Franco

    Julio Franco TechSpot Editor Topic Starter Posts: 6,570   +338

    I don't know what the fuss is about the 0 fps results; that just means the game didn't run properly with those settings, likely due to a driver bug (as described in the article's text). We subsequently tested with settings that worked fine.

    In the case of Crysis Warhead that meant disabling anti-aliasing completely at 2560x1600. See the last test in that section.
     

