Nvidia GeForce GTX 480 SLI vs. ATI Radeon HD 5870 Crossfire

By Julio Franco
Jun 23, 2010
  1. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74

    Hey Chef,
    Expound on this, if you would. Much has been made of CF not scaling nearly as well as SLI. As anecdotal as it may be, I am not finding that at all with my setup. I have noticed, however, that many reviews start the OC section with "OC'ing was easy... we simply pushed the voltage towards 1.55 and raised the multiplier..." etc., etc.
    I have been playing around with my new system for a while now and have found increased throughput by OC'ing not just the CPU but the whole system: NB/HT/memory/CPU, etc. It makes me wonder, as I am getting some phenomenal scaling. My preference for the MB overrode the GPUs, as I expected (perhaps foolishly) that ATI would get caught up on the scaling/driver issues, but I have been extremely pleased with the scaling thus far: very close to 100% with card #2, and 30% with card #3. I can put a more scientific numbers/approach to it if anyone is interested.
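    To put some numbers to that: "scaling" here just means the extra fps each additional card contributes, relative to single-card performance. A minimal sketch of the arithmetic (the fps figures below are hypothetical placeholders, not actual benchmark results):

```python
def scaling_per_card(fps_by_card_count):
    """Given average fps measured with 1, 2, 3... cards installed,
    return the percentage gain each additional card adds,
    relative to single-card performance."""
    base = fps_by_card_count[0]
    return [(curr - prev) / base * 100
            for prev, curr in zip(fps_by_card_count, fps_by_card_count[1:])]

# Hypothetical run: 60 fps with one card, 118 with two, 136 with three
gains = scaling_per_card([60, 118, 136])
print(gains)  # card #2 adds ~97%, card #3 adds 30%
```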
  2. hellokitty[hk]

    hellokitty[hk] I'm a TechSpot Evangelist Posts: 4,316   +116

    It doesn't make sense to have a physics card over an additional GTX 460.
    Plus, you'll need some extra room for that physics card if you already have two GTX 460s, and the returns aren't very high.
  3. dividebyzero

    dividebyzero trainee n00b Posts: 4,781   +638

    As a percentage, Crossfire scales just fine, and in most cases, if you're comparing to the GTX 480 for instance, CFX will scale better than SLI as a percentage, mainly because the GTX is starting from relatively higher fps as a single card. The opposite is also true for the reverse (a single HD 5850/5870 posting higher single-card numbers, ~10% and 22-27% respectively, than a GTX 460 for example). You can add into the mix benchmarking anomalies such as capped framerates and/or system component limitations (bottlenecks :wave: captain), in-game IQ settings, whether any settings are forced at driver level, and whether the game favours SLI, Crossfire or any GPU scaling.
    The only metric probably worth taking into account is fps, either as an average, if the benchmarking sample is large enough (and ideally it should cover the resolutions commensurate with the cards' likely pairing, and a range of widely played games including RTS, FPS, RPG, sims, etc.), or as an aggregate fps total.

    I don't doubt you're getting close to 100% scaling in some games with CFX, especially if one of the games you're benching is Metro 2033 (IQ and screen res dependent, of course).
    If the PhysX component is intensive (the aforementioned Metro 2033, or Dark Void, Mirror's Edge, Batman: AA, etc.) then offloading PhysX to a dedicated PPU will raise your overall framerates (esp. the minimums). Any reasonable 256MB frame buffer / 1GB memory Nvidia card of the previous generation (i.e. 8800GT/9800GT/GTX/GTS 250) would suffice as a good PPU.
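    To make the average-versus-aggregate fps distinction above concrete, a minimal sketch (the game names and fps values are invented for illustration only):

```python
# Hypothetical benchmark sample: game -> average fps for one setup
results = {
    "RTS title": 72.0,
    "FPS title": 95.5,
    "RPG title": 61.2,
    "Sim title": 48.3,
}

average_fps = sum(results.values()) / len(results)  # mean across the sample
aggregate_fps = sum(results.values())               # simple fps total

print(f"average: {average_fps:.2f}, aggregate: {aggregate_fps:.1f}")
```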
  4. Leeky

    Leeky TechSpot Evangelist Posts: 4,378   +98

    Thanks once again DBZ. :)
  5. One point I think many readers may not understand is that Crossfire mostly does not work at all. The benchmarks are extremely misleading in this regard. They make both Crossfire and SLI out to be completely transparent performance enhancements. They are not -- or at least Crossfire is not.

    I recently purchased another 5830 for use in a Crossfire configuration. It was not until after installing it, and much digging for data, that I discovered Crossfire does not work in windowed mode -- it is full-screen only (a mode I never use). Not only that, in full-screen mode it has serious problems: it's SLOWER than non-Crossfire windowed mode in the games I play, it produces very annoying graphical artifacts on screen, it doesn't support all the resolutions and modes that non-Crossfire does, and it may require 3rd-party tools to enable Crossfire mode.

    All this combines into a "do not use multiple graphics cards, ever" for me. It's not a backhanded endorsement of SLI; on the contrary, if this is what AMD thinks provides decent competition with SLI, then SLI is likely equally flawed. No doubt there must be some use for NxGFX cards, but from my experience with Crossfire, it must be an extremely narrow use restricted to a very few mass-market games that specifically support multi-GPU systems.

    -=-=-

    IMHO, I'd love it if reviewers would simply ignore Crossfire/SLI, or at least put VERY large caveats on comparisons with single-card solutions. They're not comparable in basic functionality. It's very misleading to compare something that works with something that doesn't.
  6. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74


    Possibly the most solipsistic, anecdotal post to date. There is obviously not a great demand for using CF in windowed mode. I'm sure every single review, benchmark and article on the planet is wrong, or they are just getting extraordinarily lucky that it works for them.

    Why don't you provide a list of these games that do not support CF/SLI?
    If you can find any, they are probably pre-2007, and if they don't support it... they don't need it in the first place.
  7. LNCPapa

    LNCPapa TS Special Forces Posts: 4,271   +257

    I'm with Red on this one - almost all the games I've played support CrossfireX just fine - and if you need a little reassurance then enable the logo to appear when an application uses CX. If you're playing windowed then you probably didn't need the performance boost of CX or SLI anyway - just get a decent single GPU card.

    The only disappointment (due to lack of proper CX support) that I've had recently was FFXIV - it apparently does just fine with SLI but crap with CX. The CX logo will appear just fine and the second GPU will run at a fairly high level of usage, but your performance will actually go down with CX enabled. I wonder if this is the title you're talking about...

    Edit: I also thought it was fairly well known that SLI and Crossfire don't function properly on windowed apps.
  8. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,262   +41

    They didn't make the game for people who can spend 600 dollars every 6 months to upgrade hardware. They made it for the people who can afford 15 USD a month to play, Papa.
  9. Thank you, guest. Why did you take so long to come on here and tell me that? I just bought myself a second 4770 for a performance boost, and now I find out I wasted money on the card.
    I thought the extra fps I'm getting was from the second card.
    Why did so many people waste money on a 4890 if a single 4770 could get higher fps?
  10. Masta680

    Masta680 Newcomer, in training

    What I need to know is the benchmarks for the ATI Radeon HD 5870 2GB versus the GeForce GTX 480.
    I can't find that anywhere; can anyone help me?
  11. dividebyzero

    dividebyzero trainee n00b Posts: 4,781   +638

    Tom's Hardware GPU gaming charts.
    For most gaming benchmarks the 2GB and 1GB cards offer pretty much identical performance. The 2GB card, in general terms, only makes sense if you plan on a multi-monitor setup... which of course doesn't then make it a direct comparison with the GTX 480.
     

