TechSpot

Nvidia GeForce GTX 480 Review: Fermi Arrives

By Jos
Mar 26, 2010
  1. Wait, that is all correct except one thing: it is half a year late, remember? But it doesn't matter much, since the DX11 games haven't come out yet; I don't see people complaining that the lateness hurt them, because the games they were supposedly waiting for never arrived. I do agree it's fair to ask why it isn't faster than the HD 5870, given how long they took, but then again don't expect what you expect - that is not a healthy thing, at least where competition is concerned. I think I know why it isn't as fast as it was supposed to be: one reason is that the core count is 480, and the other is that they put in ray tracing and other features that people don't need right now, at least for games. Now, if at least 15 games supporting the card had come out, I'd definitely agree that people waited for nothing, but of course there are different people, and I think they can afford that card anyway. I just wish Nvidia had started on it from day one, since it doesn't look like they did.
     
  2. Deso

    Deso TS Rookie Posts: 130

    After seeing these results I'm VERY glad I went with the 5850 =P
    That heat and power draw is just horrific.

    This looks like they just took a 295 and put its two GPUs into one GPU instead of an SLI configuration on the PCB, pushing them next to each other on a small die, which would explain the core count, the very bad thermal performance, and the unremarkable overall performance.
     
  3. I have an ATI 4850 and I am looking for an upgrade. Since I have a 550-watt PSU, I guess I will be getting an ATI 5870, since it will not drain all my PSU power. Also, I don't want to waste more money on a new PSU plus the price difference between the cards just for a 10-20% performance improvement. Finally, I am a bit worried about the 480's operating temperatures. It seems the coolers are at full speed and despite that they have quite a lot of trouble cooling it. With the 5870 I guess I could speed the fans up a bit to lower the temps further and it still wouldn't be so loud. Nvidia has a really big problem with those temps. I remember my 4850 was quite hot but the fans were almost inaudible because they were set to a really slow speed, so editing the BIOS to increase fan speeds was an easy solution that, in the case of the GTX 480, is not possible.
     
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,947   +728

    Fan speed changes are, in the main, done at the driver level.

    Then again you could always use software, such as the RivaTuner-based EVGA Precision or MSI's Afterburner... or ATI Tray Tools for that matter, as most enthusiasts do.

    Just to add a moment of sanity at the end of your strange post: PCGH overclocked their review sample GTX 480 by 18% (core/shaders) and 24% (VRAM) - 825MHz core, 1650MHz shaders and 1150MHz VRAM (4600MHz effective) - with a temp of 75C, albeit at 80% fan speed, using the aforementioned Precision.
    I'd anticipate nV's next driver will have a revised fan speed curve.
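    For anyone wanting to check the math, the quoted gains line up with the GTX 480's stock clocks - a quick sketch, assuming Nvidia's launch specs of 700MHz core, 1401MHz shader and 924MHz memory (those stock figures are not in the post itself):

    ```python
    # Back-of-the-envelope check of PCGH's quoted overclock percentages.
    # Stock clocks are assumed from Nvidia's GTX 480 launch specs.
    stock = {"core": 700, "shader": 1401, "vram": 924}   # MHz
    oc    = {"core": 825, "shader": 1650, "vram": 1150}  # MHz, per PCGH

    for domain in stock:
        gain = (oc[domain] / stock[domain] - 1) * 100
        print(f"{domain}: +{gain:.0f}%")  # core/shader come out near 18%, vram near 24%
    ```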

    Nice OC... all they need now is a (moderately) cheap self-contained watercooled setup a la Leadtek's 8800 Ultra Leviathan.
     
  5. The price to performance ratio is ridiculous. The heat it produces is unacceptable. Reminds me of a boxer that talked so much trash and got his butt kicked. The most significant thing about this card is that it may help bring down ATI prices.
     
  6. princeton

    princeton TS Addict Posts: 1,716

    And another thing: AMD fanboys are saying "Who cares if it beat the HD 5870, it's waaaay late," while the exact same fanboys, at the launch of the HD 5970 almost a year after the GTX 295, screamed "HAHA SEE AMD ROCKS WE BEAT NVIDIA SOO BAD!"
     
  7. compdata

    compdata TechSpot Paladin Posts: 604

    The power/temp requirements make me think that they knew they were not going to perform as well as the ATI 5870, so they overclocked and then got stuck trying to deal with the power/heat issues. I know there were yield and other issues as well, but it seems odd that they would have let this pass given all the recent attention to higher efficiency and lower power requirements.
     
  8. SiliconDoc

    SiliconDoc TS Rookie

    I've never seen so much whining in all my life. I guess people think it proves how smart they are if they respew the standard pop-culture complaint.
    First of all, the PCI-E spec allows up to 300 watts per card (slot plus auxiliary connectors), and probably all of you have more than one such slot on your motherboards, so none of you should worry about a full 300 watts in one slot or 600 in two - and this card draws less than that, unfortunately.
    Furthermore, if you don't have a 600-watt power supply, you aren't going to be running any card like this - nor should you even be thinking about it. Where have you been for 3 years? 750 watts has been the standard gamer PSU size for a long time, and even that's getting small now. Whining about needing a 600-watt PSU?! LOL
    Next, some person cried about "melted cables" - gee, the cards have heatsinks and plastic shrouds; it's not like you can stick your tongue on the 95C die substrate (ALTHOUGH it looks like some whiners might, just so they could wail that an Nvidia card burned them).
    Enough of the crybabying: the 4870 X2 uses as much power and more, and you same people never threw a big fat fit over that. In fact you couldn't stop screaming about how wonderful it was - and still scream about how good it is.
    Enough of the constant bellyaching.
    You can't afford the 480 or the 470; your crybaby eyes can't pay the extra buck or two a month it costs to burn a 100-watt lightbulb 24/7.
    Yeah, so forget the $24 a year in extra electricity, too - because that's your two $10 Walmart games and tax, or perhaps the one pizza you order once a year from mommy's basement.
    Good gawd!
     
  9. dividebyzero

    dividebyzero trainee n00b Posts: 4,947   +728

    Quite the opposite, in fact. The original specification was (I believe) for a SKU with clocks of 725MHz core / 1450MHz shader / 1050MHz memory. The fact that, with review samples at least, the cards can fairly easily reach those clocks through overclocking indicates that the cards have been downclocked to keep them within TDP.
    The alternative being what? Having no enthusiast-class cards in retail for another 6 to 12 months?
    At least with these releases enthusiast-grade card consumers now have a degree of competition in the marketplace, and some of us can see where gaming might be headed in future.
    Before you ask "what competition?", I'll add a few observations.
    Power hungry... Yes, but that hasn't stopped SLI and Crossfire ownership (nor single-card dual-GPU versions: HD 5970 - 294W TDP, GTX 295 - 289W TDP, HD 4870 X2 - 290W TDP, HD 4850 X2 - 250W TDP) or overclocked CPUs/GPUs.
    Noisy... Yes; kill two birds with one stone and buy the waterblocked version, or buy some noise-cancelling headphones/headset.
    Expensive... Yes, but that's the nature of enthusiast/performance components.
    Cost effective... Hell no, but tell me of an enthusiast-grade product that is?
    I could buy the cost-effectiveness argument if PC gaming were in any way cost effective outside of a budget system at medium-to-low resolution - but it is not.
    So would I buy one (or two)? No, not until DX11/tessellation makes a quantitative difference, and by that time both AMD and nV will have more refined products available.
    My personal advice to customers requiring good GPU performance is to Crossfire HD 5770s. If they can afford to game on a 2560x1600 IPS monitor then, in the main, cost becomes somewhat immaterial. The prime consideration becomes "is the new game playable on day zero, and do I have to lower any settings?"
    One thing I've found is that people who hand over large wads of cash for both components and games get really irked over having to turn down the settings.
    Would I consider nV's future offerings based on these cards? Definitely.
    AMD have had GDDR5 for 2+ years. By now the memory controller should be refined, yet the number of their cards with a memory bus wider than 256-bit... zero.
    At their second attempt (after the GT 240), nVidia have a working 384-bit GDDR5 memory controller - the difference is vivid in games such as Metro 2033 and/or with heavy antialiasing - and they also have 8x multisample AA working out of the box... not too bad a start for what is essentially a proof-of-concept part.
    Ultimately, the argument is less about the hardware than about how it will be implemented.
    All things being equal, AMD and nV will both be sponsoring game development optimized for their own products. The difference is that nVidia's TWIMTBP is both well established and linked to many AAA titles, so it's a fair assumption that nVidia will be putting development funding into games that feature options for heavy tessellation, PhysX and DirectCompute - areas where AMD's 5 series don't have an answer. AMD's answer was the game-dev funding recently announced at GDC 2010 - the "Gaming Evolved" brand. If the initiative is followed through with sustained funding then the balance tilts clearly towards AMD. If, however, it turns into another smoke-and-mirrors campaign, like Richard Huddy's earlier, now-defunct "Get in the Game" (GITG) program,

    then the gaming future might be a little less clear cut than some are expecting.
     
  10. SiliconDoc

    SiliconDoc TS Rookie

    Why would anyone do this:
    " My personal advice to customers requiring good GPU performance is to Crossfire HD 5770's. If they can afford to game on a 2560x1600 IPS monitor then in the main, cost becomes somewhat immaterial "
    ---
    Then you have endless problems screwing around with Crossfire, and some games don't scale at all, not to mention the extra motherboard cost, fighting the heat of two cards next to each other, and twice the chance one dies or is DOA. And since they are $170 each, you're at $340 plus Crossfire problems - might as well wait till the 12th or later and get a GTX 470, or pre-order one for $349.
    --
    Who wants to get a video card, especially a high-end one, then have the gray screen of death, green dots, lack of 2D acceleration, and hacking issues with "Overdrive" just to try to get it stable, then hear the LIES over and over again from ATI, like "the problem is a Windows 7 driver update" when the same crap was all over XP and Vista installs? Then when you fire up FurMark, "it's a power virus" according to ATI, and WHOOP! there goes yer overclock if you even have one, and it downclocks to desktop 2D speed just so the cheap underpowered VRMs don't blow right off the card PCB.
    Forget it.
    Oh, and the RANDOM CRASHES on ATI cards - have all the eggheads figured them out yet? LOL. "Oh, it's some bad memory" (another rumor).
    Then you're right in the middle of a hot firefight and yer new costly 5000 series - overpriced because ATI is losing a billion a year anyway - downclocks and ganks into single-digit framerates, and STUTTERS.
    Forget it ati.
    HIRE SOME MORE DRIVER TEAM MEMBERS! QUADRUPLE YOUR DRIVERS TEAMS ! YOU NEED 4X AS MANY PEOPLE REMOVING THE BUGS FROM YOUR CRAP VIDEOCARDS !
    ---
    Yeah, so forget it. $50, $100, $200 - it's just NOT WORTH IT to buy ATI. I'm sick of my friends screaming over chat after they buy a new ATI card and it's all screwed up!
    "I can't use PhysX, I bought ATI"
    " I'm all for open source, but I bought ATI and it doesn't have openCL working yet like Nvidia does, and doesn't have good linux drivers like Nvidia does"...
    " How come my BRAND NEW 5000 SERIES 5870 only has "direct compute5.0 checked in GPU-Z!??!??
    The Nvidia card has Cuda, OpenCL, PhysX, AND DIRECT COMPUTE 5.0 CHECKED IN GPU-Z! "
    ----
    That's why ATI fans scream bloody murder all the time: to make sure you don't notice how pathetic EVERYTHING is with ATI cards, "except xxx framerate at xxxx resolution in xxxxxxx game with these certain AA and AF settings and gamer/enthusiast settings"... (YES, IF YOU GET IT JUST RIGHT THE ATI CARD MIGHT ACTUALLY TAKE A WIN IN FRAMERATE).
    ---
    It's amazing to me how pathetic performance and a billion-plus lost every year, for years in a row, "is a good deal for end users".
    --
    No, there's NO CHANCE I will buy a 5000 series ATI card! Until...
    1. Get your OpenCL WORKING in the main driver!!!! No "SDK" developer downloads!
    2. DON'T BLOCK THE PHYSX WRAPPER NGOHQ MADE ANYMORE!!! STOP BLOCKING IT!!!
    3. NO GRAY SCREENS OF DEATH! NO BRAGGING ABOUT "GREAT 27-WATT 2D!" FOLLOWED BY THE DANG GREEN DOTS AND GRAY SCREEN OF DEATH, THEN QUIETLY RAISING THE 2D CLOCKS IN A DRIVER RELEASE AND NOT REDOING YOUR IDLE-WATTS NUMBERS! THAT'S CALLED LYING!!!
    4. No more "random crashes" no one can figure out.
    5. No more "urgent need" for your ATI fanboys to rip on Nvidia for stupid crap because ATI SUCKS SO MUCH they have to, just to feel good!
    6. There's more, but #5 sounds like a good time for me to buy an ATI card again - when all the raging ATI fanboys calm down and don't have to make up so much BS it's sickening to read. If ATI actually had a good competing card, the constant whining and blubbering trying to convince everyone the billion-losing ATI cards are "equal and better" would not be necessary. They ARE NOT EQUAL, PERIOD.
    No CUDA, no PhysX (since ATI blocked the WRAPPER - the guy who made the wrapper said so!), no good DirectCompute 5.0, and NOW comparatively poor DX11 performance!
    ----
    I'll buy ATI again when all the BS raging lies from the reds stop - then I'll know they're worth it.
     
  11. Steve

    Steve TechSpot Staff Posts: 1,442   +503 Staff Member

    Okay, I am going to do it; why, I am not sure, but I am. SiliconDoc, your posts are extremely biased and quite inaccurate/misleading, whatever you want to call it. It’s clear that for whatever reason you hate ATI and would do anything for Nvidia, including taking time out of your day to make silly posts on forums to try and deter potential ATI buyers. I think we need to sit down and work out what it is that ATI did to hurt you, but before we do that, here is my 2c worth on your comments.

    First of all, no one is “whining” or “crying”; they are simply stating facts and their own personal opinions, which they are entitled to, as are you. Unless of course that opinion is to attack other readers without first being provoked, which you weren’t.

    The PCI Express bus specification is irrelevant, and I am sure most readers do not care if it is 300 watts or 3,000 watts. The fact is that a single-GPU graphics card that uses as much power as the GeForce GTX 480 is ridiculous.

    There is certainly nothing wrong with a 600-watt power supply, and gamers will have no problem running a Core i5 750 platform and a Radeon HD 5850 on one. Both components are extremely power efficient, as they should be.

    “Next, some person cried about "melted cables" - gee, the cards have heatsinks and plastic shrouds; it's not like you can stick your tongue on the 95C die substrate (ALTHOUGH it looks like some whiners might, just so they could wail that an Nvidia card burned them).”

    I ask you: have you tried to remove a GeForce GTX 480 from a system just after it has been doing some 3D work? Have you touched the metal plate on the graphics card while it is in operation? If not, then you have no right to make the above comments; this product can very easily burn you if you are not careful. We are not talking serious burns, but it is bloody hard to handle after being used.

    “Enough of the crybabying: the 4870 X2 uses as much power and more, and you same people never threw a big fat fit over that. In fact you couldn't stop screaming about how wonderful it was - and still scream about how good it is.”

    The Radeon HD 4870 X2 is an obsolete product that is well over a year old, so no one would expect it to provide the same efficiency as today’s graphics cards. Why on god’s green earth are you comparing old tech? Why not the Radeon HD 5970, which we admit is a power pig anyway? Not to mention the Radeon HD 4870 X2 is a dual-GPU graphics card and should only be compared to a GeForce GTX 480 SLI configuration, if you were to make such an obscure comparison.

    “You can't afford the 480 or the 470; your crybaby eyes can't pay the extra buck or two a month it costs to burn a 100-watt lightbulb 24/7.”

    Well enough said really, that immature comment needs no rebuttal.

    Moving on to your comments about the Radeon HD 5770 Crossfire configuration... Now, I do totally agree that Crossfire, and all multi-GPU configurations for that matter, are far from perfect. However, if you look at how well the Radeon HD 5770 Crossfire cards scaled in the review, it is hard to argue with those results.

    However, you are completely wrong about your power and thermal comments. The Radeon HD 5770 Crossfire setup is not only far more power friendly than the GeForce GTX 470, it is also much cooler and easier to manage. Not to mention a pair of Radeon HD 5770 cards in Crossfire will crush the GeForce GTX 470.

    That’s about all I have to say about your comments, really. As for all the gibberish about the crashing and drivers, well, I am leaving that alone. Over the past year we have found the ATI Catalyst drivers to be very good and certainly comparable to Nvidia’s. Both companies have their fair share of problems and nothing will change here. As for cards dying, both companies will suffer from defects and nothing will change there either.

    Since the release of the Radeon HD 5000 series I have owned over a dozen cards, while several friends own Radeon HD 5870 cards, and we have never had any problems with reliability. Again, not all cards are going to be perfect, as there is always a certain percentage of defective products that ship, and this is the same whether it is a Radeon or a GeForce graphics card.
     
     
  12. Kibaruk

    Kibaruk TechSpot Paladin Posts: 1,423   +116

    Hahahaha!!! I LAUGH AT YOU FERMI LOVERS!

    All that time reading "Wait till Fermi comes" and "Oh yes AMD, enjoy it while you can!"... it took a long time to arrive and... wait! AMD still has the lead!

    Loving it!
     
  13. Hi, this card is bleeding edge since it will use and incorporate 3D stereo, 3D surround, and ultra-high-def high framerates. Nvidia can control the watts and heat from their drivers, making games run faster and stably with less than the 250-watt maximum that www.nvidia.com says it needs.
     
  14. Burty117

    Burty117 TechSpot Chancellor Posts: 2,524   +324

    Wait a minute, that's not actually true; this is the single fastest graphics card out, so how is AMD still in the lead?
     
  15. Steve

    Steve TechSpot Staff Posts: 1,442   +503 Staff Member

    No idea about that one either ;)
     
  16. Burty117

    Burty117 TechSpot Chancellor Posts: 2,524   +324

    "GTX 480 is about 10% faster than the HD 5870 in average framerates but gets 20% better minimums framerates."

    The above was a quote from various review sites on this card. Even techspot bench marks prove the above quote true as well.

    Now I have usually used Nvidia cards, I have owned ATI's in the past. to me, this card does seemed to be the most powerful out there?

    I have read the posts from "SiliconDoc" and he makes a point about the 4870 X2 taking the same, if not more, power than the GTX 480.

    I did some googling into this and in fact he is right; yet when you look at reviews of that graphics card, it was described as "taking a lot of power" but never really dismissed as a "stupid" power-hogging graphics card like the GTX 480 has been.

    I know it's old tech, but the same holds true for PSUs in those days: PSUs could not sustain that kind of power as well then as they can today.

    So what I'm trying to say is: YES, this is a power hog, but when you take everything into account, it's not really - it just hasn't kept up with the efficiency of ATI's cards. Again, it's the user's choice: would you prefer a card which takes less power but suffers in FPS, or would that extra £10 a year on your electric bill really hurt?
     
  17. Kibaruk

    Kibaruk TechSpot Paladin Posts: 1,423   +116

    The price difference for the slim performance improvement in SOME games, for the power***** it is and for how heat-inefficient it is... AMD still has the lead!

    *edited by LNCPapa
     
  18. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    Well, -Steve- said most of what I was thinking, but this one I've got to take issue with. Unless you are talking about the old discontinued and unsupported PhysX processor cards, that statement makes absolutely no sense. nVidia owns PhysX outright, and all of the PhysX code is embedded in the nVidia driver architecture. Last I heard (and there were plenty of discussions regarding this little gem), it was nVidia who is blocking PhysX from working on anything but a pure nVidia graphics system. They removed any option of running PhysX on a mixed nVidia/ATi GPU configuration, to help protect their market share by keeping their proprietary PhysX code from working on systems that don't fill their coffers. Completely understandable and smart (if a tad slimy) business tactics... But last I heard, it was ALL on nVidia there, SiliconDoc...

    Unless this changed somewhere down the line and I completely missed it? Corrections anyone?
     
  19. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    I'm not thinking you are a blind fanboy, Burty... but you are falling into the lovely little logic trap that fanboys love to throw into the mix. Fact is, the "power hungry" debate was being lost by nVidia fanatics, so they throw out that the 4870 X2 is a power hog too. Classic misdirection: defend yourself by trying to throw the competition under the bus. But, you see, this argument is completely irrelevant. You are comparing a single-GPU card to a dual-GPU card. Of COURSE the dual-GPU card runs hot and pulls more power - it is expected to, and if it can run at something under twice the power and heat specs of the single-GPU variant of the same card, it's considered a qualified success. So, tell me, what would a dual-GPU version of that 480 draw? Based on the specs of the single-GPU unit now, it would probably melt down all but the liquid-cooled PCs. But until a dual-GPU version of this new nVidia line is released and benchmarked, there is no real validity to using a comparative argument against the ATi dual-GPU product - other than in relative performance benchmarks. Apples and oranges until that point.
     
  20. dividebyzero

    dividebyzero trainee n00b Posts: 4,947   +728

    Not really... last I heard, you couldn't plug a monitor, PSU and HDD straight into a graphics accelerator and expect to use it. The only real issue is total system power draw (note 1).
    As I noted earlier, performance is largely power dependent. Case in point: suppose you were to buy an HD 5830/50/70 with a view to an Eyefinity setup (3 x 1920x1080, for the sake of argument), since this technology is front-and-centre in AMD's 5xxx series sales drive.
    What percentage of current-release games (I'm assuming that the theoretical user is not likely to want to revisit Far Cry, for example) are playable on an HD 5830/50/70 at 5760x1080 (or 7680x1600)? That is to say, 25-35fps for an RTS, or 50-60fps for an FPS. And how many DX11 games? After all, this subset of qualities is the only tangible benefit over the series that preceded it. I think you'll find that this is why AMD enabled Crossfire use in Eyefinity.
    If anyone can find an Eyefinity review of a single stock retail card that delivers playable framerates in current AAA titles (not cherry-picked individual benches), please post it.
    We could also do the apples-to-apples thing....
    How about what are likely to be similarly priced and performing setups at the enthusiast end of the scale:

    Core i7 870 (95W TDP) + GTX 480 (250W TDP) = 345W
    Phenom II X6 1075T (125W TDP) + HD 5870 (188W TDP) = 313W

    If we are not going to take the whole system as a homogeneous unit, then surely the 33% penalty in power draw that the GTX 480 accrues over its closest rival (the HD 5870) becomes analogous to, say, AMD's current CPU lineup versus Core i5/i3... I can't say I remember seeing much reactionary witch-burning from the same posters, or for that matter much introspection from AMD champions.
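    The sums above, and the 33% figure, are easy to verify from the quoted TDPs alone:

    ```python
    # Verifying the TDP sums and the "33% penalty" claim, using only
    # the TDP figures quoted in the post above.
    i7_870, x6_1075t = 95, 125     # CPU TDPs (W)
    gtx480, hd5870 = 250, 188      # GPU TDPs (W)

    print(i7_870 + gtx480)         # 345 W for the Intel/Nvidia build
    print(x6_1075t + hd5870)       # 313 W for the AMD build
    print(round((gtx480 / hd5870 - 1) * 100))  # ~33% more card-level power draw
    ```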
    From Hexus' GTX 480 SLI review: 93C single card, 95C in SLI.
    Chemistry not your strong suit? Might I suggest a little less hyperbole... although it serves as a red rag to a bull (pun intended) in my case.
    BTW, since you mentioned the SLI aspect, I'll add that GTX 480 SLI (and tri-SLI) now appears to be the undisputed performance setup... till the next best thing turns up.
    Note 1: The other considerations are obviously noise and chassis temperature control, which of course are major factors for the majority of users, and as outlined earlier, the GTX 470/480 hardly cover themselves in glory in stock form.
    Note 2: SiliconDoc, you should consider leaving the nVidia Fanboy club and joining the Teletubbies... you could conceivably raise the I.Q. of both groups by doing so.
     
  21. Burty117

    Burty117 TechSpot Chancellor Posts: 2,524   +324

    Dude, I think you got what I was trying to say all wrong here; I know I was unsure how to explain myself, so I'll try again.

    What I'm saying is that, at the end of the day, an ATI 4870 X2 is a graphics card that takes 2 slots and 1 PCI Express input. I know the technology behind it is 2 chips put together, but at the end of the day it's just 1 graphics card.

    I'm then saying that the GTX 480, in comparison, is also a graphics card that takes 2 slots and 1 PCI Express input. The tech behind it is different in that it has only 1 chip inside.

    What I was trying to explain is that the 4870 X2 took more and/or equal power compared to the GTX 480.

    Now that that's out of the way, the next bit is really my point about the heat and power usage:

    - It gets as hot as a 4870 X2
    - It takes the same amount of power as a 4870 X2

    Yet when that graphics card came out, I don't remember as big an outcry as over the GTX 480's power and heat problems?

    And even so, in those days higher-wattage PSUs were less common and more expensive, but still no outcry?

    So at the end of the day, Fermi is not a fail; they improved performance to make one of the fastest graphics cards of all time, they just haven't improved on the power consumption or heat issues.

    Yet, like the 4870 X2, it is the most powerful graphics card out? Surely it's not a complete fail considering they didn't improve heat or power consumption but also delivered a lot more bang for your buck? It may not be as efficient in those areas, but most high-end computer components are not anyway?

    I've NEVER read a sound card review that complained that an X-Fi took twice as much power as the Asus Xonar sound card. Even though the benchmark shows that it takes twice as much, the reviewer only mentions it a couple of times and then gets on with what it can do, and the user buys whichever one they decide on, making sure they have enough power for the hungrier one or not.

    If you're buying this graphics card (just to note, I won't be for a while) then you most probably have a computer capable of it anyway.
     
  22. Burty117

    Burty117 TechSpot Chancellor Posts: 2,524   +324

    And Vrmithrax:-

    you're sounding like an ATI fanboy =P
     
  23. Steve

    Steve TechSpot Staff Posts: 1,442   +503 Staff Member

    Sorry, Burty117, but I am failing to see your points. Actually, I fail to see why anyone would compare the GeForce GTX 480 to the Radeon HD 4870 X2 in the first place to try and make any point.

    First, the power requirements of the Radeon HD 4870 X2 were not surprising, though in some ways we were impressed by them. At the time, a pair of Radeon HD 4870 Crossfire graphics cards would consume slightly more than the Radeon HD 4870 X2 under load and considerably more when idling. In fact, at idle we found the Radeon HD 4870 X2 to use slightly less power than a pair of Radeon HD 4850 Crossfire graphics cards.

    At the time the Radeon HD 4870 X2 was a hot, powerful graphics card, but it was not a failure, for the simple reason that it was efficient in terms of price and performance.

    Now, over a year later, the Radeon HD 4870 X2 has been replaced by the Radeon HD 5970, another hot and hungry dual-GPU graphics card which, as it happens, uses about the same amount of power as the GeForce GTX 480 under load - slightly less, in fact - and considerably less at idle. Still, that is a dual-GPU solution, so the comparison is not that important.

    The Radeon HD 5870, which is slightly slower (on average 16% at 2560x1600), used 110 watts less than the GeForce GTX 480. This made the GeForce GTX 480 around 22% more power hungry than the Radeon HD 5870, so as you can see, in terms of efficiency this new product is not that great.
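    Taken together, the 110-watt gap and the 22% figure imply total system draws in the neighbourhood of 500W and 610W under load - a hypothetical back-solve from the two quoted numbers, not measured data:

    ```python
    # Hypothetical back-solve: if the GTX 480 system drew 110 W more and that
    # amounted to ~22% extra, the implied total system draws follow directly.
    gap_w = 110          # quoted load-power gap (W)
    extra = 0.22         # quoted "22% more power hungry"

    hd5870_system = gap_w / extra          # ~500 W implied for the HD 5870 system
    gtx480_system = hd5870_system + gap_w  # ~610 W implied for the GTX 480 system
    print(round(hd5870_system), round(gtx480_system))
    ```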

    Upon further testing I found that the GeForce GTX 480 uses only 70 watts less under load than a pair of Radeon HD 5870 Crossfire graphics cards, which is shocking. The idle results, however, indicated that the GeForce GTX 480 uses almost 50 watts more at idle than the Radeon HD 5870 Crossfire cards.

    All this aside, power and temperatures were not the only problems we felt the GeForce GTX 480 was facing, and they were far from the be-all and end-all. At the moment, with an MSRP of $499, the GeForce GTX 480 is a poor product in terms of value, which was our greatest concern. Couple that with the fact that it is on average just 16% faster than the Radeon HD 5870 and the situation starts to look very rough for Nvidia.

    I just look at the facts, nothing more. I still think Nvidia might be able to resurrect the GeForce GTX 480 over the next few months, so we will see.
     
  24. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    @dividebyzero - I never once mentioned SLI; somehow you read that into my simplified explanation. I didn't want to get into a big long drawn-out thing, but my point was that Burty was pointing at a dual-GPU card and saying nobody complained about it running hot and needing power. Nothing about SLI in there at all; I said the only true comparison would be to a dual-GPU nVidia card (not 2 separate cards, see?)

    @Burty117 - Yes, I get what you are saying, but again, it's apples and oranges. You generally expect dual GPU cards to run hotter and suck more power. When the x2 units arrive, there is no surprise that they need juice, and it's after the single GPU versions have already been established in the market. For nVidia to throw its first single GPU card of this new generation into the fray and immediately be compared to the requirements of dual GPU units from the previous generation just makes no sense to me. That is the kind of tactic I was referring to that fanboys like to use to suck the wind out of opposing opinions - yes, there is data behind it, but it's a rather shady comparison that really lacks teeth. They don't want you to see the truth: that their side's single GPU entrant sucks as much juice as the highest level dual GPU from the previous generation. That's definitely not anything you could consider a win on optimization or refinement at all, that's moving backwards - because (again) you have to consider what level of hell their x2 GPU version of this generation's card will require. See my point on this yet?

    And, for the record, I use a lot of ATi and nVidia cards both, and had (and still have) high hopes for the possibilities of the FERMI architecture. Their current implementation after this long of a wait is just a little disappointing.
     
  25. OK, so why is everyone overlooking the obvious? Sure, this card only beats the 5870 by a small margin. However, it also gives better image quality in games, with way better PhysX, and distant objects look noticeably better on the 480. So the truth is that to get close to identical quality most people pair a 5870 with a GTS 250 (for dedicated PhysX); at this point the 5870 setup becomes more expensive and draws more power, and it still doesn't change the fact that distant objects look better on the 480. FPS benchmarks don't do this card justice.
     

