TechSpot

Nvidia GeForce GTX 480 Review: Fermi Arrives

By Jos
Mar 26, 2010
  1. Burty117

    Burty117 TechSpot Chancellor Posts: 2,512   +314

    OK, I give up. Fermi is still a little disappointing, but I don't think it's a failure. I guess that's why I'm waiting a year or two before buying one.
     
  2. Steve

    Steve TechSpot Staff Posts: 1,402   +483 Staff Member

    Burty117, you are correct: the GeForce GTX 480 is certainly not a failure. At launch it is a bit slower than we were hoping for, and it's not great in terms of value either. The weaker-than-anticipated performance has also hurt operating efficiency.

    Despite the fact that Nvidia should have had plenty of time to optimize their drivers already, given how delayed Fermi has been, let's just assume that for some reason they have not. If that is the case, then improved drivers in the coming months could really improve the Fermi situation. If performance increases by around 10%, then efficiency will improve, and so will value, making it a more desirable product.

    That said, you might not want to wait two years before buying one. Hell, in one year they will be old news.
     
  3. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,286   +232

    Heh... just because I'm critical doesn't mean I'm not cautiously optimistic. I think the 4xx platform could certainly have been done a little better in terms of efficiency. But me, I'm always disappointed when new products come out seeming less efficient than the previous generation (the engineer in me peeking out there).

    There is something many anti-nVidia rants are conveniently leaving completely out of the equation - the actual Fermi coding. In short, there isn't anything out right now that truly utilizes the strengths of the Fermi architecture... yet. When/if some software arrives that uses these new nVidia GPUs to their fullest, exploiting all of the untapped graphics power that is just coiled up and waiting for release, I believe nVidia could jump far ahead of ATi. That is, if the Fermi stuff can really perform as advertised (which I really hope is true).
     
  4. Ugh... this is so disappointing...

    Look at the reference cooler... it has all those heatpipes and it STILL gets that hot. How are the companies going to bring their own coolers out for it if it already gets so hot with a beefy cooler like that?

    I'm sorry, nvidia, but this card is one heck of a failure.

    NEXT!
     
  5. Burty117

    Burty117 TechSpot Chancellor Posts: 2,512   +314

    LOL! Yeah, I'm thinking of building a new rig this year, as I'm running an old Athlon X2 overclocked from 2.7GHz to 3.30GHz, and it still struggles with intense moments in Crysis! But then again, it's 5 years old now!

    Do you really think Fermi will be old in 1-2 years' time? If so, I might wait a few months and save a few pennies, then get one and possibly opt for water cooling, if they bring out a water-cooled heatsink for it by then.
     
  6. dividebyzero

    dividebyzero trainee n00b Posts: 4,913   +718

    This time next year the next-best-thing will be out on the 28nm process for both AMD and nVidia.
    At the very least you should see a hybrid series from AMD on 40nm incorporating elements of both the current HD5xxx series and the next-gen Northern Islands architecture, and I wouldn't be overly surprised to see nVidia launch a full 512-shader SKU - assuming that they and TSMC can implement the Fermi architecture better.
    Both manufacturers should also have access to the next generation of faster (7000MHz) GDDR5 memory very soon, so it's probably safe to assume that in 12+ months time we'll be viewing the current crop of cards in much the same way as we view the HD4870/4850/GTX280/260 now.
     
  7. klepto12

    klepto12 TechSpot Paladin Posts: 1,364   +9

    Man, you need to get over yourself. When is the last time you owned an ATI card? You are an Nvidia fanboy, straight up, for sure. But just to let you know, ATI has great drivers and has had them since the 9.0 drivers came out. Also, I have heard way more BS about Nvidia cards screwing up than ATI. You need to get your facts straight, my good man.
     
  8. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90

    Why y'all gots to be hatin?........



    Hey.......?.....were you the one playing Hitler in that video?

    Well, several hundred caps and 15 exclamation points later... what Klepto said.
    Hope the rest of your day goes better, Doc :wave:
     
  9. mailpup

    mailpup TS Special Forces Posts: 8,462   +229

    You don't have to wait that long. EVGA already has water cooled Fermis which should be out soon. Check this Techspot article.
     
  10. Anyone who continues to defend this latest offering from nVidia as anything but an epic fail is an nVidia apologist, straight up:

    6 months late, ATI are already planning their next GPU launch (Southern Islands)
    Power hungry beast and all the negatives associated with that
    Low yields on a monolithic chip design = lower yields than even ATI faced

    It amazes me how loyal some people are, to the point of becoming amateur spin-doctors trying to defend a big company that couldn't care less about you and only wants your money. It's like watching lemmings walking off a cliff face. I think some of you guys are intelligent enough to know you're doing it but still can't help yourselves.

    Not that ATI is anything different. If nVidia had actually made a card that was truly competitive (and by that I mean more performance at better efficiency and at the right price) then I'd be the first to buy their card (loved their G80/92/94 GPUs), but the fact is the 5 series simply has no viable competition this generation.

    In a way, this is probably not a bad thing. ATI has struggled for many years and I often wondered if they would even stay in the game. I'm sure nVidia can take this loss, and hopefully learn from it. I will say I don't appreciate the way nVidia develops proprietary tech (PhysX), as it creates an uneven playing field in software applications (games). To me, this demonstrates a lack of integrity: they are willing to win the fight by throwing sand in the competition's face and kicking them in the balls. Just develop great hardware; the API into the hardware should be standard regardless of the manufacturer!
     
  11. Arris

    Arris TS Evangelist Posts: 4,581   +103

    It may seem a bit disappointing for a card that's been in development 6 months longer than the competition's, but keep in mind that newer drivers with specific support for it could improve things a lot. And the dedicated tessellation hardware may give it the edge when more DX11 titles hit the shelves. It might pump out the same frame rate as the ATI offering, but the extra tessellation power could mean it delivers higher visual quality.

    I'm still going to wait and see what this can do, since it isn't just a bumped up core and memory speeds and increased memory bandwidth. I'm so close to buying a 5850/5870 right now but want to see more of the new Nvidia cards in action with DX11 before I commit.
     
     
  12. Have you realized yet that the GTX 295 is a DUAL-GPU card?

    Also, DX11 is gaining lots of support, unlike DX10.

    Also, it's over 20fps faster than the 5870.


    Heat might be a problem, and the fan noise too, but the power consumption complaint is silly: most people today have a 600W or better PSU, and gamers who buy fast cards usually have 800W PSUs.
     
  13. Steve

    Steve TechSpot Staff Posts: 1,402   +483 Staff Member

    I am not sure who you are throwing these pointless fun little facts at, but I would just like to point out that the power supply is not the issue. People are not upset that they need a 600-800 watt power supply; that's not the point, and like you said, many have them already anyway.

    The point is it uses a stupid amount of power and generates an incredible amount of heat which all points to poor efficiency with the current level of performance. I still think things will improve given time but you cannot deny that it has been a disappointing launch.
     
  14. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90

    Which reminds me. When these sites (Tom's, Guru3D, HardOCP, TS, etc.) benchmark these components, they afford them every opportunity to demonstrate the best possible performance. This always includes either an open test bed or a large, well-ventilated enthusiast case, and an ambient temperature at or around 20C. Even here in Minnesota, during the summer, the ole thermostat is always above 70F. So what happens when the dude from Calcutta, Perth, or Orlando buys one of these things?
     
  15. After reading this I am glad I got two GTX 275s two weeks ago. I was looking for the best bang for the buck and I believe I got it. I almost went ATI; it was only the PhysX that stopped me. I was worried about 65° heat under full load with an 8% OC, but I believe it cost me $100 less than a GTX 295, which also seems to suffer from excessive heat problems. That setup gives me a minimum frame rate of 40fps in any game at 1920x1080 resolution and an average of 60fps with 2xAA, so what more would I want? Anything more than that is simply bragging rights.
    MAYBE ONE DAY I WILL ALSO TRY TO RACE A FERRARI ON A DIRT ROAD SO I CAN BRAG ABOUT IT.
     
  16. All I can think of is how most high-end 4000-series ATI users ended up getting an Nvidia GPU to run dedicated PhysX, at which point the setup became more expensive and less power efficient than if they had just gone with Nvidia to begin with. Now it's not just PhysX being used in about half the games; Nvidia cards are also dominant in tessellation (which most new games are going to use). Also, at 1920x1200 with 4xAA (where I run), I took a compilation of all the game benchmarks I could find (no Vantage, no Unigine, just games) from many reviews (Tom's, AnandTech, here, Overclockers Club) and the result was that the GTX 480 wins by just short of 24% overall. Not to mention that the 5870 can't even run Metro 2033 at the same settings; look at Tom's Hardware's review, they seem to be the only ones that mention how some graphics options have to be turned off to make it a fair comparison. I personally feel that $90 to $100 more is a small price to pay to improve over 20% in FPS and have most games (the ones that use tessellation or PhysX) look nicer. I am not too concerned with power draw, and my case has 3x140mm fans (one right on the GPU) and 3x120mm fans, so I should be able to control the heat. I think a lot of people are just looking at the FPS numbers and not keeping in mind all the extra stuff the 480 can do that the 5870 cannot.
     
  17. Hi guys, after reading this I don't care about Nvidia or ATI. I just want a graphics card that can deliver good frames while playing Crysis 2. I can't afford a 5970, so which graphics card should I pick up?
     
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,913   +718

    Unfortunately there isn't a great deal of commonality amongst reviews (or reviewers). To get a good idea of overall performance you would need a large review database to work from.
    You'll notice that reviews differ quite markedly depending upon which games are used, both in titles (some being ATI friendly, some TWIMTBP) and in whether they use OpenGL, DX9, 10 or 11. Something else to consider is that some reviewers use settings either much lower than most gamers would ever use (no AA/AF etc.), resulting in exaggerated fps, or maximum settings that produce bigger percentage variances (e.g. comparing 4xMSAA in Metro 2033) at framerates that make the game unplayable.
    As a system builder I tend to keep a database of current GPU performance, organized by screen resolution, game engine and renderer, for my customers - since many people will upgrade or buy graphics based on a particular game engine or game type. Frames per second is secondary, inasmuch as the customer usually just wants a definitive answer: can I play it smoothly?

    Taking the reviews from : Techspot, Tech Report, Guru 3D, Anand, Xbit, bit-tech, Hardware Canucks, PCGH, PC Perspective, Hexus, Tweakers.net, Tom's, Neoseeker, Hartware, OCC, Firing Squad, Computerbase, Legit Reviews, [H]OCP, Bjorn 3D, Tech Power Up, Extreme Tech, Benchmark Reviews, Motherboards.org, Inside Hardware, Tweaktown, Trusted Reviews, Legion Hardware, Ninjalane and Xtreme Systems (forum)...

    Average fps.
    DX9, 1680x1050/1200............HD 5870 (108.58fps).....GTX480 (129fps, +18.8%)
    DX10.......................................HD5870 (71.45fps).......GTX480 (89.11fps, +24.71%)
    DX11.......................................HD5870 (66.3fps).........GTX480 (75fps, +13.07%)
    OpenGL..................................HD5850 (119.95fps).....GTX480 (125.6fps, +4.71%)

    DX9, 1920x1080/1200............HD5870 (100.89fps)......GTX480 (116.69fps, +15.67%)
    DX10......................................HD5870 (64.24fps)........GTX480 (77.4fps, +20.49%)
    DX11......................................HD5870 (52.35fps)........GTX480 (59.34fps, +13.35%)
    OpenGL.................................HD5870 (116.27fps)......GTX480 (108.17fps, -6.97%)

    DX9, 2560x1600....................HD5870 (70.9fps)..........GTX480 (80.25fps, +13.18%)
    DX10.....................................HD5870 (44.98fps)........GTX480 (52.43fps +16.55%)
    DX11.....................................HD5870 (37fps).............GTX480 (40.03fps, +8.17%)
    OpenGL................................HD5870 (72.63fps)........GTX480 (66.13fps, -8.95%)
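    For anyone wanting to sanity-check the deltas above, they follow the usual relative-difference formula: the GTX 480 average minus the HD 5870 average, divided by the HD 5870 average. A minimal sketch (the function name and the spot-checked rows are my own illustration, not anything from the post; a couple of table rows differ in the last digit, presumably from rounding in the underlying averages):

```python
# Percentage delta of the GTX 480 relative to the Radeon baseline,
# as used in the table above: (geforce - radeon) / radeon * 100.
def pct_delta(radeon_fps, geforce_fps):
    return round((geforce_fps - radeon_fps) / radeon_fps * 100, 2)

# Spot-check two rows from the table:
print(pct_delta(64.24, 77.40))  # DX10, 1920x1080/1200 -> 20.49
print(pct_delta(72.63, 66.13))  # OpenGL, 2560x1600    -> -8.95
```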

    As for the state of play at this moment, you would need to realize that these numbers are only as good as the next driver revision, or in Metro 2033's case a bug fix.
    The longer term depends on how much of a commitment to gaming development AMD shows, to benefit their own architecture (AMD-sponsored titles seem to play fine on nVidia cards; the reverse is not necessarily true). If the development funds continue to flow from TWIMTBP, especially in AAA titles, then it's equally probable that nVidia-sponsored titles will feature heavier tessellation options within DX11 to further distinguish themselves (at least in the enthusiast sector) from ATI's offerings. Add in the facts that whoever provides the dev funding is likely to have optimized drivers on game launch day (while the other is obviously playing catch-up, unless they get a look at the code before launch), and that most people do not upgrade their graphics solution every six months, and these are the factors that should influence a graphics purchase. If AMD follow through with funding then that particular playing field will level out.
    Looking at the above figures, it can easily be noted that the average framerates of both cards are well above the playable level for the most part... so after all these empirical arguments the choice still boils down to the subjective. For some buyers, the heat, noise, power consumption and price are offset (or are of little consideration) by PhysX, driver support on game launch day, generally better multi-GPU support, and, I suppose for some, satisfaction in having the fastest single GPU. Likewise, for many consumers the price/performance ratio (which heavily favours AMD), noise (a valid concern), risk of heatstroke (?) and heat output/power consumption* have priority.

    * Something I could agree with if the person buying the card was also the kind of person that kept their car tuned, turns off the ignition when waiting at a drive-thru/pick-up lane/stalled in a tail-back and doesn’t crank the a/c when the temp gets 5º outside their comfort zone.

    Speaking from a place that can nudge 40ºC/104ºF in summer (with 95-100% relative humidity) I’d say that having air-flow through the house is a must in any event. Anyone that fancies sitting in a sealed room 24/7 gaming or running Furmark on loop probably needs a life more than they need a graphics upgrade. Worst case scenario I presume would be card failure > RMA > Get new replacement Fermi II (since the AMD fanboy scuttlebutt tells us that after the chips are used from the initial 9,000 risk wafers there will be no more production of GTX4xx –so no direct exchange replacement for bad cards).
    2 x HD 5770
     
  19. From what I have read, Crysis 2 is not going to be a whole lot more demanding than Crysis; in fact, many say it will actually be less demanding. A 5870 or 480 will do fine and leaves you a nice upgrade path (adding a second later) that the 2x 5770s won't. If you play or plan on playing other newish games and have a case with good airflow, I strongly recommend the 480, as many new games will be using tessellation and PhysX. The 480 has PhysX built in and handles tessellation about 150% better than the 5870.
     
  20. Ha, no surprise that the whole Fermi thing is causing so much debate, with fanboys on both sides really getting stuck in. This time around I went with an ATI 5870, purely because my old system (a really old Athlon 4200 with 7900 GTs in SLI) blew up in Jan. I therefore just bought the best system I could afford at the time, which was an i7 960 + ATI 5870 rig. (Would have got an i7 920, but got a 960 for the same price as part of a military discount, so I didn't complain!)

    If my old system had blown up today I would have considered a GTX 480, though, for sure. The extra performance isn't a huge amount over a 5870 at the moment, but I won't get a new rig for at least 3 years now, so it would last me longer before really needing to be replaced. If really needed, I guess I can CrossFire another 5870 anyway, as I have a spare x16 slot and a big enough PSU.

    Being interested in this sort of tech I have read a heap of reviews, most are roughly the same though some are obviously biased either one way or the other. One of the most interesting ones though is at http://www.hardocp.com/article/2010/03/26/nvidia_fermi_gtx_470_480_sli_review/8.

    This is the only review I found that doesn't obsess about framerates. Instead it compares IQ and playability at a range of resolutions and compares them qualitatively. For the most part it concludes that a single 5870 compared to a single 480 will give a gamer the exact same user experience, i.e. the same amount of AA/AF/IQ with average framerates well over 60 (any more is a waste, as the monitor only refreshes that fast anyway). That may change in the future with newer, more demanding DX11 titles, but then again both ATI and Nvidia will be releasing new cards in the future as well, which will undoubtedly be better. That said, the same review did rave about an SLI 480 rig; I would personally love that setup myself, however I couldn't justify (to my wife, of course) paying out for the huge screens needed to make it worthwhile. I only have a 1080p monitor, so my single 5870 is already overkill for most games anyway!
     
  21. Wow, what a bunch of commenters complaining about power.
    I guess not a single one of the raging reds noticed that the 4870X2 TOPPED THE CHARTS for power consumption.
    I certainly do not recall a SINGLE red rooster complaining even once about the power usage of the 4870X2. NOT ONCE.
    Heck, not a single one of the non-sentient ragers here even brought it up!
    It's right there on the chart, if they paid any attention at all to the review.
    I'll tell you what I think most of 'em do.
    They chit and chat, and someone tells them the new pop-culture whine, and then they make up their own spew around that theme.
    It's funny to watch them rant and rave and roid up till their backsides are bleeding, and see that 4870X2 sit HIGHER IN POWER USAGE. LOL
    ROFLMAO
    Gosh, we're going to have to rewrite history and insert the endless WHINING concerning the 4870X2's power use, or better yet, the raging red roided roosters could start whining about it RIGHT FREAKING NOW!
    Good luck, whacked-out BS'ers.
     
  22. Steve

    Steve TechSpot Staff Posts: 1,402   +483 Staff Member

    Not sure how well you read all the complaining commenters, but this was pretty much covered right off the bat.

    Anyway, just for the hell of it I will highlight the key "point" again; clearly I have nothing better to do right now. The Radeon HD 4870 X2 is older tech, and yes, it is a power pig; no one ever disputed that. However, the fact that the Radeon HD 4870 X2 and the GeForce GTX 295 are power pigs never fazed anyone, because they are essentially two Radeon HD 4870 or two GeForce GTX 275 GPUs stuck on a single PCB. What on god's green earth were you expecting?

    What we have with the GeForce GTX 480 is a single GPU graphics card that is about 15% faster than the Radeon HD 5870 in most of our tests. For that gain you can enjoy 22% more power being sucked through your system thanks to just one component.

    Anyway, comparing a much older dual-GPU graphics card to a brand-new single-GPU graphics card is the very definition of an apples-to-oranges comparison.
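    As a rough back-of-the-envelope sketch of what those two figures imply (the power number is system-level, so this is only a ballpark, and the ~15%/~22% inputs are just the approximate figures quoted above):

```python
# Perf-per-watt of the GTX 480 relative to the HD 5870, using the
# approximate review figures: ~15% more performance, ~22% more power.
perf_ratio = 1.15   # relative performance
power_ratio = 1.22  # relative system power draw
efficiency = perf_ratio / power_ratio
print(f"{efficiency:.3f}")  # -> 0.943, i.e. roughly 6% worse perf-per-watt
```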
     
  23. dividebyzero

    dividebyzero trainee n00b Posts: 4,913   +718

    How true. You would think Techspot rated a better class of troll... they could at least have picked an example of a current-gen power hog.
     
  24. I'm shocked that this review chose to show two 5870s in CF but failed to show two GTX 480s in SLI! I'm sure that would have shown how well they scale in SLI!
     
  25. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,906   +90

    Gee, and I just watched yet another story of a teacher and a football coach molesting the kids...thanks for setting that straight.
     

