Comparison Revisited, Methods Refined: Nvidia GeForce GTX 480/470 vs. ATI Radeon HD 5870/5850

Julio Franco

Staff member
Although the Nvidia GeForce 400 series has been out for about two months now, it would seem the jury is still out on whether or not the series is a success. Some will tell you the GeForce GTX 480 is a power-hungry, expensive GPU that failed to deliver, while others claim it lived up to expectations as the world's fastest single-GPU graphics card and that power consumption figures are for sissies.

Two months later and with newer drivers, it is time to take another look at the Nvidia GeForce GTX 480/470 vs. ATI Radeon HD 5870/5850 comparison.

Read the full review at:
https://www.techspot.com/review/283-geforce-gtx-400-vs-radeon-hd-5800/

Please leave your feedback here.
 
Crazy to spend $50 - $100 more for a card that will burn up in a year. (I promise, with heat like that in your case you're gonna pay sooner or later.)

Regardless, Nvidia really screwed up...

What truly amazes me is the voltage and heat arguments. Some marketing guy at Nvidia should get a $50 bonus for every card sold; maybe that is the reason for the premium price. The same company and fanboys that spend hours bragging about maximum performance at low heat, the very same ones that ragged on Radeon power usage two years ago, are now all selling/buying the idea that this beast running at 95°C+ is a good thing worth paying a premium for.

So, I need to pay fifty extra bucks or more for an occasional 2 extra fps and assume my money won't be a piece of toast (defying all known laws of circuit physics) in a year?

FTR, I have had Nvidia for about the last six years. I ONLY went to a 5770 when my mobo crashed around Christmas and I couldn't wait for Nvidia to get something DX11 out. I realize my card isn't in this class, but it runs everything I have exceptionally well. I can run everything maxed, but in BFBC2 I dialed it back a bit to eliminate the occasional bit of lag when tons of smoke effects are present at once. Really happy I didn't try to wait and only paid $150.
 
the GeForce GTX 480 was just 2fps faster than the Radeon HD 5870 on average, and the minimum frame rate was 3fps greater.

I'm not picking sides or anything here, but in Crysis a 3fps higher minimum frame rate is quite a lot, I would say.

From your chart I noticed that in that particular game the ATI 5870 went down to 32-33fps while the GTX 480's lowest was 36-37fps, which, for Crysis anyway, is a big drop.

Anyway, I loved this article! Awesome stuff, TechSpot. I have been in the market for a new graphics card since the GTX 480 launched, as I am (or was) an Nvidia fan, but your reviews and general advice have held me back to wait a bit longer :) Thank you TechSpot!
 
That's it. It is quite obvious that someone who buys a 480 instead of the 5870 has severe intelligence problems... or just likes the smell of burning computer hardware!!!
 
kaonis92 said:
That's it. It is quite obvious that someone who buys a 480 instead of the 5870 has severe intelligence problems... or just likes the smell of burning computer hardware!!!

Maybe they just want to fry their eggs on their computer? Otherwise, I can't think of why you would want an nVidia card right now.
 
You know what, after reading the whole thing again, this is actually the best and most conclusive revisited review on the internet. This sums it up perfectly. Yet I'm still not compelled to want either, even though I really need a new graphics card?! Well, if history is anything to go by: when Nvidia released their 8800 card, that same chip lived on in the 9800 GTX+ just modded slightly, and likewise the GTX 285 was a more refined version of the GTX 280.

Does anyone else think Nvidia will do the same thing here and create a more refined version in a year or less? I don't know now if I want a new graphics card!

Again though, in this battle (if judged by pure frame rate, ignoring power and heat) both cards are too close to call, as it's more of a per-game basis on which one to choose! I guess more waiting is needed to really see if games take advantage of tessellation and DX11. Even then, ATI will have something ready.

Sorry Nvidia, you just ain't having my money any time soon.
 
Hopefully the Nvidia fanboys have run out of excuses now.

One of their points - about tessellation - seems completely irrelevant to me anyway. Firstly, most games that use heavy tessellation take quite a performance hit anyway, meaning that a lot of people may just turn it off regardless of video card. Also, most games that are coming out don't even utilize tessellation in the first place, meaning the argument that these cards are good for future games is pointless - by then there may well be another new generation of graphics cards that tops the GTX 400 series.
 
A really good comparison. Kudos, Julio. I know I'm asking for a lot, but it would be great if in future benchmarks both forms of testing were present (demo and Fraps).
Right now NVIDIA doesn't have good enough cards to keep their fanboys buying them. They need to change their design a lot to make them more efficient. I always buy the best bang for my buck and I'm going AMD/ATI this time. (HD 5770 with maybe an i5 750 ^_^)
 
Nvidia's cards really are high-performance computing beasts this time around. The workstation versions of these cards are going to destroy ATI's...

Nvidia didn't build this generation to game, it built it to compute. The increase in double-precision floating point performance is where these cards shine. However, with these heat levels they are going to have to put water coolers on them if they want them to last in a workstation environment.

Basically, the only reason Nvidia can price these cards as it does is the uber-nerds. People who want to accelerate programs like MATLAB or contribute to distributed computing projects are the ones lusting after this architecture.
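
To give a rough idea of the kind of double-precision work I mean, here is a toy CUDA sketch (just my own illustration, nothing from the review): a DAXPY loop, the sort of number-crunching these cards are built to chew through.

```cuda
// daxpy.cu -- hypothetical example of a double-precision GPU workload:
// computes y = a*x + y over a large array, one element per thread.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void daxpy(int n, double a, const double *x, double *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];  // one double-precision multiply-add per element
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(double);

    // Fill host arrays with known values so the result is easy to check.
    double *hx = new double[n], *hy = new double[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0; hy[i] = 2.0; }

    // Copy to the GPU, run the kernel, copy the result back.
    double *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

Nothing graphics-related in there at all, which is the point: it's pure data-parallel number crunching, and that seems to be the crowd Nvidia is chasing with this architecture.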
 
The only thing missing to make this a truly comprehensive revisit of the video card shootout is power/temp numbers while in use... But I really like how you ran real-world situations to compare true fps, rather than relying on the built-in benchmarks.

And, for the "why would you possibly buy nVidia?" comments, there are a few compelling reasons... As Stonos pointed out, CUDA and 3D Vision are good starts, but the PhysX engine is also a draw. And just plain brand loyalty counts for a good portion of sales.

I had pretty high hopes for this new batch from nVidia, but so far they haven't produced anything capable of pulling me back from the ATi camp. The price/performance/power curves just can't compare.
 
CUDA, Nvidia 3D Vision

LOL

You can run CUDA on a low-end Nvidia card alongside a high-end ATI card.
3D Vision is a fad; it was a fad when 3D TV came out and it still is.
 
Nice review; it's always nice to see updated results based on the latest drivers. I'm very surprised that the gap at the top has closed - I thought those latest drivers from Nvidia would bring some significant performance boosts, but evidently not.

Right now it's not even a contest unless Nvidia significantly reduces its prices. Here the GTX 470 is £50 more than the 5850 and the 480 is £100 more than the 5870.
 
I understand the CUDA/precision argument, and Nvidia should be given some kudos for their efforts there. Still, that is like Ford developing a revolutionary engine for fire and dump trucks, then selling it in Mustangs dialed in at 18,000 rpm.

I just can't see them winning the large, core market of desktop applications and gaming this cycle.

I don't think 3D is a fad... ultimately. I do think anything requiring glasses, or at least "active" glasses, is doomed. Physics (PhysX) is going integrated, and I honestly can't tell what I am missing based on the samples I've seen. Either way, neither of these techs is a big selling point in this cycle of cards. Possibly next year or later, and ATI will probably respond in kind by then.
 
There is the small 3% of people that upgrade like I do, buying the best of what's available at the time of purchase. When I buy my Rampage III Extreme I'll be picking up 480s if they are the best on the market at the time. I run a water-cooled setup with lots of headroom, so heat isn't an issue and I don't care about power draw. If ATI is on top when I upgrade this summer I'll be swinging over their way; otherwise it's Nvidia I'll go with.

BTW, great revisit on the subject. There have been two or three threads about the 470 and 5850 in the last few weeks. Seems like those readers got what they asked for here.
 
If you're buying what's best at the moment, then it isn't a question of a 480 or a 5870 - the best card is still the 5970, which can even be CrossFired!
 
The one arrow in Nvidia's quiver is PhysX. I would buy an Nvidia card, new, to do PhysX. But I won't, because some bright spark at Nvidia thought it would be a good idea to disable PhysX in the presence of an ATI card. Dumb, so, so dumb!
 
Great review! As Vrmithrax mentioned, I would have liked to see some temp/power comparisons too, but we all know what the story is on that.

Was on an Egghead.com thread yesterday about this very ATI vs. nVidia subject, and all the nVidia fanbois could do was scream about tessellation and frame rates. Man, I wish I'd had this article to present to them.

I've been an nVidia customer forever it seems. I think my last ATI card was a Rage Pro. But I'm not stuck on brand loyalty and I do want to get the biggest bang for my buck. Unless something dramatically changes in the next six months, my next card will be an ATI 5870.
 
Actually, my reason for buying an ATI card right now is their support for Eyefinity. Running a setup with three 24" monitors on a 5770 while doing software development is perfect. As for Fermi, my house A/C is not powerful enough to cool the house with this monster running -- although I might buy one for the winter months...
 
Guest said:
The one arrow in Nvidia's quiver is PhysX. I would buy an Nvidia card, new, to do PhysX. But I won't, because some bright spark at Nvidia thought it would be a good idea to disable PhysX in the presence of an ATI card. Dumb, so, so dumb!

Actually, not dumb at all... Quite smart, if rather slimy. It's a backhanded way of ensuring brand loyalty and punishing those who don't stay on the green team.

I mean, with all of the negative hubbub that popped up with the Fermi launch, nVidia has to keep at least ONE marketing point that nobody can put into contention: if you want PhysX, you have to have a pure nVidia graphics platform.

Unless, of course, you take advantage of that little-reported bug in some of the recent nVidia drivers that reportedly breaks the PhysX lockdown on mixed graphics systems...
 
New drivers didn't change anything? Well, ****. I could've told you that. I'm pretty sure a few other people did when the last Fermi article cropped up on the front page. This only serves to reinforce my previously held disappointment with Nvidia and their supposedly revolutionary new graphics chip. I'm definitely not paying between $350 and $500 of my dollars for a card that's practically no different from the current generation.

Nvidia, either you were BSing when you talked up Fermi, or you're hiding something. What the crap is the deal here?
 
I think people are forgetting that the performance gap gets larger in the GTX 480's favor when two are run in SLI, compared to two 5870s run in Crossfire. Many people with $400 - $500 to spend have enough money, and usually do buy a second high-end card for a multi-GPU configuration. In that case, which is closer to how these cards actually get used, 2 x 480s literally eat 2 x 5870s for breakfast.

Dwell on that thought for a while.
 
Guest said:
I think people are forgetting that the performance gap gets larger in the GTX 480's favor when two are run in SLI, compared to two 5870s run in Crossfire. Many people with $400 - $500 to spend have enough money, and usually do buy a second high-end card for a multi-GPU configuration. In that case, which is closer to how these cards actually get used, 2 x 480s literally eat 2 x 5870s for breakfast.

Dwell on that thought for a while.

You are talking about a small minority who run these types of configurations... VERY small. If one of the big gripes is the price point of the 480, do you really think paying twice that will fly? Oh, and, of course, the cost of a beefier power supply has to be thrown in... And you might have to look at some serious cooling issues - but, of course, the little super-elite slice of the customer base you are looking at would no doubt already have water cooling and such, so not as much of an issue there...

For the average consumer/gamer (aka the vast majority of the marketplace), whether 480s in SLI beat 5870s in Crossfire doesn't matter one iota. You might as well start arguing that 5870s in Crossfire give you the Eyefinity benefit of six simultaneous screens, while the 480s only manage three or four (can't recall exactly)... The number of people who care won't even register on a scan of the consumer base.
 