Comparison Revisited, Methods Refined: Nvidia GeForce GTX 480/470 vs. ATI Radeon HD 5870/5850

I'll see your ridiculous nVidia fanboy... and raise you a nonsensical AMD fanboy... Whatcha got? I got a full house: kneejerk reactionaries over tards!

Oh no, I got ya beat, Chef! I read a post on another site that said the HD 5870 was actually 20% faster than the GTX 480, however they "faked the numbers" lest Nvidia cut them out of the loop... I think they were sending Charlie to beat them up. :p:wave: That's at least a flush.

I'm going to try and find it and link it. It was rather amusing.
 
Great article, Julio and TS team. Good to see you revisiting and seeing what new drivers have done for the cards' performance.

Makes me feel good about my brand new 5870, which I got just a couple of days ago. Now I just need my new monitor to go with it.
 
@ princeton: dude, the whole point of the review was to bench these cards in a non-standard way like everyone else. I personally think this really shows that video card chip makers seem to rely on sites being like sheep & using the built-in time demos or built-in benchmarks. I also tend to think Nvidia relies on it very much, because if you take the same game & test it the way everyone does, Nvidia wins by a huge amount over ATI. Then you take that same game & use the testing methods as done here in this review, & Nvidia actually loses or only wins by a couple of frames. That should send red flags even to a die-hard fanboy like you, princeton.

That should tell you that Nvidia spends a huge amount optimizing their drivers to be at their best in that part of the game (the time demo or benchmark portion). I find it refreshing that this site thought outside the box a little & did their own testing methods, as it just shows maybe those nifty Nvidia cards are not so fast after all. OK, they are fast, but so are the ATI cards, which in these benchmarks are neck & neck.

Oh, & it also shows another thing about ATI: they optimize for the whole game, it would seem, as their cards don't seem to mind working fast no matter what part of the game they are forced to play. It seems you can't say the same for Nvidia. Lol, both are fast, but it seems ATI is more consistent. Now swallow that, Mr. P, & quit arguing your fanboy crap; it does not float any longer.

Oh, & to the author of this good review: I just found this site tonight & it surely is going to be bookmarked. Thanks.
 
Oh no, I got ya beat, Chef! I read a post on another site that said the HD 5870 was actually 20% faster than the GTX 480, however they "faked the numbers" lest Nvidia cut them out of the loop... I think they were sending Charlie to beat them up. :p:wave: That's at least a flush.

I'm going to try and find it and link it. It was rather amusing.

Not bad.
How about a 25% more expensive GTX 295 being a better investment than an HD 5870!!
Or, everyone's favourite nVidia hate site/AMD house of worship...
The Government manipulates the weather/diverts hurricanes to possibly use as weapons of war!!! (from a thread about the Apple-sponsored raids at CeBIT, no less).


A few of my tech friends/system builders and I have an informal agreement to send along outrageous and bizarre posts from the forums/sites we frequent (along with the more interesting news items and PR releases). Sometimes, especially around CPU and GPU launches, the old inbox gets pretty full.

EDIT...
And then sometimes they just fall straight off the low-hanging branch right in front of you...
@ princeton dude the whole poi...
 
@ dividebyzero: If you were referring to me, I was talking about the article. Then again, you're here just to make trouble, so what should we expect from you?

Oh, & before you go off calling me an ATI fanboy: oops, wrong. I use both Nvidia & ATI. I also build systems for a living & have no trouble selling my customers either company's cards. Been in the industry for 20 years, so unless you've been around doing this type of work for the same amount of time, STFU already.

Thanks.
That is all, fool.
 
EVERYONE... please refrain from any name-calling; argue the points and be done with it. We have a strict policy against flaming.

@dividebyzero... regarding the games we picked for this article, the basis was to choose newer games that presented a challenge when run at max settings at 1920x1200. Both CoD MW2 and HAWX were used before and we knew they were not taxing enough. We also included STALKER and Metro to test tessellation performance.

I believe somebody else also complained about the resolution and visual quality settings used. Because of the added complexity of testing using this method, we couldn't test the usual three resolutions at various settings. However, 1920x1200 is fairly standard for a 24-inch monitor that can nowadays cost considerably less than one of these cards, so it was the sweet spot for testing high-end graphics cards running games at the monitor's native res.
 
@dividebyzero... regarding the games we picked for this article, the basis was to choose newer games that presented a challenge when run at max settings at 1920x1200. Both CoD MW2 and HAWX were used before and we knew they were not taxing enough. We also included STALKER and Metro to test tessellation performance.
Thanks for the clarification.
Since all the games tested were FPS/TPS, wouldn't it have been a better choice to toss out Wolfenstein (at 75+ fps, not exactly a rigorous test) and use WiC: Soviet Assault, which at least affords the variance of an RTS game?
 
Guest said:
CUDA, Nvidia 3D Vision

LOL

You can run CUDA on a low-end nVidia card alongside a high-end ATI card.
3D Vision is a fad; it was a fad when 3D TV came out and still is.
PhysX as well... except you can't run PhysX on an nVidia card alongside a high-end ATI card. The tossers disable it at driver level.
Another reason to buy ATI.
 
Six months into it, I am very satisfied with my 5850, my first ATI Radeon.
I go back to PCs when they had 2 MB video cards and ran Windows 3.1.
Bloat sucks!
 
@Badfinger
Glad to hear from a satisfied customer. I started with computers in 1978... I think I prefer my current systems, though.

the whole point of the review was to bench these cards in a non-standard way like everyone else.
So the review was like everyone else's?
I personally think this really shows that video card chip makers seem to rely on sites being like sheep & using the built-in time demos or built-in benchmarks.
Probably because, since they do not vary, they offer repeatable and directly comparable results.
I also tend to think Nvidia relies on it very much, because if you take the same game & test it the way everyone does, Nvidia wins by a huge amount over ATI.
Yeah?
Then you take that same game & use the testing methods as done here in this review, & Nvidia actually loses or only wins by a couple of frames.
Which is pretty much how every review has run concerning the titles that have been reviewed.
That should tell you that Nvidia spends a huge amount optimizing their drivers to be at their best in that part of the game (the time demo or benchmark portion).
So nVidia tailor their drivers and architecture for demos and benchmarks, but don't bother about the actual game... man, there must be some disgruntled nVidia graphics card owners out there.
I find it refreshing that this site thought outside the box a little & did their own testing methods
Been in the "business" for 20 years and never come across premier review sites like [H]OCP and PC Perspective?
...as it just shows maybe those nifty Nvidia cards are not so fast after all.
Obviously not!
OK, they are fast...
Make up your mind!
...but so are the ATI cards, which in these benchmarks are neck & neck.
Er, yeah.
Oh, & it also shows another thing about ATI: they optimize for the whole game, it would seem, as their cards don't seem to mind working fast no matter what part of the game they are forced to play. It seems you can't say the same for Nvidia.
Sweet, so The Saboteur not playing on ATI cards was what, exactly?
Lol, both are fast, but it seems ATI is more consistent. Now swallow that, Mr. P, & quit arguing your fanboy crap; it does not float any longer... dude... Been in the industry for 20 years...
I doubt you've been on this planet for twenty years.
@ dividebyzero: If you were referring to me, I was talking about the article. Then again, you're here just to make trouble, so what should we expect from you?
A reasoned argument?
Been in the industry for 20 years, so unless you've been around doing this type of work for the same amount of time...
Hopefully in the next twenty years you'll learn something about the industry.
STFU already.
Erudite to the last.
 
I thought I might point out that while I have no idea how well a GPU may cope under long periods of heat stress, the surrounding components on the PCB I do know about.

If you look up the data sheets on the VRMs, MOSFETs and capacitors (all components, really) used on the cards, you'll find the component MTBF decreases as the ambient temperature of the components increases.

Personally, I would be very interested to see the temperatures of these components, even the temperatures of the heatsink/PCB itself in various places, not just the reported GPU temperature, because the reported GPU temperature is not by itself an indicator of how long a video card will last. While it is a great indicator of how much heat the GPU is putting out, it is not an accurate gauge of the heat that the surrounding components are subjected to, which will be a major factor in the product's life.
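To put a rough number on that relationship, here is a minimal sketch (in Python) of the Arrhenius life model that component data sheets are commonly derated with. The 0.7 eV activation energy and the 75 C / 95 C temperatures are assumptions picked purely for illustration, not figures from any actual card's data sheet.

import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_life_factor(t_ref_c, t_hot_c, activation_ev=0.7):
    # Estimated life multiplier for a component running at t_hot_c
    # instead of t_ref_c, per the Arrhenius model. The 0.7 eV
    # activation energy is an assumed, typical value; the real one
    # comes from the specific component's data sheet.
    t_ref_k = t_ref_c + 273.15
    t_hot_k = t_hot_c + 273.15
    return math.exp((activation_ev / BOLTZMANN_EV) * (1.0 / t_hot_k - 1.0 / t_ref_k))

# e.g. a VRM sitting at 95 C ambient instead of 75 C:
print(f"{arrhenius_life_factor(75, 95):.2f}x rated life")  # ~0.28x

Under those assumed numbers, a 20-degree rise costs roughly three quarters of the component's expected life, which is why the surrounding-component temperatures Deano asks about matter so much.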

- Deano
 
You guys are arguing about something that is pointless. The performance difference is basically nothing, so if you feel you need PhysX and CUDA, get the 470/480, or if you prefer ATI (which I do) and you don't mind losing a couple of frames, there isn't anything wrong with the 5850/5870.

Great review, Julio Franco.
 
Here is what heat does to electrical circuits; I will explain this as simply as possible. Silicon in its intrinsic (pure crystalline) state has a negative temperature coefficient, meaning it will offer less resistance as the temperature gets higher, but in microprocessors the silicon is doped (impurities are added) in order to achieve good conduction. This doping process, using other elements, produces a positive temperature coefficient, meaning the higher the temperature, the more resistance it creates, and more resistance means more power is required to achieve the same results.
When you place a graphics card into a PC, the heat it produces will affect many components such as your CPU, North Bridge and so forth. Keep in mind also all the copper traces connecting those devices on the circuit board; as you may already know, copper is also a great heat conductor, which spreads the heat much more quickly. So in some cases a rise of 1 degree Celsius may cause a much higher rise in resistance across many components; this in turn causes more power to be drawn by the system, which then raises the heat again. In essence, while the GPU may be designed to withstand a higher temperature, the heat produced may exceed what other components in the system are rated for, which will cause adverse side effects in the future. I hope this clears a few things up.
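For a rough feel of the copper-trace effect described above, here is a minimal sketch (in Python) of the standard linear temperature-coefficient model. The 0.00393 per degree Celsius value is copper's published coefficient near 20 C; the 0.10-ohm trace and the 80 C figure are made up for illustration.

def resistance_at_temp(r_ref_ohm, t_ref_c, t_c, alpha=0.00393):
    # Linear TCR model: R(T) = R_ref * (1 + alpha * (T - T_ref)).
    # alpha defaults to copper's coefficient (~0.00393 per degC near 20 C).
    return r_ref_ohm * (1 + alpha * (t_c - t_ref_c))

# A hypothetical 0.10-ohm copper trace warmed from 20 C to 80 C
# by a nearby graphics card:
print(f"{resistance_at_temp(0.10, 20, 80):.4f} ohm")  # ~0.1236 ohm, about a 24% rise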
 
I am still going with ATI/AMD for the bang for the money, and I go low power when I can. Got 4 desktops in 1 room, and 2 more in different rooms, as well as 3 laptops... heh, I could mortgage my house for 6 5870s? I just bought a 5750 to replace a 4830. Haven't seen anything from Nvidia that makes me want to change.

None of the top-end cards are worth the money. I can put together a complete system with a 5750 and get 80% of the performance for 30% of the cost. You're better off going Crossfire/SLI with a middle card than with either of the top cards.

Don't get me wrong, both cards are fast, but they cost too much. The real money is never at the top end, just bragging rights. The big bucks are in the middle to low end, in volume.

I want to see both companies continue; I'd hate for there to be no competition.

I also thought PhysX was going to have an independent chip for the MB...
 
Outstanding review. A friend of mine and I had a showdown last weekend, one using dual 5850s and the other dual 470s. Our results were very close to yours, Julio. One thing we noticed was that the 470s scaled better with overclocking than the 5850s did; however, the 5850s could be overclocked further. Either way the results were fairly close and we are both happy with our cards.

Nicely done review. Thank you. :)
 
Thanks for the article, guys. I found the methodology and results interesting. It would be interesting to see results at more points within a game, but I realise this would be more time-consuming. I didn't see any need for discussion of power/heat, considering they're a known factor (although heat in particular might change between card models, and I'm sure there are or will be 480 cards beating some 5870 cards).

I'll need a little more convincing that the 480 and 5870 really are that close in most cases, and that the 470 loses big time to the 5850, but this article provides a very useful alternate perspective and hopefully will spark more testing of this kind.
 
Gameplay FRAPS runs are always likely to throw up inconsistencies, simply because the runs can't be 100% identical.
For instance, a comparison of BFBC2 benches between Techspot and Hexus (review that went up today):
Given that the test systems are fairly equal (Techspot: Core i7 965 @ 3.7GHz, 6GB DDR3-1600 CL9; Hexus: Core i7 965 @ 3.2GHz w/ turbo enabled, 6GB DDR3-1066 CL7), identical drivers, both benches were gameplay runs recorded with FRAPS (60 seconds for Techspot, 30 seconds for Hexus), and both used identical maximum quality settings with 4xAA/16xAF...

GTX 480 (68 fps TS, 68.8 fps Hexus) - remarkably consistent!
HD 5870 (66 fps TS, 62.33 fps Hexus) - likewise
HD 5850 (56 fps TS, 53.27 fps Hexus) - also fairly close
GTX 470 (47 fps TS, 53.4 fps Hexus) - ?

Regardless of the results, I'd still put more faith in user reviews (Newegg and tech site forums, preferably with proof of ownership) than an fps pass. I've gamed with three of the four cards and found all to be more than up to the job.
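As a quick sanity check on that comparison, here is a small sketch (in Python) that quantifies the run-to-run spread between the two sites' numbers. The fps figures are the ones quoted above; spread as a percentage of the mean is just one reasonable metric.

# Average fps per card: (TechSpot 60 s run, Hexus 30 s run)
results = {
    "GTX 480": (68.0, 68.8),
    "HD 5870": (66.0, 62.33),
    "HD 5850": (56.0, 53.27),
    "GTX 470": (47.0, 53.4),
}

for card, (ts_fps, hexus_fps) in results.items():
    mean = (ts_fps + hexus_fps) / 2
    spread = abs(ts_fps - hexus_fps) / mean * 100  # % of the mean
    print(f"{card}: {spread:.1f}% spread")

Run this and the GTX 470's roughly 13% spread stands out against the 1-6% of the other three cards, which is exactly the anomaly flagged with the "?" above.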
 
"dividebyzero" - Regardless of the results, I'd still put more faith in user reviews -newegg and tech site forums, preferably with proof of ownership- than a fps pass. I've gamed with three of the four cards and found all to be more than able at the job.

-------------------

That says it all really LOL
 
Article update: Mea culpa. When we published this performance review earlier this week, we unintentionally overlooked two Nvidia-specific settings that are available in Just Cause 2 (Bokeh Filters and GPU Water Simulation). These CUDA features are enabled by default when selecting the 'High' preset and thus negatively impacted performance, making for an unfair comparison against the ATI Radeon boards. The updated graphs reflect an accurate apples-to-apples performance measure: https://www.techspot.com/review/283-geforce-gtx-400-vs-radeon-hd-5800/page5.html

We are glad to report this is the only glaring issue with our original numbers. We also recently tested one additional game, Aliens vs. Predator; here are the graphs if you want to check those out:
https://www.techspot.com/mediagallery.php?f=283&sub=bench&img=AVP_01.png
https://www.techspot.com/mediagallery.php?f=283&sub=bench&img=AVP_02.png
 
People who want maximum performance would get a 480 instead of a 5870! Plus you can get the 480 watercooled so it's not so hot... power consumption does suck, but 4-way SLI can work on a 1500W PSU.
 
People who want maximum performance would get a 480 instead of a 5870!
HD 5970.
Plus you can get the 480 watercooled so it's not so hot... power consumption does suck, but 4-way SLI can work on a 1500W PSU.
Or 1200W... but don't forget you'll also need a specialist 9 or 10 PCI slot chassis, a 4-way SLI capable board... oh, and plenty of 4-way SLI game driver profiles too. Can you pick me up a kit too while you're about it... oh, and some Doritos as well. Thx.
 
Article update: Mea culpa. When we published this performance review earlier this week, we unintentionally overlooked two Nvidia-specific settings that are available in Just Cause 2 (Bokeh Filters and GPU Water Simulation). These CUDA features are enabled by default when selecting the 'High' preset and thus negatively impacted performance, making for an unfair comparison against the ATI Radeon boards. The updated graphs reflect an accurate apples-to-apples performance measure: https://www.techspot.com/review/283-geforce-gtx-400-vs-radeon-hd-5800/page5.html

Still, one game won't make up for the whole thing... It's way too expensive (not just the card; you'd probably need better cooling and a power supply too) and inefficient, just for a few more FPS in only a few games!

The GTX 480 just doesn't make sense. At least, not to me.

Bye!
 
Princeton said:

Nvidia's cards were designed to operate at a particular temp, and who are you laymen to claim that the temp they've designed to is 'bad'? The cards have only been available for 3 months and you're claiming they won't last the year because they happen to run 10 degrees hotter than ATI's cards?

At best, that's piss-poor logic. At worst it's a bunch of people who know nothing about how a board is designed, built and tested talking out their ***.

Lol, nVIDIA's temperature limit on the GTX 400 series is 105 degrees Celsius... so if a card goes into the 90s, which most of them do, well, it's close to its threshold...

No need to be an engineer to understand that!

Also, it's funny to see that nVIDIA has been really quick to let the board partners loose with their non-reference designs... they had no choice, they were on the edge of catastrophe in sales... now, the "custom cooled" GTX4** series will sell, and as EVERYBODY noticed, the standard reference design GTX4**s are ON SALE!!!!

Shame on you, nVIDIA fanboi, for not fighting fair! You've got to give the "enemy team" the respect they deserve...
 
How about you turn up the AA on some of your benchmarks and see which GPU actually does better when not playing at 4x or 8x? Seems like you are taking away a lot of what the GF100s do so well by capping AA at 4x in games like RE5 and BC2. Let's see how well each card does if you force 24x or 32x in the same games, at least 16x please. No more tests with AF off or AA crippled to make the 5870 look like it can compete.
 