Comparison Revisited, Methods Refined: Nvidia GeForce GTX 480/470 vs. ATI Radeon HD 5870/5850

June 3, 2010, 5:51 AM
Although the Nvidia GeForce 400 series has been out for about two months now, it would seem the jury is still out on whether or not the series is a success. Some will tell you the GeForce GTX 480 is a power-hungry, expensive GPU that failed to deliver, while others claim it lived up to expectations as the world's fastest single-GPU graphics card and that power consumption figures are for sissies.

While we feel the truth might lie somewhere in the middle, it is time to take another look at the Nvidia GeForce GTX 480/470 vs. ATI Radeon HD 5870/5850 comparison.


We also wanted to revisit this high-end graphics card battle to settle a claim made by green-team fans that the GeForce GTX 480 is far superior to the Radeon HD 5870 when measuring minimum frames per second. This is an interesting argument: the average fps results from our last round of testing did not suggest a huge difference, but averages can hide minimum frame rate behavior, so we set out to find out whether the claim holds.
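The distinction between average and minimum frame rates can be sketched with a short script over a Fraps-style frame time log. The numbers below are invented for illustration, not our actual data:

```python
# Sketch: deriving average and minimum fps from a list of frame
# times (milliseconds per frame), the kind of log Fraps can export.
# Frame times here are hypothetical.

frame_times_ms = [16.7, 16.9, 17.1, 33.5, 30.2, 16.8, 16.6, 17.0]

# Average fps over the run: total frames divided by total seconds.
total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds

# Minimum fps: the instantaneous rate of the single slowest frame.
minimum_fps = 1000.0 / max(frame_times_ms)

print(f"average: {average_fps:.1f} fps, minimum: {minimum_fps:.1f} fps")
```

A card's average can look healthy while its minimum dips into stutter territory, which is exactly the distinction this revisit probes.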

Read the complete review.




User Comments: 106

ikenlob said:

Crazy to spend $50 - $100 more for a card that will burn up in a year. (I promise, with heat like that in your case you gonna pay sooner or later.)

Regardless, Nvidia really screwed up...

What truly amazes me is the voltage and heat arguments. Some marketing guy at Nvidia should get a $50 bonus for every card sold; maybe that is the reason for the premium price. The same company/fanboys that spent hours bragging about maximum performance at low heat, and that ragged on Radeon power usage two years ago, are now all selling/buying the idea that this beast running at 95°C+ is a good thing worth paying a premium for.

So, I need to pay fifty extra bucks or more for an occasional 2 extra fps and assume my money won't be a piece of toast (defying all known laws of circuit physics) in a year?

FTR, I have had Nvidia for about the last six years. I ONLY went to a 5770 when my mobo crashed around Christmas and I couldn't wait for Nvidia to get something DX11 out. I realize my card isn't in this class, but it runs everything I have exceptionally well. I can run everything maxed, but on BFBC2 I dialed it back a bit to eliminate the occasional bit of lag when tons of smoke effects are present at once. Really happy I didn't try to wait and only paid $150.

Burty117, TechSpot Chancellor, said:

the GeForce GTX 480 was just 2fps faster than the Radeon HD 5870 on average, and the minimum frame rate was 3fps greater.

I'm not picking sides or anything here, but in Crysis a 3fps higher minimum is quite a lot, I would say.

From your chart I noticed that in that particular game the ATI 5870 went down to 32-33fps while the GTX 480's lowest was 36-37fps, which for Crysis is a big drop.

Anyway, I loved this article! Awesome stuff, TechSpot. I have been in the market for a new graphics card since the GTX 480 launched, as I am (or was) an Nvidia fan, but your reviews and general advice have held me back to wait a bit longer. Thank you, TechSpot!

kaonis92 said:

That's it. It is quite obvious that someone who buys a 480 instead of the 5870 has severe intelligence problems... or just likes the smell of burning computer hardware!!!

isamuelson said:

kaonis92 said:

That's it. It is quite obvious that someone who buys a 480 instead of the 5870 has severe intelligence problems... or just likes the smell of burning computer hardware!!!

Maybe they just want to fry their eggs on their computer? Otherwise, I can't think of why you would want an nVidia card right now.

Burty117, TechSpot Chancellor, said:

You know what, after reading the whole thing again, this is actually the best and most conclusive revisited review on the internet. It sums it up perfectly. Yet I'm still not compelled to want either, but I really need a new graphics card?! Well, if this is anything to go by: from what I know, when Nvidia released their 8800 card, that same chip went into the 9800GTX+ just slightly modded, and likewise the GTX 285 is a more refined model of the 9800GTX+.

Does anyone else think Nvidia will do the same thing here and create a more refined version in a year or less? I don't know now if I want a new graphics card!

Although, again, this battle (if judged by pure frame rate, ignoring power and heat) is too close to call, as which one to choose is more of a per-game question! I guess more waiting is needed to really see if all games take advantage of tessellation and DX11. Even then, ATI will have something ready.

Sorry Nvidia, you just ain't having my money any time soon.

Guest said:

Hopefully the Nvidia fanboys have run out of excuses now.

One of their points, about tessellation, seems completely irrelevant to me anyway. Firstly, most games that use heavy tessellation take quite a performance hit, meaning that a lot of people may just turn it off regardless of video card. Also, most games that are coming out don't even utilize tessellation in the first place, meaning the argument that these cards are good for future games is pointless: by then there may well be another new generation of graphics cards that tops the GTX 400 series.

Puiu said:

A really good comparison. Kudos, Julio. I know I'm asking for a lot, but it would be great if in future benchmarks both forms of testing were present (demo and Fraps).

Right now NVIDIA doesn't have good enough cards to keep their fanboys buying them. They need to change their design a lot to make the cards more efficient. I always buy the best bang for my buck and I'm going AMD/ATI this time. (HD 5770 with maybe an i5 750 ^_^ )

Stonos said:

isamuelson said:

Otherwise, I can't think of why you would want an nVidia card right now.

CUDA, Nvidia 3D Vision

Guest said:

Nvidia's cards really are high-performance computing beasts this time around. Their workstation versions of these cards are going to destroy ATI's....

Nvidia didn't build this generation to game, it built it to compute. The increase in double-precision floating point is where these cards shine. However, with these heat levels they are going to have to put water coolers on them if they want them to last in a workstation environment.

Basically, the only reason Nvidia should be able to price these cards as they do is the uber-nerds. People who want to accelerate programs like MATLAB or contribute to distributed computing projects are the ones lusting after this architecture.

Vrmithrax, TechSpot Paladin, said:

The only thing missing to make this a true comprehensive revisit of the video card shootout is power/temp numbers while in use... But I really like how you ran real world situations to compare true fps, rather than relying on the benchmarks.

And, for the "why would you possibly buy nVidia?" comments, there are a few compelling reasons... As Stonos pointed out, CUDA and 3D Vision are good starts, but the PhysX engine is also a draw. And just plain brand loyalty counts for a good portion of sales.

I had pretty high hopes for this new batch from nVidia, but so far they haven't produced anything capable of pulling me back from the ATi camp. The price/performance/power curves just can't compare.

Guest said:

CUDA, Nvidia 3D Vision

LOL

You can pair a low-end Nvidia card for CUDA with a high-end ATI card.

3D Vision is a fad; it was a fad when 3D TV came out and it still is.

slh28, TechSpot Paladin, said:

Nice review; it's always good to see updated results based on the latest drivers. I'm very surprised that the gap at the top has been closed; I thought those latest drivers from Nvidia would bring some significant performance boosts, but evidently not.

Right now it's not even a contest unless nvidia significantly reduce their prices. Here the GTX 470 is £50 more than the 5850 and the 480 is £100 more than the 5870.

Guest said:

The fact that the 470 costs more than the 5850 is outrageous.

ikenlob said:

I understand the CUDA/precision argument, and Nvidia should be given some kudos for their efforts there. Still, that is like Ford developing a revolutionary engine for fire and dump trucks, then selling it in Mustangs dialed in at 18,000 rpm.

I just can't see them winning in the large, core market of desktop application and gaming for this cycle.

I don't think 3D is a fad... ultimately. I do think anything requiring glasses, or at least 'active' glasses, is doomed. PhysX is going integrated, and I can't honestly tell what I am missing based on the samples I've seen. Either way, neither of these techs is a big selling point in this cycle of cards. Possibly next year or later, and ATI will probably respond in kind by then.

LinkedKube, TechSpot Project Baby, said:

There are the small 3% of people who upgrade like I do, buying the best of what's available at the time of purchase. When I buy my Rampage III Extreme I'll be picking up 480s if they are the best on the market at the time. I run a water-cooled setup with lots of headroom, so heat isn't an issue; I only care about performance. If ATI is on top when I upgrade this summer I'll be swinging over their way, otherwise it's Nvidia I go with.

Btw great revisit on the subject. There have been 2-3 threads about the 470 and 5850 in the last few weeks. Seems like those readers got what they asked for here.

Guest said:

If you're buying what's best at the moment, then that isn't a question between a 480 or 5870 - the best card still is the 5970, which can even be Crossfired!

Guest said:

The one arrow in Nvidia's quiver is PhysX. I would buy an Nvidia card, new, to do PhysX. But I won't, because some bright spark at Nvidia thought it would be a good idea to disable PhysX in the presence of an ATI card. Dumb, so so dumb!

TomSEA, TechSpot Chancellor, said:

Great review! As Vrmithrax mentioned, I would have liked to have seen some temp/power comparisons too, but we all know what the story is on that.

Was on an Egghead.com thread yesterday about this very ATI vs. nVidia subject, and all the nVidia fanbois could do was scream about tessellation and frame rate. Man, I wish I'd had this article to present to them.

I've been an nVidia customer forever it seems. I think my last ATI card was a Rage Pro. But I'm not stuck on brand loyalty and I do want to get the biggest bang for my buck. Unless something dramatically changes in the next six months, my next card will be an ATI 5870.

Guest said:

Actually, my reason for buying an ATI card right now is their support for Eyefinity. Running a three 24" monitor setup on a 5770 while doing software development is perfect. As for Fermi, my house A/C is not powerful enough to cool the house with this monster running, although I might buy one for the winter months...

Vrmithrax, TechSpot Paladin, said:

Guest said:

The one arrow in Nvidia's quiver is PhysX. I would buy an Nvidia card, new, to do PhysX. But I won't, because some bright spark at Nvidia thought it would be a good idea to disable PhysX in the presence of an ATI card. Dumb, so so dumb!

Actually, not dumb at all... Quite smart, if rather slimy. It's a backhanded way of ensuring brand loyalty and punishing those who don't stay on the green team.

I mean, with all of the negative hubbub that popped up with the Fermi launch, nVidia has to keep at least ONE marketing point that nobody can put into contention: if you want PhysX, you have to have a pure nVidia graphics platform.

Unless, of course, you take advantage of that little reported bug in some of the recent nVidia drivers that reportedly breaks the PhysX lockdown on mixed graphics systems........

IvanAwfulitch said:

New drivers didn't change anything? Well, ****. I could've told you that. I'm pretty sure a few other people did when the last FERMI article cropped up on the front page. This only serves to reinforce my previously held disappointments with Nvidia and their supposedly revolutionary new graphics chip. I'm definitely not paying between 350 and 500 of my dollars for a card that's practically no different from the current generation.

Nvidia, either you were BSing when you talked about FERMI, or you're hiding it. What the crap is the deal here?

Guest said:

I think people are forgetting that the performance gap gets larger in the GTX 480's favor when two are run in SLI compared to two 5870s run in Crossfire. Many people with $400-$500 cards have enough money and usually do buy a 2nd high-end card for a multi-GPU configuration. In this case, which is closer to reality, two 480s literally eat two 5870s for breakfast.

Dwell on that thought for awhile.

Vrmithrax, TechSpot Paladin, said:

Guest said:

I think people are forgetting that the performance gap gets larger in the GTX 480's favor when two are run in SLI compared to two 5870s run in Crossfire. Many people with $400-$500 cards have enough money and usually do buy a 2nd high-end card for a multi-GPU configuration. In this case, which is closer to reality, two 480s literally eat two 5870s for breakfast.

Dwell on that thought for awhile.

You are talking about a small minority who run these types of configurations... VERY small. If one of the big gripes is the price point of the 480, do you really think paying twice that will fly? Oh, and, of course, the cost of a beefier power supply has to be thrown in... And you might have to look at some serious cooling issues - but, of course, the little super-elite slice of the customer base you are looking at would no doubt already have water cooling and such, so not as much of an issue there...

For the average consumer/gamer (aka the vast majority of the marketplace), whether the 480s in SLi beat 5870s in Crossfire doesn't matter one iota. You might as well start arguing that the 5870s in Crossfire give you the Eyefinity benefit of 6 simultaneous screens, while the 480s only manage 3 or 4 (can't recall exactly)... The number of people who care won't even register on a scan of the consumer base.

Guest said:

You're that NV troll from AT! I know you :D

ikenlob said:

Guest said:

I think people are forgetting that the performance gap gets larger in the GTX 480's favor when two are run in SLI compared to two 5870s run in Crossfire. Many people with $400-$500 cards have enough money and usually do buy a 2nd high-end card for a multi-GPU configuration. In this case, which is closer to reality, two 480s literally eat two 5870s for breakfast.

Dwell on that thought for awhile.

I bought two 7800 GTXs in SLI a few years ago, so new I had to wait a couple of months for them; that cost me >$1000. More than a year later I let Steam survey my system and I was in the top 3% for GPU configuration. A year after that, there were dozens of single cards that outpaced my two. I'd guess a lot more people run SLI nowadays, but I am affiliated with a few clans, probably close to 150 gamers, and don't know one single person with two high-end cards in SLI. For 99% of all gamers, why spend >$1000 when you can play everything on the market for $150-$300?

Same argument for tessellation, 3D and such: great when it makes financial sense, or if you have nothing better to spend your money on. That is the point Nvidia is missing.

princeton said:

Well, this is the last time I trust a TechSpot review on GPUs. Just Cause 2: you guys have the GTX 470 getting smoked, yet every other site I've seen with a GTX 470 vs. HD 5850 Just Cause 2 benchmark shows the opposite. Here are some examples.

[link]

[link]

Look at this: on OC Club the stock GTX 470, although it has different fans, still runs at stock clocks and is beating a factory-overclocked 5850. How interesting.

Let's try another game.

Bad Company 2. Now, your review had it being beaten by the HD 5850, and by a noticeable bit as well.

[link]

[link]

[link]

Wow, look at that. The GTX 470 isn't being beaten at all. In fact it's winning on minimum frame rate in every case. Even on Hardware Canucks, where the HD 5850 wins on average, the GTX 470 has 3 more fps on the minimums.

I know everyone benchmarks differently, and I am NO Nvidia fanboy. In fact the GTX 260 216 was the first card I've ever bought, and I bought it after reading reviews, specifically ones on this very site. I was planning on getting an HD 5850, but after seeing the GTX 470 benchmarked as better than the HD 5850 on every site EXCEPT this very one, I wouldn't base an Nvidia vs. ATI purchase off this site for this generation of cards.

TomSEA, TechSpot Chancellor, said:

princeton - you missed the whole point of this particular comparison. TechSpot ran these comparisons using Fraps over a period of time in real game play. Not just the usual bank of frame-rate testing software that everyone else in your links did.

This was a better representation of real-life performance.

Kibaruk, TechSpot Paladin, said:

Would you please do one of these reviews taking into consideration heat and power, which is what really makes the difference between these cards?

Guest said:

How many electrical engineers do we have posting on here? How many of you non-engineers can conclusively say that the ninety-degree temps are "bad" while ATI's 80-degree temps are "good"? Nobody complains that a Honda engine's combustion temperature reaches 2300K, which is more than a 'typical' engine, because the actual figure doesn't matter as long as it doesn't exceed the temperature it was designed for.

Nvidia's cards were designed to operate at a particular temp, and who are you laymen to claim that the temp they've designed to is 'bad'? The cards have only been available for 3 months and you're claiming they won't last the year because they happen to run 10 degrees hotter than ATI's cards?

At best, that's piss-poor logic. At worst it's a bunch of people who know nothing about how a board is designed, built and tested talking out their ***.

Guest said:

...Because high temps cause all sorts of hardware failures? Comparing it to an engine is pretty stupid, since engines are built to withstand that kind of thing, whereas computers are more delicate when it comes to heat. It's why when I overclock my 4870 too high my computer crashes; I have a perfectly nice 650W power supply, so that ain't the problem. However, I don't have any case fans and need to get a new case full stop, which is why my temps are higher than others', not because of the card itself. And out of curiosity, where do you get these 80°C ATI temps from? From what I've seen it's much lower than that in the benchmarks.

Guest said:

I see that these were done with retail cards. In the first review, they appear to be generic. Could that make the difference we see here? There were some rumors floating around about a BIOS change between the launch and actual appearance on the retail market, purportedly to address the fan levels, but who knows what else may have been changed.

LNCPapa said:

Not only that princeton, but look at the driver versions and processors (one of them) used - they could explain some of the differences you see as well.

princeton said:

LNCPapa said:

Not only that princeton, but look at the driver versions and processors (one of them) used - they could explain some of the differences you see as well.

At 1920x1200 in most games the CPU won't have much of an impact. Now, when you're playing at 1920x1200 you are probably running an SLI/CF setup or a dual-GPU card, and dual GPUs mean the CPU needs to be better to feed info to both cards; but since these are single-card benchmarks I doubt it would have much impact. Regarding driver versions: I have a GTX 260 and I can say I gained a significant increase in fps in Crysis Warhead, around 4-5fps actually. My friend Michael runs a GTX 480 and he has also reported increased fps in Warhead and Metro 2033, to name a couple.

princeton said:

TomSEA said:

princeton - you missed the whole point of this particular comparison. TechSpot ran these comparisons using Fraps over a period of time in real game play. Not just the usual bank of frame-rate testing software that everyone else in your links did.

This was a better representation of real-life performance.

FRAPS we recorded a 1-minute passage: Crysis Warhead

Yeah, one minute is suuuure an accurate representation of actual gameplay. It doesn't even say what level it was; for all we know it could be any level in the entire game, and as we all know, Crysis and Warhead's levels tend to produce different levels of fps. You want a representation of real-life performance? Take a 10-minute run through Adapt or Perish with every card, then come say "representation of real-life performance."

TomSEA, TechSpot Chancellor, said:

Princeton - what part of "those other reviewers used the same generic testing software and didn't do a real-time performance test" are you not getting? Even an assessment of one minute of actual gameplay is better than some 3DMark or StoneGiant b.s. "who the hell knows how they got it" benchmarking figures.

dividebyzero, trainee n00b, said:

Nice review, as far as it went.

Interesting to see how the usual benchmark suite was culled down to seven games...

The games left out:

Batman:Arkham Asylum

CoD: MW2

Far Cry 2

HAWX

Company of Heroes: Opposing Fronts (this one getting a little long in the tooth anyhow)

World in Conflict: Soviet Assault

mmmmm....well I suppose certain tech (satire) sites will have one less target in their sights.

BTW: Nice to see the HD 5870's benchmark scores finally move on Resident Evil 5. The original HD 5870 review (23 Sept 2009) listed the fps as 108, 96, 70 (for 1680, 1920 and 2560 resolutions) using the release Catalyst 9.9 driver, while under the Catalyst 10.4 driver in the iChill GTX 480 review (26 Mar 2010) the benchmark figures were an identical 108, 96, 70. The Far Cry 2 benchmarks I noted were also identical.

princeton said:

TomSEA said:

Princeton - what part of those other reviewers using the same generic testing software and didn't do it using a real-time performance test are you not getting? Even an assessment for one minute of actual game play is better than some 3DMark or StoneGiant b.s. "who the hell knows how they got it" benchmarking figures

So you're implying that every GPU review ever made that has used a benchmark tool is useless? Well then, I guess we can put these on the list of useless review sites:

tomshardware

bit-tech

hardwarecanucks

bjorn3d

guru3d

overclockers club

overclockers.com

overclock.net

etc. I understand that the benchmarks here MAY be better representations of actual gameplay. The point still stands that the Nvidia cards are more powerful; if they weren't, they wouldn't get higher scores in benchmarks. I would also like to point out that people whining about the operating temperatures obviously don't understand that there is no universal "safe" component temperature. I know people who have run Core i7 CPUs at 100 degrees and they didn't crash or die, and a motherboard northbridge tends to get very hot. These Nvidia GPUs are designed to run at higher operating temperatures. I would also note that these people seem to think that after all these years of outstanding service and products, Nvidia would just decide to put out a product they know won't last very long.

grvalderrama said:

It was quite a mistake on the part of Nvidia's staff not to account for heat. Heat is the nightmare of all microdevice developers... I must say AMD did it really well.

Cheers!

Guest said:

Nvidia has had a long long time to create a product to beat the 5870 but what they brought out was about the same at a higher cost and poorer efficiency. Fail.

Guest said:

I bought my gtx 470 msi for $270 dollars new. Can't complain!

princeton said:

Guest said:

I bought my gtx 470 msi for $270 dollars new. Can't complain!

/Puts gun to your head. WHERE WHERE WHERE WHERE WHERE!!!!!!

Guest said:

@princeton - ohh no, TechSpot is not trusted by an idiotic Nvidia fanboy who throws links around to try and make his point. I have seen plenty of tests where ATI is faster in JC2; it comes down to the settings used, and TS used maximum quality settings. Google GTX 470 Just Cause 2 performance and on the first page there are a few reviews that use the same settings and find the same results.

Awesome review guys. Thanks for posting the truth!

vipervoid1 said:

Nice review...

and thx for the review too...

Well, I was planning to get myself a GTX 480 or HD 5870...

Now I've decided that I will get the HD 5870...

Staff
Julio Franco, TechSpot Editor, said:

Thank you everyone for your feedback. It's not uncommon for this kind of review to generate some controversy; as mentioned in the article, we were surprised by the results ourselves.

Rest assured we are not trying to convince or please anyone other than our loyal readers who want facts. We have described our methods, which in our opinion paint a closer picture of real gameplay performance for the games we tested.

Finally, for this article we focused on performance and not temperatures or power consumption, which were measured in our original review of the GeForce GTX 480. You can use those values as reference; we don't believe newer drivers have had any effect on them.

Guest said:

Nice review; I like those line plots better because they highlight places where the gaming is slow. Based on my personal experience, it is not how far above 100fps the graphics card can go that makes for an enjoyable gaming experience, but rather how well the card copes with the minimum fps.

Seems to me that the 5870 is virtually the same as the GTX 480, with the exception of Metro 2033. The GTX 470 is painfully slow in almost every benchmark compared to the 5850; that card, to me, is the worst of the four.

I know Nvidia has got CUDA and PhysX. However, the man responsible for CUDA has moved to AMD to promote the development of the AMD Fusion APU. I think it is clear that the tide is shifting. The way I see it, the tortoise (AMD) has overtaken the hare (Nvidia, who has been napping for too long).

princeton said:

Guest said:

@ princeton - ohh no TechSpot is not trusted by an idiotic Nvidia fan boy that throws links around to try and make his point. I have seen plenty of tests where ATI is faster in JC2. It comes down to the settings used and TS used maximum quality settings. Google GTX 470 Just Cause 2 performance and on the first page there are a few reviews that use the same settings and find the same results.

Awesome review guys. Thanks for posting the truth!

As I already outright proved, I am no Nvidia fanboy. In fact, I have only owned one gaming GPU in my life, so there is no possible way I could be biased. But nice try trying to make yourself look intelligent.

OneArmedScissor said:

So you're implying that every GPU review ever made that has used a benchmark tool is useless?

I can't say I've ever played a benchmark.

I can say I have seen a lot of extremely lazy reviews on so called "PC enthusiast" websites, and you named some of the most serious offenders.

And for the record, you can't replicate a 10 minute run of a game. I mean seriously, you believe benchmarks are worthwhile, but then you ask for that? Where is the logic?

red1776, Omnipotent Ruler of the Universe, said:

As I already outright proved, I am no Nvidia fanboy. In fact, I have only owned one gaming GPU in my life, so there is no possible way I could be biased.

This makes the top ten of most ridiculous statements ever made on TS.

And for the record, you can't replicate a 10 minute run of a game. I mean seriously, you believe benchmarks are worthwhile, but then you ask for that? Where is the logic?

+1

Why don't you just purchase your GTX 470 and be happy, instead of trying to convince everyone that it's superior? ...BTW, how is the benefits package over at Nvidia?

dividebyzero, trainee n00b, said:

Seems to me that the 5870 is virtually the same as the GTX 480, with the exception of Metro 2033. The GTX 470 is painfully slow in almost every benchmark compared to the 5850; that card, to me, is the worst of the four.

In this review yes.

But like any other review it's in the numbers...

If, for argument's sake, you included HAWX (which up until this review was part of TechSpot's benchmarking suite), you might see this

And if you included CoD: MW2, a fairly popular game I believe (which up until this review was part of TechSpot's benchmarking suite), you might see this

Likewise, for a full electric chair review, "Sims 3", a title that heavily favours AMD cards, could have been included at the expense of, say, "Metro 2033"

A fairly simple metric for ascertaining how "good" or "bad" these cards are would be to look at the Newegg reviews for the GTX 480/470, the HD 5850 and the HD 5870, as well as the incidence of "buyer's remorse" in the online auction arena.

I know Nvidia has got CUDA and PhysX. However, the man responsible for CUDA has moved to AMD to promote the development of the AMD Fusion APU. I think it is clear that the tide is shifting. The way I see it, the tortoise (AMD) has overtaken the hare (Nvidia, who has been napping for too long).

"The man responsible for CUDA".....oh dear! Don't your arms get tired waving the red flag?

The man in question is Manju Hegde. He was a vice president of PhysX and CUDA MARKETING. While he was the CEO of Ageia (admin, not tech) when nVidia bought the company, he certainly didn't scale the same heights at nVidia. The main man for CUDA is Ian Buck (PhD Stanford I believe).

Feel free to feel the excitement of AMD acquiring one of nVidia's PR guys.

Somehow I doubt that an APU sporting HD 5450-level graphics is going to seriously hinder enthusiast graphics card development for the foreseeable future, and considering ATI graphics are the only thing standing between AMD and a sea of red ink, maybe that's just as well.

You might also consider that part of AMD's appeal has been as the plucky underdog, something that's a little hard to maintain when you're heading for parity or better in market share. You might do well to note the noticeable groundswell of reversing opinions, in light of the many ill-informed bandwagon jumpers that permeate the public domain.

Oh, and before you get too carried away in the euphoria of Nvidia's impending demise, here's a tidbit regarding Nebulae, the world's second-biggest supercomputer. You'll note that in addition to its 9,280 Intel Xeon X5650 CPUs it also runs 4,640 nVidia C2050 Tesla cards (workstation GTX 470). If certain tech "journalists" are right with their 5,000-8,000 total GTX 4xx cards being produced, then nVidia might need to run off another batch fairly soon.

This makes the top ten of most ridiculous statements ever made on TS.

I'll see your ridiculous nVidia fanboy... and raise you a nonsensical AMD fanboy... watcha got? I got a full house: kneejerk reactionaries over tards!

red1776, Omnipotent Ruler of the Universe, said:

I'll see your ridiculous nVidia fanboy... and raise you a nonsensical AMD fanboy... watcha got? I got a full house: kneejerk reactionaries over tards!

Oh no, I got you beat, Chef! I read a post on another site that said the HD 5870 was actually 20% faster than the GTX 480, but they "faked the numbers" lest Nvidia cut them out of the loop... I think they were sending Charlie to beat them up. :p That's at least a flush.

I'm going to try to find it and link it. It was rather amusing.
