Nvidia pushes back Fermi GPU to March 2010?

December 28, 2009, 10:28 AM
After unveiling Fermi in September, Nvidia had planned to launch cards based on the new GPU architecture as early as November. That obviously didn't happen, and considering the present date it's also quite clear that we won't be seeing an answer to AMD's latest Radeon HD series this year. Presumably the launch was pushed back to coincide with CES in January, but now it seems the delay might be even longer than feared.

According to DigiTimes, sources at motherboard manufacturers have been informed by Nvidia that its first 40nm graphics chip will arrive in March as the GF100, with the GF104 also set to target the high-end graphics market in the second quarter of 2010. For the mainstream and performance segments, Nvidia will let its GTS 200 series and 9800/9500GT cards continue competing against AMD's DirectX 11 and older products.

If that wasn't enough of a problem, AMD is said to be preparing a few more releases targeted at the all-important mainstream market. Slated for late January or early February, the 40nm Radeon HD 5670 (Redwood) and HD 5450 (Cedar) will reportedly slot in alongside the unannounced Radeon HD 5570 and HD 5350 to round out the lower and middle portions of the company's Evergreen refresh.




User Comments: 42

TomSEA, TechSpot Chancellor, said:

Boy, Nvidia is really on the ropes, taking an incredible beating from AMD. After dominating for so many years, it's amazing to see them fall down so hard.

pcnthuziast said:

Falling down so hard you say... lol! Bit of an exaggeration IMHO.

red1776, Omnipotent Ruler of the Universe, said:

Falling down so hard you say... lol! Bit of an exaggeration IMHO.

How could it be an 'exaggeration'? To give your competition a six-month head start and let them run the table? ....That is the definition of 'falling down' in the business world.

TomSEA, TechSpot Chancellor, said:

This article suggests that Nvidia won't have their DX11 cards out for another six months. That will make it an ENTIRE YEAR since AMD came out with their off-the-charts performing DX11 cards that effectively blew every Nvidia card out there out of the water. Not to mention that when Nvidia does come out with their new cards, there's no guarantee they're going to be better than the AMD cards.

And you think "falling down hard" is an exaggeration? In the competitive world of video card manufacturing, I'd put it closer to suicide than "falling down hard." I was just trying to be polite.

Vrmithrax, TechSpot Paladin, said:

Is it just me, or is anyone else wondering if maybe nVidia threw all of their eggs into the FERMI basket, and are finding it's not quite up to the task? Seems to me that the reasons for the delays must have something to do with quality or performance of the units, because even if they could come close, nVidia would have pushed some product out the doors to try to stifle the runaway success ATi is managing right now (even with the shortages)... Just look at the recent craptacular nVidia releases of older architecture: they're barely in the ballpark with ATi's similarly priced stuff, yet nVidia threw them out there to try to maintain corporate visibility - and hope that some of the clueless "nVidia or bust!" diehard fanboys would buy the crap cards. The fact that nVidia is remaining completely silent and pushing FERMI back farther and farther just indicates (to me) that there are some serious issues under the hood. Which is really too bad, if true, because I had high hopes for FERMI hardware (based on the information I could find about the architecture). Hopefully they can pull it out and get some good products on the shelves.

TomSEA, TechSpot Chancellor, said:

I definitely agree with your FERMI assessment, Vrmithrax. There's something seriously funky going on, otherwise Nvidia wouldn't have allowed themselves to be put in this position. Those "new" releases of old architecture cards are really telling. Smacks of desperation.

Relic, TechSpot Chancellor, said:

Wow...how many people were waiting for FERMI early in the year to finally upgrade their system and are now being pushed back again. Wonder if some are just gonna jump ship and go with 5xxx upgrades.

Guest said:

It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth from the beginning of ATI as a graphics card company. Only in some benchmark applications can an AMD card be used. In any other real world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases when an nVidia card failed to perform to its maximum specifications. Their drivers are really good.

Guest said:

i know everybody had great hopes for fermi, but even though amd has a lead now, it won't amount to much and will only last till fermi is out

dividebyzero, trainee n00b, said:

Relic said:

Wow...how many people were waiting for FERMI early in the year to finally upgrade their system and are now being pushed back again. Wonder if some are just gonna jump ship and go with 5xxx upgrades.

Anyone who was/is interested in purchasing GF100/GTX380 etc. cards would be well aware that this isn't news. From [link] 5 or 6 days ago, I wrote:

"General consensus seems to be March 2010.

From what I understand the A2 silicon wasn't the HD5870 stomper nV expected and they have turned to the A3 revision, which won't be hard launched until March - maybe a couple of weeks earlier if they want it to coincide with the first mass-appeal DX11 game (CoP)."

This story was floated on the wires by DigiTimes to add a little grist to the mill because of AMD's announcement today that it's unveiling its HD5600/5500 and mobility cards at CES.

I'm sure when the benchmarks come out and the (overpriced) HD5670 trounces the 9800GT, the same knee-jerk, Pavlov's dog response will be observed once again from the supposedly "tech-savvy" community. News just in, Muppets-on-a-string: it gets more fps!

There must be a hell of a lot of people who are gaming at 2560x1600 or better and have deep pockets judging by the amount of concern over the release date of a high-end graphics architecture.

Hey red how's that conspiracy theory shaping up? Still think TSMC, nVIDIA, the Men in Black and Bigfoot are sitting around the conference table at Area 51 plotting the downfall of humanity?

"I get to see if my collusion theory is correct. if indeed the 40 nm 'shortage' is over, i bet that means that fermi will be released in Jan. (they had to ease it a few weeks before for plausible deniability)

hmmm...... "

Personally I blame Brett Favre for this whole mess...

Guest said:

but really man i think fermi will be great and easily able to beat AMD's latest cards by a big margin

i think because the technology is so advanced it's causing production issues for nvidia

Guest said:

I too agree with Vrmithrax. Nvidia probably has their Fermi GPU completed already, only to find out they could not meet the standards of ATI's 5xxx GPUs. There's a high chance the delay is Nvidia improving their product before releasing it on the projected dates, as it might be a laughing stock and a threat to their reputation. I personally am an Nvidia fan as I am not really fond of ATI's technology, even when they are top in terms of performance now. However, I could see Nvidia being able to risk this mainly due to their good reputation, and DirectX 11 isn't much of a worry now from where I stand. There aren't many games that require DirectX 11, and even game releases this year adopt DirectX 9 or older technology as they can reach more customers who simply don't have the budget to upgrade their old graphics card.

Badfinger said:

I got a 5850 2 weeks ago, $335 including tax, that's the end of video card buying for a long while now. 8)

Very satisfied!

dividebyzero, trainee n00b, said:

Guest said:

i know everybody had great hopes for fermi, but even though amd has a lead now, it won't amount to much and will only last till fermi is out

Correct.

The discrete consumer graphics card market is but one facet of both AMD's and nV's portfolios.

AMD can make a profit from the HD5000 series cards - even more so their AIBs, as the prices are (and will continue to be) extortionate. This will help offset the losses that AMD absorb in the CPU and chipset market at the hands of Intel.

NVIDIA take a bath trying (in vain) to combat the HD5970/5870/5850/5770 with existing cards, losing huge market share in the mainstream market to the upcoming HD5600/5500. This is offset by nVIDIA's dominance in the OEM market, workstation graphics cards (Quadro), and the expanding system-on-a-chip segment (Ion2, Tegra devices).

When Fermi eventuates, rest assured that it will dominate the market segment it is aimed at.

The main point of concern is whether the architecture can be replicated on a cut-down or hobbled chip to compete in the mainstream and budget card market. If not, then GF100 cards will be priced at a premium to recoup R&D expenses and simply because they are the best (a la present HD5970 pricing) in the performance market, while AMD could conceivably be left to dominate the mainstream and budget market - again with higher pricing because of a lack of competition.

Not so strangely, this strategy would benefit both AMD and NV. Nvidia works the top end - minimal units at high prices - seguing into GPGPU/Tesla, in which they have invested sizeable amounts of cash and R&D time. AMD takes the mainstream/budget segment (large production, no requirement for expensive ongoing R&D, reasonable return on architecture already in place). Nvidia carves up the SoC market at Intel's expense and shares the onboard video chipset market. The only contentious market is likely to be the growing full-size laptop market.

If this pans out, then as they say on tv....YOU are the biggest loser!

Guest said:

i hope this isn't true. i need to upgrade to a fermi - my 8600gts is gettin old, can't play crysis on max settings with aa on, and a lot of the newer games list a 7600 as the minimum to play. my card's had it. i just got a 1920x1200 monitor and was waiting for this card to come out in the next few weeks, but now it looks like i have to play every game at 1280x720 (which is good), but i want the full experience, you know. just hope this is not true. i've been looking forward to this card since about april, so it's just a kick-when-you're-down feeling hearing this. anyways happy holidays and happy 2010

Guest said:

Fermi has been shrunk from 512 to 448 cores, Intel cancelled Larrabee (it's now a development/test platform), and AMD/ATI is being very cautious about its "Fusion". Making an integrated CPU/GPU is harder than a lot of people thought when plans were announced, which makes me wonder if in the long run AMD/ATI (DAAMIT) may not end up having a better balance in their "Fusion" product. Graphics and graphics drivers have never been a strong point of Intel (especially with the "bloat" in everyone's favourite OS to hate/love, and going forward with GPU-accelerated features that will be used by developers even when unnecessary - don't quote me, but devs tend to write rather convoluted code).

ixm

Regenweald said:

I think Nvidia does not give that much of a damn about the ailing PC gaming industry anymore. Looking at the true power of the Fermi cards and how Nvidia is marketing them, there are billions to be made in medical research and true high-end processing applications, and in the Tegra SoC, which has won a few notable design contracts already. So why not hold on and perfect multi-thousand-dollar workstation GPUs rather than rush out 1 to 600 dollar cards so that people can get lots of fps to shoot ****. The scientific community is abuzz with the Fermi architecture; that is where Nvidia's money is. ATI doesn't really have a mature enough answer in this segment.

Edit: Fermi is a steaming pile of failed architecture. Let this be a lesson, children: make decisions based on hard facts and not marketing dept. releases.

dividebyzero, trainee n00b, said:

Regenweald said:

I think Nvidia does not give that much of a damn about the ailing PC gaming industry anymore. Looking at the true power of the Fermi cards and how Nvidia is marketing them, there are billions to be made in medical research and true high-end processing applications ...

Very true I fear.

Desktop PC sales might be stable, but it's not a growing market. Of the PCs sold, how many owners actually upgrade hardware unless they have to?

How many games now (or being released) are unplayable on the last generation of cards?

With the proliferation of console-ported games, would anyone see the need to buy a HD6970 or a 28nm HD5890? And you know they're coming.

Would it not make more financial sense to use the same architecture to target the number crunching crowd with Tesla and Quadro branded cards?

How many organisations are going to follow the University of Antwerp's Fastra II lead - six GTX295s and a GTX275 (for outputting visuals) outperforming a 512-core Opteron cluster? How many Xeon and Opteron servers and high-speed computing solutions are ripe for usurping? And all you have to do is optimise the BIOS and driver profile for a few of the cards, re-brand them as GeForce, and you instantly install the brand at (or near) the top of the heap for publicity purposes and for those that can afford them.

By early 2011 AMD will be banking on the Northern Islands HD5000 series 28nm refresh to combat the second generation Fermi and relying on little more than faster memory to keep the interest going. I sincerely hope they use their lead time and profits to invest in R&D, or come 2011 AMD are going to feel like they've just been given a proctology exam by King Kong.

compdata, TechSpot Paladin, said:

Guest said:

It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth from the beginning of ATI as a graphics card company. Only in some benchmark applications can an AMD card be used. In any other real world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases when an nVidia card failed to perform to its maximum specifications. Their drivers are really good.

What a bunch of garbage without anything backing up your "opinion". Probably a blogger paid by Nvidia ;-) Register and post under your own name if you dare.

BlindObject said:

Hmm, I'm hoping Fermi is this organic living tissue of a card that can take on Crysis at 120fps with no probs, lol.

dividebyzero, trainee n00b, said:

compdata said:

What a bunch of garbage without anything backing up your "opinion". Probably a blogger paid by Nvidia ;-) Register and post under your own name if you dare.

Whoa! Who twanged your bra strap? A massive oversimplification on "Guest's" part, but before you set fire to the heretic, there is a grain of reason there.

Nvidia drivers are usually fairly mature at product launch, and when issues arise they are usually followed fairly quickly by beta drivers to alleviate the problem. As the owner of an SLI rig I can attest that all is not a bed of roses in the driver department either.

Having said that, around five weeks ago nV released a beta build and followed it with a WHQL build the next day.

My main gripe when I had two HD4870 512MB cards in Crossfire (P45 chipset) was the once-a-month driver update - infuriating if a game is released just after a Catalyst release.

Regardless of driver releases, these cards scaled OK in some games but were utter sh*t in others - a friend who owns a HD4850X2 had to wait over 6 months to play Call of Juarez, while some games have never received a fix - and the microstuttering is abysmal.

Just before you play the that-was-then-this-is-now card, you might want to look into the HD5000 series (notably the 5850's downclocking problems and issues running multiple monitors, as well as the 8x MSAA problems with L4D, L4D2 and TF2) - this is after the 9.11 and 9.12 driver releases.

I like the "...if you dare" bit tho- very Errol Flynn!

kommunist said:

I still think AMD has Nvidia right where they want them. When Fermi does finally get released, and assuming that it is a total redesign of the architecture, then it's logical to conclude that it will be very expensive.

Even if Fermi does come out with a faster performing card, all AMD has to do is drop the prices on the 5 series cards and that could mean big trouble for Fermi.

In the end, consumers will vote with their wallets.

Guest said:

just like what dividebyzero said

these days most games require very low-end cards like the 9800 GT and so on

because it's getting limited by consoles

so i guess nvidia is targeting different markets other than PC

Regenweald said:

People. Nvidia was king of gaming architecture. They got while the getting was good and ruled the gaming industry, but no CEO is going to base a company's future on an industry whose entire paradigm is shooting things. It is also not the best business sense to have a department burning through R&D funds, scrambling to modify drivers to support....... games.

The revived console market is somewhat static in terms of updates and developers create games to work with the specific hardware and software in the console. That is one, mass produced GPU, period.

Scientific research, geo-seismic analysis etc., you know.......... serious business, where billions are involved, is a burgeoning market for GPU processing, and Nvidia is getting in at the foundation as they did with gaming, creating a new market all over again. As great as the new cards are, the 5000 series has sold what, a million units so far? That is scraps. The Tesla cards WILL be expensive, and if they deliver what is promised, universities, oil and gas enterprises and pharmaceutical manufacturers will snatch them up in droves.

Edit: They did *not* deliver anything close to what was promised. Thermi is a fail.

9Nails, TechSpot Paladin, said:

BlindObject said:

Hmm, I'm hoping Fermi is this organic living tissue of a card that can take on Crysis at 120fps with no probs, lol.

No LOL's from me. I mean seriously, I want 120 FPS in Crysis today. I've not been very impressed by Nvidia's cards since the 8800 series. I'm sure there are technical improvements, mostly with clock speed, but the technology hasn't kept pace in the latest models. Really, Crysis is the ultimate benchmark. The game looks fantastic, and if your card can run that game well, it can basically run any game well. With Crysis at 1680x1050 HQ settings, a 3-year-old 8800 GTS can do low 30s FPS. And a modern GTX 295 with its dual GPUs can get you mid 40s.

There's little reason for me to justify an upgrade any more, unless I look at AMD and want DX 11. Nvidia needs to nail this technology and come out with a winner and not just give us another clock speed update.

dividebyzero, trainee n00b, said:

Benchmark Fail

No LOL's from me....With Crysis at 1680x1050 HQ settings, a 3-year-old 8800 GTS can do low 30s FPS. And a modern GTX 295 with its dual GPUs can get you mid 40s.

I call BS.

As the (moderately) happy owner of a GTX 280 SLI rig, I'd say that the GTX 295 is in the same ballpark performance-wise, and I get 60+ fps at 1920 x 1080.

Probably comparable to [link] in fact....But maybe that's just because of a lack of CPU bottleneck, eh?

Guest said:

i agree with 9Nails. IMO video cards are not that much better than the last generation of cards. i own a 4890 and it's a great card, and i bought it for $250 (a while back), and the performance of the new ATI cards just doesn't impress me at all (especially for the price). looking at TechSpot's reviews (5870 & MW2) of the new cards, it just doesn't justify me spending $400+ (newegg & tigerdirect) on a card that doesn't give me that much more performance. and dx11? what about it? by the time games - decent titles - start fully using dx11, nvidia cards will be on the shelves, and if not, then a couple of months of waiting is not going to hurt. i understand upgrading if you have an old card, but with a fairly new one (like the last series) it's a complete waste to upgrade now! ATI makes me feel like they just rushed this card out to get a lead off of nvidia (can't blame them, it's business), so i have no problem waiting for nvidia to release their cards so i can compare, but i do believe that nvidia will come back and take the performance crown like they always have done before, even if the cost is more... but i want to see both companies' offerings before i blow hundreds of dollars on a video card, and that's if i even think it's worth upgrading at all!

Guest said:

i can't believe all you losers actually think AMD will beat nvidia just because it delayed its cards by 3 months. it has ruled for years and all of the stupid DirectX 11 can't do anything to nvidia's domination. in fact DirectX 11 reduces the performance of the game by 30% for minor visual enhancements which are difficult to notice

NVIDIA RULES

Guest said:

Yea, you said it, DirectX 11 reduces performance

¼ of a hotdog said:

This article suggests that Nvidia won't have their DX11 cards out for another six months. That will make it an ENTIRE YEAR since AMD came out with their off-the-charts performing DX11 cards that effectively blew every Nvidia card out there out of the water. Not to mention that when Nvidia does come out with their new cards, there's no guarantee they're going to be better than the AMD cards.

And you think "falling down hard" is an exaggeration? In the competitive world of video card manufacturing, I'd put it closer to suicide than "falling down hard." I was just trying to be polite.

I'm not going to argue because I think only time can tell but... Ummm, January, February, then March... how did you get six exactly? You do know this is December, right? I think you mean it would be six months since ATI released those cards... not a year, unless you mean the first demo card, but wouldn't that be about 9 or 10 months? I hope you're not a coder because your math is a little bit off. Well, you can always use a calculator - I recommend GraphCalc, it's a free graphing calculator.

Warcraft said:

Guest said:

It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth from the beginning of ATI as a graphics card company. Only in some benchmark applications can an AMD card be used. In any other real world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases when an nVidia card failed to perform to its maximum specifications. Their drivers are really good.

Obviously words from an Nvidia trolling fan-boy

Nvidia has made the least innovation this decade when it comes to GPUs.

The part about AMD cards being expensive garbage is nothing more than an irrelevant fabrication, for nvidia cards have always been the more expensive cards.

AMD cards are only expensive right now because of a silicon shortage.

Most of the major triumphs this decade were from ATI, then used by nvidia because what ATI was doing wasn't covered by a patent to stop them. Remember the Radeon 9700? It was the card that really innovated graphics earlier in this decade. If it wasn't for ATI's work creating the Radeon 9700, nothing would be like it is now. It was also ATI that created efficient AA and AF filtering, which to this day still holds true in comparison to nvidia - although I can agree that heavy usage of AA isn't needed at high resolutions.

Guest said:

I'm reading these comments and can't stop being shocked by the amount of (for lack of a better word) stupidity.

Nvidia and AMD are not your father's (or your own) companies. You DON'T own a huge percentage (if any) of their stock.

Now repeat this 10 times and start thinking like a normal human should think.

Don't be delusional and don't go to extremes like some comments I saw - Nvidia rocks (when at the moment ATI has the upper hand and only a blind and stupid person can't see that) or ATI is the best (when only a blind and stupid person doesn't know that glory is today, and tomorrow who knows).

For all those with a lack of gray matter, let me give you a quick guide to how "normal" people function.

Q) Do I need a product (graphics card, CPU, memory, a car, house, TV set . . .)?

A 1) If I don't, I DON'T CARE if ATI or Nvidia (or any other company) has the best and fastest or better-looking or . . . . product. I can and should stay informed about new developments, but for a normal person this is nothing more than "information" that will be updated once they decide to buy something.

A 2) If a normal person needs some product, they WILL buy the best their money can buy at the moment of buying. For a normal person it means nothing if "in a few months" something better could come, because there's ALWAYS something better just around the corner. Also, a normal person WILL NEVER buy an outdated product if it's not dirt cheap.

If nothing else, just think: who buys last year's car model for the price of this year's model?

Guest said:

DirectX 11 is nothing more than a term at the moment. Who cares if AMD has their short lead. nVidia will be back, and when they do come back it will again take AMD/ATI light-years to catch up

dividebyzero, trainee n00b, said:

Guest said:

DirectX 11 is nothing more than a term at the moment. Who cares if AMD has their short lead. nVidia will be back, and when they do come back it will again take AMD/ATI light-years to catch up

Hopefully that isn't the case.

With only one player in the gpu market you won't see competitive pricing again.

Much the same as is happening at the moment - AMD won't drop prices until they are either forced to by competition, or market demand slows and they have stock piling up around their ears - the latter not a likely scenario with certain tech media outlets whipping up a frenzy over the new must-have DirectX 11 (I'll reserve judgement until something a little better than DiRT 2 and BattleForge hits the shelves) and the supposed demise of nVidia.

Strangely enough I seem to recall this scenario being enacted a little while ago.....Radeon X1900 XT(XGTOXLXTXXXXLESEVIVO etc...) ruling the roost until November 2006 when a card with virtually no advance leaked benchmarks arrived....the 8800 GTX...how'd that work out again?

Guest said:

I'm going to say it... AMD sucks. Not just because I think so, but because a fighter who wins 1 fight out of 4 is a poor fighter. You want to talk about product-to-market delays? AMD does it so much it's considered the norm. Anyone who has seen a Rocky movie knows the hero can't win every fight or he becomes a tyrant. He has to lose one that appears to be a big deal and then come back for the win. I would look at this delay as Nvidia no longer considering AMD a threat, as the same thing happens EVERY TIME. Nvidia dominates for 2 years, AMD comes out with a card in the middle of the following year and is competitive for 6 - 8 months, then Nvidia trumps that card for another year, AMD makes something that's at least competitive in the mainstream market so the red fans can still play a few games... rinse, repeat. I mean really people, the whole AMD/ATI combo thing... really? 2 losers don't make a winner, it just gives them someone to hold and cry with after the fight is over.

Guest said:

I love the argument that Nvidia has better drivers than ATI, what a joke. Have a look here http://arstechnica.com/hardware/news/2008/03/vista-capable-lawsuit-paints-picture-of-buggy-nvidia-drivers.ars and then back up the comments about ATI drivers with more than random examples about a newly released card. Every new card has growing pains, even Nvidia ones; to say otherwise is misleading at best.

Until Nvidia releases a new card that is decent, I recommend ATI as the best choice, and any knowledgeable people are doing the same.

Guest said:

Ur Wrong. At the time of "Fermi" based 380 GTX release may be June or July.That time i think AMD release her new baby HD 6870.Then it was trash nVidia "Fermi". NVidia fall move.

dividebyzero, trainee n00b, said:

Ur Wrong. At the time of "Fermi" based 380 GTX release may be June or July.That time i think AMD release her new baby HD 6870.Then it was trash nVidia "Fermi". NVidia fall move.

QUITE SO, Mr Guestbot!

I also have it on good authority that six minutes after the HD 6870 debuts, nV will release Fermi 2: The Wrath of Huang. ATI will counter this two days later with the HD 99999970 XTX GTO VIVO XL PE LE Extra Special Super Overclock Edition made from tech transported back from the future by the Large Hadron Collider...2 minutes after that it will become SELF AWARE!!!! And then refuse to play Crysis at 2560 x 1600 at Enthusiast settings as a sign of solidarity with its less evolved brethren.

Guest said:

Ok here we go. Nvidia has been on top for a long time, and now the ATI/AMD fanboy crew is eating it up that the boys from ATI are on top for several months. What about the last 10 years or so that nvidia has spanked ATI? The delays are simple: the 5800s were an easy launch for ATI, since most of the technology is the same as the 4800 series just boosted up. Nvidia has reworked everything from the ground up on Fermi, and whenever you do that, delays are a huge concern. However, I distinctly remember the last time Nvidia took a big step in changing the way their GPUs worked - they were called the 8800s and were twice as good as anything ATI could muster. Don't dog Nvidia for taking the time to get it right, because when it does launch it will eat anything ATI has alive. Anyone who knows anything about how GPUs work and has seen the specs for Fermi will agree that the 5800s will eat dust. It takes time to start from the ground up and get it perfect; my props to nvidia for not launching these products until they are trouble-free and ready.

Guest said:

What a pity!

ATi/AMD delivered the DirectX 11 experience to its fans months ago. nVidia couldn't make it. It's real, and it's a tremendous defeat! That's what happened, that's the truth.

It's called "Cost of Opportunity". There's no price to be the first to experience a ATi Radeon HD 5870 in its all glory, a single card crushing nVidia's dual card GTX 295. And we're talking about a heavy title such as Crytek Crysis @ 2560-1600.

According to Tom's Hardware, the nVidia GTX 295 simply didn't work at that resolution. Pity again! Please, see for yourself.

http://www.tomshardware.com/reviews/radeon-hd-5870,2422-13.html?xtmc=crysis_2560-1600_gtx_295_buffer_memory&xtcr=2

One of the paragraphs from this review says:

"Notice the missing result for Nvidia's GeForce GTX 295 at 2560-1600 with 8xAA? That's due to the card not having ample on-board memory to run that configuration (and the game not knowing any better than to keep it from trying). Grand Theft Auto gets around this by simply making resolutions unavailable if a graphics card doesn't have a large enough frame buffer. Crysis crashes instead."

The GTX 295 not being able to run Crysis @ 2560x1600? Pity!

Fermi has got to be better and faster than Cypress. It's an obligation for nVidia to build it that way, since they had, at least, more time to conceive it.

And, as always, don't be fooled: you're going to hurt your pocket to have Fermi installed in your RIG. Be prepared to pay the price. It happened with Cypress. It's going to be the same with Fermi. And since nVidia cards are always much more expensive than ATi/AMD's, one Fermi card could reach as much as 750 bucks. Wait and see.

Take this weekend and go to your favorite retail store and grab your ATi Radeon HD 5870. It's there, real. Just take it.

Fermi, humpf...maybe 3Q2010 you'll get one. It's just an illusion...a dream (that hasn't come true...hehehehe...)

Cheers!

Guest said:

Where the hell do you get the info that ATI cannot run certain benchmarking programs and that they are expensive garbage? Just for interest's sake, I have used Nvidia and ATI products for a long time, since the first releases (most probably when you were still an ugly little baby). I have noticed nothing bad about either company; both had good and bad points. ATI cards have much better bang for buck, always has been the case. Nvidia always had the performance crown, but at a price (also because they tend to pay companies more to make TWIMTBP games). Secondly, this round I went with ATI (sold my 260 and bought a 5870). Why? Because ATI is a much more stable company at the moment, since there are so many rumors about Nvidia and financial trouble. The last time I heard rumors like this was with Voodoo, and I lost support and warranty on my cards. (I only upgrade about every 2 years and don't want my card to become obsolete.) Also, ATI has all the features I need to play the latest games at the resolution I need (1920x1080) without trouble, with DX11 etc. Nvidia's new range is going to be too expensive and is going to use too much power. Currently, I don't want to pay a huge electricity bill just to play games.

CMH, TechSpot Chancellor, said:

Take this weekend and go to your favorite retail store and grab your ATi Radeon HD 5870. It's there, real. Just take it.

I hope you're not suggesting shoplifting....

