Nvidia pushes back Fermi GPU to March 2010?


Jos

Staff

After unveiling Fermi in September, Nvidia had scheduled to launch cards based on the new GPU architecture as early as November. That obviously didn't happen, and considering the present date it's also quite clear that we won't be seeing an answer to AMD's latest Radeon HD series this year. Presumably the launch date was pushed back to coincide with CES in January, but now it seems the delay might be even longer than feared.

According to DigiTimes, sources within motherboard manufacturers have been informed by Nvidia that its first DirectX 11 graphics chip, the 40nm GF100, will arrive in March, with the GF104 also set to target the high-end graphics market in the second quarter of 2010. For the mainstream and performance-level segments, Nvidia will let its GTS 200 series and 9800/9500 GT cards continue competing against AMD's DirectX 11 and older products.

If that wasn't enough of a problem, AMD is said to be preparing a few more releases targeted at the all-important mainstream market. Slated for late January or early February, the 40nm Radeon HD 5670 (Redwood) and HD 5450 (Cedar) will reportedly slot in alongside the unannounced Radeon HD 5570 and HD 5350 to round out the lower and middle portions of the company's Evergreen lineup.


 
Boy, Nvidia is really on the ropes, taking an incredible beating from AMD. After dominating for so many years, it's amazing to see them fall so hard.
 
Falling down so hard you say... lol! Bit of an exaggeration IMHO.

how could it be an 'exaggeration'? to give your competition a six-month head start? and let them run the table? ....that is the definition of 'falling down' in the business world.
 
This article suggests that Nvidia won't have their DX11 cards out for another six months. That will make it an ENTIRE YEAR since AMD came out with their off-the-charts-performing DX11 cards that effectively blew every Nvidia card out there out of the water. Not to mention that when Nvidia does come out with their new cards, there's no guarantee they're going to be better than the AMD cards.

And you think "falling down hard" is an exaggeration? In the competitive world of video card manufacturing, I'd put it closer to suicide than "falling down hard." I was just trying to be polite.
 
Is it just me, or is anyone else wondering if maybe nVidia threw all of their eggs into the FERMI basket, and are finding it's not quite up to the task? Seems to me that the reasons for the delays must have something to do with the quality or performance of the units, because if they could even come close, nVidia would have pushed some product out the door to try to stifle the runaway success ATi is managing right now (even with the shortages).

Just look at the recent craptacular nVidia releases of older architecture: they're barely in the ballpark with ATi's similarly priced stuff, yet nVidia threw them out there to try to maintain corporate visibility, hoping that some of the clueless "nVidia or bust!" diehard fanboys would buy the crap cards. The fact that nVidia is remaining completely silent and pushing FERMI back farther and farther just indicates (to me) that there are some serious issues under the hood. Which is really too bad, if true, because I had high hopes for FERMI hardware (based on the information I could find about the architecture). Hopefully they can pull it out and get some good products on the shelves.
 
I definitely agree with your FERMI assessment, Vrmithrax. There's something seriously funky going on, otherwise Nvidia wouldn't have allowed themselves to be put in this position. Those "new" releases of old-architecture cards are really telling. Smacks of desperation.
 
Wow... how many people were waiting for FERMI early in the year to finally upgrade their systems and have now been pushed back again. Wonder if some are just gonna jump ship and go with 5xxx upgrades.
 
It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth since the beginning of ATI as a graphics card company. An AMD card is only usable in some benchmark applications. In any other real-world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases where an nVidia card failed to perform to its maximum specifications. Their drivers are really good.
 
i know everybody had great hopes for fermi, but even though amd has a lead now, it won't amount to much, and only till fermi is out
 
Relic said:
Wow... how many people were waiting for FERMI early in the year to finally upgrade their systems and have now been pushed back again. Wonder if some are just gonna jump ship and go with 5xxx upgrades.

Anyone who was/is interested in purchasing GF100/GTX380 etc. cards would be well aware that this isn't news. Five or six days ago, in https://www.techspot.com/news/37419-amd-40nm-yields-have-stabilized-supplies-improving.html, I wrote:

"General consensus seems to be March 2010.
From what I understand the A2 silicon wasn't the HD5870 stomper nV expected and they have turned to the A3 revision which won't be hard launched until March- maybe a couple of weeks earlier if they want to coincide it with the first mass appeal DX11 game (CoP)."

This story was floated on the wires by DigiTimes to add a little grist to the mill because of AMD's announcement today that it's unveiling its HD5600/5500 and mobility cards at CES.
I'm sure when the benchmarks come out and the (overpriced) HD5670 trounces the 9800GT, the same knee-jerk, Pavlov's-dog response will be observed once again from the supposed "tech-savvy" community. News just in, Muppets-on-a-string: it gets more fps!

There must be a hell of a lot of people gaming at 2560x1600 or better with deep pockets, judging by the amount of concern over the release date of a high-end graphics architecture.
Hey red how's that conspiracy theory shaping up? Still think TSMC, nVIDIA, the Men in Black and Bigfoot are sitting around the conference table at Area 51 plotting the downfall of humanity?

"I get to see if my collusion theory is correct. if indeed the 40 nm 'shortage' is over, i bet that means that fermi will be released in Jan. (they had to ease it a few weeks before for plausible deniability)
hmmm...... "

Personally I blame Brett Favre for this whole mess...
 
but really man, i think fermi will be great and easily able to beat AMD's latest cards by a big margin.
i think because the technology is so advanced, it's causing production issues for nvidia
 
I too agree with Vrmithrax. Nvidia probably has their Fermi GPU completed already, only to find out it could not meet the standard of ATI's 5xxx GPUs. There's a high chance the delay is Nvidia improving the product before releasing it on the projected dates, as otherwise it might be a laughing stock and a threat to their reputation. I personally am an Nvidia fan, as I am not really fond of ATI's technology, even though they are top in terms of performance now. However, I can see Nvidia being able to risk this, mainly due to their good reputation, and DirectX 11 isn't much of a worry from where I stand. There aren't many games that require DirectX 11, and even games released this year adopt DirectX 9 or older technology, as they can reach more customers who simply don't have the budget to upgrade their old graphics cards.
 
I got a 5850 two weeks ago, $335 including tax; that's the end of video card buying for a long while now. 8)
Very satisfied!
 
Guest said:
i know everybody had great hopes for fermi, but even though amd has a lead now, it won't amount to much, and only till fermi is out

Correct.
The discrete consumer graphics card market is but one facet of both AMD's and nV's portfolios.
AMD can make a profit from the HD5000 series cards (even more so for their AIBs), as the prices are, and will continue to be, extortionate. This will help offset the losses that AMD absorbs in the CPU and chipset market at the hands of Intel.
Nvidia takes a bath trying (in vain) to combat the HD5970/5870/5850/5770 with existing cards, and is losing huge market share in the mainstream to the upcoming HD5600/5500. This is offset by Nvidia's dominance in the OEM market, workstation graphics cards (Quadro), and the expanding system-on-a-chip business (Ion 2, Tegra devices).

When Fermi eventuates, rest assured that it will dominate the market segment it is aimed at.
The main point of concern is whether the architecture can be replicated in a cut-down or hobbled chip to compete in the mainstream and budget card markets. If not, then GF100 cards will be priced at a premium, both to recoup R&D expenses and simply because they are the best (a la present HD5970 pricing) in the performance market, while AMD could conceivably be left to dominate the mainstream and budget markets, again with higher pricing because of the lack of competition.
Not so strangely, this strategy would benefit both AMD and Nvidia: Nvidia working the top end (minimal units at a high price) and seguing into GPGPU/Tesla, in which they have invested sizeable amounts of cash and R&D time; AMD taking the mainstream/budget segment (large production runs, no need for expensive ongoing R&D, reasonable return on architecture already in place); Nvidia carving up the SoC market at Intel's expense and sharing the onboard video chipset market. The only contentious territory is likely to be the growing full-size laptop market.
If this pans out, then as they say on TV... YOU are the biggest loser!
 
i hope this isn't true. i need to upgrade to a fermi: my 8600gts is getting old, it can't play crysis on max settings with aa on, and for a lot of the newer games the minimum is a 7600. my card's had it. i just got a 1920x1200 monitor and was waiting for this card to come out in the next few weeks, but now it looks like i have to play every game at 1280x720 (which is fine), but i want the full experience, you know? just hope this isn't true. i've been looking forward to this card since about april, so it's a kick-when-you're-down feeling hearing this. anyways, happy holidays and happy 2010
 
Fermi has been shrunk from 512 to 448 cores, Intel cancelled Larrabee (it's now a development/test platform), and AMD/ATI is being very cautious about its "Fusion". Making an integrated CPU/GPU is harder than a lot of people thought when the plans were announced, which makes me wonder if, in the long run, AMD/ATI (DAAMIT) may not end up having the better balance in their Fusion product. Graphics and graphics drivers have never been a strong point of Intel's, especially with the "bloat" in everyone's favourite OS to hate/love, and going forward with GPU-accelerated features that will be used by developers even when unnecessary (don't quote me, but devs tend to write rather convoluted code).

ixm
 
I think Nvidia does not give that much of a damn about the ailing PC gaming industry anymore. Looking at the true power of the Fermi cards and how Nvidia is marketing them, there are billions to be made in medical research and true high-end processing applications, and in the Tegra SoC, which has won a few notable design contracts already. So why not hold on and perfect multi-thousand-dollar workstation GPUs rather than rush out 1-to-600 dollar cards so that people can get lots of fps to shoot ****? The scientific community is abuzz over the Fermi architecture; that is where Nvidia's money is. ATI doesn't really have a mature enough answer in this segment.

Edit: Fermi is a steaming pile of failed architecture. let this be a lesson, children: make decisions based on hard facts and not marketing dept. releases :)
 
Regenweald said:
I think Nvidia does not give that much of a damn about the ailing PC gaming industry anymore. Looking at the true power of the Fermi cards and how Nvidia is marketing them, there are billions to be made in medical research and true high-end processing applications ...

Very true, I fear.
Desktop PC sales might be stable, but it's not a growing market. Of the PCs sold, how many owners actually upgrade hardware unless they have to?
How many games now (or being released) are unplayable on the last generation of cards?
With the proliferation of console-ported games, would anyone see the need to buy an HD6970 or a 28nm HD5890? And you know they're coming.
Would it not make more financial sense to use the same architecture to target the number-crunching crowd with Tesla and Quadro branded cards?
How many organisations are going to follow the University of Antwerp's Fastra II lead: six GTX295s and a GTX275 (for outputting visuals) outperforming a 512-core Opteron cluster? How many Xeon and Opteron servers and high-speed computing solutions are ripe for usurping? And all you have to do is optimise the BIOS and driver profile for a few of the cards, re-brand them as GeForce, and you instantly install the brand at (or near) the top of the heap for publicity purposes and for those that can afford them.
By early 2011 AMD will be banking on the Northern Islands 28nm refresh of the HD5000 series to combat second-generation Fermi, relying on little more than faster memory to keep the interest going. I sincerely hope they use their lead time and profits to invest in R&D, or come 2011 AMD are going to feel like they've just been given a proctology exam by King Kong.
 
Guest said:
It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth since the beginning of ATI as a graphics card company. An AMD card is only usable in some benchmark applications. In any other real-world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases where an nVidia card failed to perform to its maximum specifications. Their drivers are really good.

What a bunch of garbage without anything backing up your "opinion". Probably a blogger paid by Nvidia ;-) Register and post under your own name if you dare.
 
compdata said:
What a bunch of garbage without anything backing up your "opinion". Probably a blogger paid by Nvidia ;-) Register and post under your own name if you dare.

Whoa! Who twanged your bra strap? A massive oversimplification on "Guest's" part. But before you set fire to the heretic, there is a grain of reason there.
Nvidia drivers are usually fairly mature at product launch, and when issues arise they are usually followed fairly quickly by beta drivers to alleviate the problem. As the owner of an SLI rig I can attest that all is not a bed of roses in the driver department.
Having said that, around five weeks ago nV released a beta build and followed it with a WHQL build the next day.
My main gripe when I had two HD4870 512MB cards in Crossfire (P45 chipset) was the once-a-month driver update, infuriating if a game is released just after a Catalyst release.
Regardless of driver releases, these cards scaled OK in some games but were utter sh*t in others: a friend who owns an HD4850X2 had to wait over six months to play Call of Juarez, while some games have never received a fix, and the microstuttering is abysmal.
Before you play the that-was-then-this-is-now card, you might want to look into the HD5000 series (notably the 5850, with downclocking problems when running multiple monitors, as well as the 8x MSAA problems in L4D, L4D2 and TF2), and this is after the 9.11 and 9.12 driver releases.
I like the "...if you dare" bit tho, very Errol Flynn!
 
I still think AMD has Nvidia right where they want them. When Fermi does finally get released, and assuming that it is a total redesign of the architecture, it's logical to conclude that it will be very expensive.

Even if Fermi does come out as a faster-performing card, all AMD has to do is drop prices on the 5-series cards, and that could mean big trouble for Fermi.

In the end, consumers will vote with their wallets.
 
just like what dividebyzero said:
these days most games require only very low-end cards like the 9800 GT
because they're being limited by consoles
so i guess nvidia is targeting markets other than the PC
 
People. Nvidia was king of gaming architecture. They got in while the getting was good and ruled the gaming industry, but no CEO is going to base a company's future on an industry whose entire paradigm is shooting things. It is also not the best business sense to have a department burning through R&D funds, scrambling to modify drivers to support....... games.
The revived console market is somewhat static in terms of updates, and developers create games to work with the specific hardware and software in the console. That is one mass-produced GPU, period.

Scientific research, geo-seismic analysis, etc. (you know.......... serious business, where billions are involved) is a burgeoning market for GPU processing, and Nvidia is getting in at the foundation, as they did with gaming, creating a new market all over again. As great as the new cards are, the 5000 series has sold what, a million units so far? That is scraps. The Tesla cards WILL be expensive, and if they deliver what is promised, universities, oil and gas enterprises, and pharmaceutical manufacturers will snatch them up in droves.
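For a concrete sense of what "GPU processing" means here, below is a minimal CUDA sketch of the kind of data-parallel workload these cards are sold into: thousands of threads, each handling one element of a big array. It's purely illustrative (a generic vector addition using the standard CUDA runtime API; the names are made up for the example), not any particular research code.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread computes one output element.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Scale that same pattern up to huge datasets, and add the improved double-precision support Fermi is touting, and that's essentially the Tesla pitch to those industries.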

Edit: They did *not* deliver anything close to what was promised. Thermi is a fail.
 