Nvidia pushes back Fermi GPU to March 2010?

BlindObject said:
Hmm, I'm hoping Fermi is this organic living tissue of a card that can take on Crysis at 120fps with no probs, lol.

No LOLs from me. I mean it seriously: I want 120 FPS in Crysis today. I haven't been very impressed by Nvidia's cards since the 8800 series. I'm sure there have been technical improvements, mostly in clock speed, but the technology hasn't kept pace in the latest models. Really, Crysis is the ultimate benchmark. The game looks fantastic, and if your card can run it well, it can basically run any game well. At 1680x1050 with HQ settings, a three-year-old 8800 GTS can do low-30s FPS in Crysis, and a modern GTX 295 with its dual GPUs gets you mid-40s.

There's little reason for me to justify an upgrade anymore, unless I look at AMD and want DX11. Nvidia needs to nail this technology and come out with a winner, not just give us another clock-speed update.
 
Benchmark Fail

No LOLs from me... At 1680x1050 with HQ settings, a three-year-old 8800 GTS can do low-30s FPS in Crysis, and a modern GTX 295 with its dual GPUs gets you mid-40s.

I call BS.
As the (moderately) happy owner of a GTX 280 SLI rig, I'd say the GTX 295 is in the same ballpark performance-wise, and I get 60+ fps at 1920 x 1080.
Probably comparable to http://www.bit-tech.net/hardware/graphics/2009/11/18/amd-ati-radeon-hd-5970-review/9 in fact... but maybe that's just because of a lack of a CPU bottleneck, eh?
 
I agree with 9Nails. IMO, video cards are not that much better than the last generation. I own a 4890 and it's a great card; I bought it for $250 (a while back), and the performance of the new ATI cards just doesn't impress me at all, especially for the price. Looking at TechSpot's reviews of the new cards (5870 & MW2), it just doesn't justify me spending $400+ (Newegg & TigerDirect) on a card that doesn't give me that much more performance.

And DX11? What about it? By the time decent titles start fully using DX11, Nvidia's cards will be on the shelves, and if not, a couple of months of waiting is not going to hurt. I understand upgrading if you have an old card, but from a fairly new one (like the last series) it's a complete waste to upgrade now. ATI makes me feel like they just rushed this card out to get a lead on Nvidia (can't blame them, it's business), so I have no problem waiting for Nvidia to release their cards so I can compare. I do believe Nvidia will come back and take the performance crown like they always have before, even if the cost is more. But I want to see both companies' offerings before I blow hundreds of dollars on a video card, and that's if I think it's worth upgrading at all!
 
I can't believe all you losers actually think AMD will beat Nvidia just because it delayed its cards by three months. It has ruled for years, and all of the stupid DirectX 11 can't do anything to Nvidia's domination. In fact, DirectX 11 reduces game performance by 30% for minor visual enhancements that are difficult to notice.

NVIDIA RULES
 
This article suggests that Nvidia won't have their DX11 cards out for another six months. That will make it an ENTIRE YEAR since AMD came out with their off-the-charts-performing DX11 cards that effectively blew every Nvidia card out there out of the water. Not to mention that when Nvidia does come out with their new cards, there's no guarantee they're going to be better than the AMD cards.

And you think "falling down hard" is an exaggeration? In the competitive world of video card manufacturing, I'd put it closer to suicide than "falling down hard." I was just trying to be polite.

I'm not going to argue, because I think only time can tell, but... umm, January, February, then March... how did you get six exactly? You do know this is December, right? I think you mean it would be six months since ATI released those cards... not a year, unless you mean the first demo card, but wouldn't that be about 9 or 10 months? I hope you're not a coder, because your math is a little bit off. Well, you can always use a calculator; I recommend GraphCalc, it's a free graphing calculator.
 
Guest said:
It is true, AMD has these great DX11 cards on the market. Nevertheless, AMD cards are nothing but expensive garbage, and this has been the truth since the beginning of ATI as a graphics card company. Only in some benchmark applications can an AMD card be used. In any other real-world application the ATI/AMD driver simply doesn't work. By contrast, there are only very isolated cases where an nVidia card failed to perform to its maximum specifications. Their drivers are really good.

Obviously words from an Nvidia trolling fanboy.

Nvidia has made the least innovation this decade when it comes to GPUs.
The part about AMD cards being expensive garbage is nothing more than an irrelevant fabrication, for Nvidia cards have always been the more expensive ones.
AMD cards are only expensive right now because of a silicon shortage.
Most of the major triumphs this decade came from ATI and were then used by Nvidia, because what ATI was doing wasn't protected by patents that could stop them. Remember the Radeon 9700? It was the card that really pushed graphics forward earlier in this decade. If it weren't for ATI's work on the Radeon 9700, don't expect things to be like they are now. It was also ATI that created efficient AA and AF filtering, which to this day still holds in comparison to Nvidia, although I can agree that heavy use of AA isn't needed at high resolutions.
 
I'm reading those comments and can't stop being shocked by the amount of (for lack of a better word) stupidity.

Nvidia and AMD are not your father's (or your own) companies. You DON'T own a huge percentage (if any) of their stock.

Now repeat this 10 times and start thinking like a normal human should think.

Don't be delusional and don't go to extremes like some comments I saw, whether "Nvidia rocks" (when at the moment ATI has the upper hand and only a blind and stupid person can't see that) or "ATI is the best" (when only a blind and stupid person doesn't know that the glory is today, and tomorrow, who knows).

For all those lacking gray matter, let me give you a quick guide to how "normal" people function.

Q) Do I need a product (graphics card, CPU, memory, a car, house, TV set...)?

A 1) If I don't, I DON'T CARE whether ATI or Nvidia (or any other company) has the best, fastest, or better-looking product. I can and should stay informed about new developments, but for a normal person this is nothing more than "information" that will be updated once they decide to buy something.

A 2) If a normal person needs some product, they WILL buy the best their money can buy at the moment of buying. For a normal person it means nothing that "in a few months" something better could come, because there's ALWAYS something better just around the corner. Also, a normal person will NEVER buy an outdated product unless it's dirt cheap.

If nothing else, just think: who buys last year's car model for the price of this year's new model?
 
DirectX 11 is nothing more than a term at the moment. Who cares if AMD has its short lead? nVidia will be back, and when they do come back it will again take AMD/ATI light-years to catch up.
 
Guest said:
DirectX 11 is nothing more than a term at the moment. Who cares if AMD has its short lead? nVidia will be back, and when they do come back it will again take AMD/ATI light-years to catch up.

Hopefully that isn't the case.

With only one player in the GPU market you won't see competitive pricing again.
Much the same is happening at the moment: AMD won't drop prices until they are either forced to by competition, or market demand slows and they have stock piling up around their ears. The latter isn't a likely scenario with certain tech media outlets whipping up a frenzy over the new must-have DirectX 11 (I'll reserve judgement until something a little better than DiRT 2 and BattleForge hits the shelves) and the supposed demise of nVidia.

Strangely enough, I seem to recall this scenario being enacted a little while ago... Radeon X1900 XT (XT/GTO/XL/XTX/XXX/LE/SE/VIVO, etc.) ruling the roost until November 2006, when a card with virtually no advance leaked benchmarks arrived... the 8800 GTX. How'd that work out again?
 
I'm going to say it... AMD sucks. Not just because I think so, but because a fighter that wins one fight out of four is a poor fighter. You want to talk about product-to-market delays? AMD does it so much it's considered the norm. Anyone who has seen a Rocky movie knows the hero can't win every fight or he becomes a tyrant. He has to lose one that appears to be a big deal and then come back for the win. I would look at this delay as Nvidia no longer considering AMD a threat, since the same thing happens EVERY TIME: Nvidia dominates for two years, AMD comes out with a card in the middle of the following year and is competitive for 6-8 months, then Nvidia trumps that card for another year, and AMD makes something that's at least competitive in the mainstream market so the red fans can still play a few games... rinse, repeat. I mean really, people, the whole AMD/ATI combo thing... really? Two losers don't make a winner; it just gives them someone to hold and cry with after the fight is over.
 
I love the argument that Nvidia has better drivers than ATI. What a joke. Have a look here: http://arstechnica.com/hardware/news/2008/03/vista-capable-lawsuit-paints-picture-of-buggy-nvidia-drivers.ars and then back up the comments about ATI drivers with more than random examples from a newly released card. Every new card has growing pains, even Nvidia ones; to say otherwise is misleading at best.

Until Nvidia releases a decent new card, I recommend ATI as the best choice, and knowledgeable people are doing the same.
 
You're wrong. The "Fermi"-based GTX 380 may release in June or July. Around that time I think AMD will release its new baby, the HD 6870, and then it will trash nVidia's "Fermi". A false move by nVidia.
 
Guest said:
You're wrong. The "Fermi"-based GTX 380 may release in June or July. Around that time I think AMD will release its new baby, the HD 6870, and then it will trash nVidia's "Fermi". A false move by nVidia.

QUITE SO, Mr Guestbot!
I also have it on good authority that six minutes after the HD 6870 debuts, nV will release Fermi 2: The Wrath of Huang. ATI will counter this two days later with the HD 99999970 XTX GTO VIVO XL PE LE Extra Special Super Overclock Edition, made from tech transported back from the future by the Large Hadron Collider... 2 minutes after that it will become SELF AWARE!!!! And then refuse to play Crysis at 2560 x 1600 at Enthusiast settings as a sign of solidarity with its less evolved brethren.
 
OK, here we go. Nvidia has been on top for a long time, and now the ATI/AMD fanboy crew is eating it up that the boys from ATI are on top for several months. What about the last 10 years or so that Nvidia has spanked ATI? The delays are simple: the 5800s were an easy launch for ATI since most of the technology is the same as the 4800 series, just boosted up. Nvidia has reworked everything from the ground up with Fermi, and whenever you do that, delays are a huge concern. However, I distinctly remember the last time Nvidia took a big step in changing the way their GPUs worked: they were called the 8800s, and they were twice as good as anything ATI could muster. Don't dog Nvidia for taking the time to get it right, because when it does launch it will eat anything ATI has alive. Anyone who knows how GPUs work and has seen the specs for Fermi will agree that the 5800s will eat dust. It takes time to start from the ground up and get it perfect. My props to Nvidia for not launching these products until they are trouble-free and ready.
 
What a pity!

ATi/AMD delivered the DirectX 11 experience to its fans months ago. nVidia couldn’t make it. It’s real, and it’s a tremendous defeat! That’s what happened, that’s the truth.

It's called "opportunity cost". There's no price on being the first to experience an ATi Radeon HD 5870 in all its glory, a single card crushing nVidia's dual-GPU GTX 295. And we're talking about a heavy title such as Crytek's Crysis @ 2560×1600.

According to Tom's Hardware, the nVidia GTX 295 simply didn't work at that resolution with 8xAA. Pity again! Please, see for yourself.

http://www.tomshardware.com/reviews/radeon-hd-5870,2422-13.html?xtmc=crysis_2560×1600_gtx_295_buffer_memory&xtcr=2

One of the paragraphs from this review says:

“Notice the missing result for Nvidia’s GeForce GTX 295 at 2560×1600 with 8xAA? That’s due to the card not having ample on-board memory to run that configuration (and the game not knowing any better than to keep it from trying). Grand Theft Auto gets around this by simply making resolutions unavailable if a graphics card doesn’t have a large enough frame buffer. Crysis crashes instead.”

The GTX 295 not being able to run Crysis @ 2560×1600 with 8xAA? Pity!
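
For a rough sense of why the frame buffer runs out, here's a back-of-envelope sketch (my own simplified arithmetic, not from the Tom's Hardware review): assuming 32-bit color plus 32-bit depth/stencil per MSAA sample and a resolved output buffer, 8xAA at 2560×1600 already eats a few hundred megabytes before any textures or geometry are counted.

# Back-of-envelope estimate of render-target memory at 2560x1600 with 8xAA.
# Simplified assumption: 4 bytes of color + 4 bytes of depth/stencil per MSAA
# sample, plus a resolved 32-bit color buffer. Real drivers allocate
# differently (compression, extra buffers), so treat this as illustration only.

def render_target_mb(width, height, aa_samples, bytes_per_sample=8):
    pixels = width * height
    msaa_bytes = pixels * aa_samples * bytes_per_sample  # multisampled color + depth
    resolve_bytes = pixels * 4                           # resolved color buffer
    return (msaa_bytes + resolve_bytes) / (1024 ** 2)

print(f"{render_target_mb(2560, 1600, 8):.0f} MB")  # ~266 MB just for render targets

With roughly 896 MB usable per GPU on a GTX 295, that leaves little headroom once Crysis's textures and geometry are loaded, which is presumably why it crashes where a 1 GB HD 5870 scrapes by.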

Fermi has got to be better and faster than Cypress. It's an obligation for nVidia to build it that way, since they have had, at the very least, more time to conceive it.

And, as always, don't be fooled: you're going to hurt your pocket to have Fermi installed in your rig. Be prepared to pay the price. It happened with Cypress; it's going to be the same with Fermi. And since nVidia cards are always much more expensive than ATi/AMD's, one Fermi card could reach as much as 750 bucks. Wait and see.

Take this weekend and go to your favorite retail store and grab your ATi Radeon HD 5870. It’s there, real. Just take it.

Fermi, humpf… maybe in 3Q2010 you'll get one. For now it's just an illusion… a dream (that hasn't come true… hehehehe…)

Cheers!
 
Where the hell do you get the info that ATI cannot run certain benchmarking programs and that they're expensive garbage? Just for interest's sake, I have used Nvidia and ATI products for a long time, since the first releases (most probably when you were still an ugly little baby). I have noticed nothing bad about either company; both have had good and bad points. ATI cards have much better bang for the buck; that has always been the case. Nvidia has always had the performance crown, but at a price (also because they tend to pay companies more to make TWIMTBP games). Secondly, this round I went with ATI (sold my 260 and bought a 5870). Why? Because ATI is a much more stable company at the moment, since there are so many rumors about Nvidia and financial trouble. The last time I heard rumors like this was with Voodoo, and I lost support and warranty on my cards. (I only upgrade about every 2 years and don't want my card to become obsolete.) Also, ATI has all the features I need to play the latest games at the resolution I need (1920x1080) without trouble, with DX11 etc. Nvidia's new range is going to be too expensive and is going to use too much power. Currently, I don't want to pay a huge electricity bill just to play games.
 
Guest said:
Take this weekend and go to your favorite retail store and grab your ATi Radeon HD 5870. It’s there, real. Just take it.

I hope you're not suggesting shoplifting....
 