Nvidia: AMD's DirectX 11 lead doesn't matter

December 17, 2009, 7:27 PM
Nvidia has reportedly shrugged off AMD's current lead in the DirectX 11 graphics market, insisting that it's only a temporary advantage that won't have a long-term effect. The momentary edge of beating Nvidia will be eclipsed by larger changes in the graphics processing market, which is moving toward parallel computing, said Nvidia's Michael Hara last week.

"We go through revolutionary changes every three of four years, and that's exactly where we're at today," Hara said. He continued, "The next big evolution in the API world has come with DirectX 11 (DX11), but we believe that's a part of the experience," adding that technologies like 3D stereo and physics are also important.

Ultimately, Hara said being out of sync with the API for a couple of months isn't as important as what Nvidia is striving to accomplish in the big picture, over the next four or five years.

Fermi is just around the corner, and AMD's Radeon HD 5970 will finally have the opportunity to pick on something its own size. In the meantime, what do you think about AMD's lead? Is it truly as "insignificant" as Nvidia claims?

User Comments: 32

mattfrompa said:

Obviously the lead only matters when Nvidia has it...

Guest said:

Makes sense. There isn't a huge number of DirectX 11 games out right now, and there probably won't be until Nvidia releases its compatible cards and starts working with developers to make them. More than anything, I'm really hoping they figure out some way with Fermi to reduce the performance hit of enabling 3D in games.

Guest said:

How many DX11 games are there? Two? The third will be out in February, and Nvidia's cards should be out by then. Considering the TSMC issues have meant almost no stock worldwide for ATI, it's not that big a deal.

MrAnderson said:

It is a tug of war. The lead only matters for bragging rights. AMD deserves it, and needs to stabilize this side of its portfolio.

Both companies help innovate in the field. AMD uses its resources to refine and streamline, while Nvidia pushes the market in a direction they think is forward by investing more in the applications of their newer, less refined yet powerful tech.

BlindObject said:

Nope. Nvidia is just brewing up its monster, and the longer it takes, the bigger it will be.

TomSEA, TechSpot Chancellor, said:

This is exactly what one would expect to hear from nVidia. Seriously, they're not going to come out with a, "we're really worried that we're getting our butts kicked right now and can't believe AMD beat us to the punch with DirectX 11."

Totally reminds me of a sports coach whose team is starting the season out poorly and is engaging in some damage control. "It's a long season, we're improving and these losses really don't matter right now."

EXCellR8, The Conservative, said:

Usually I would just dismiss everything nvidia claims, but on this count they are right. Just because DX11 is available with Windows 7 and there are a few games doesn't mean it's the new standard. It may get to that point eventually, but as far as I'm concerned AMD jumped the gun a bit. I'm all for technological advancements, but such a small fraction of PC users are on DX11 (and Windows 7) at the moment that it almost seems like a waste of money to me.

Nvidia is actually smart in waiting for developers to tailor games and software to DX11. Don't get me wrong, I think the HD5000 series is great, but it's just too early to be releasing new stuff that so few people are going to use. This may be why there has been a shortage, but it's clear that AMD wants to be ahead in graphics innovation.

It wouldn't surprise me if nvidia ends up doing better in the DX11-enabled GPU market than AMD, not only because more DX11 games will be released in the future, but because nvidia has more time to develop solid solutions and driver support while the tech behind the cards is still in its infancy.

Timonius said:

It's been said before and we'll say it again... a DX11 lead is quite pointless without anyone really using it yet. Come back to us in 6 months about DX11 leads and then we'll have something worthwhile to discuss.

Guest said:

DirectX 11 aside, the 5800 series cards are fast and are selling well.

This is just typical company damage control nonsense.

red1776, Omnipotent Ruler of the Universe, said:

I have to disagree with you, Tim and EX.

I have been in marketing for 25 years now, and there is no way Nvidia is willingly letting AMD have a six-month jump on the new generation of GPUs. Being first is an important marketing tool. Do you really think Nvidia is sitting around refining its drivers so they are perfect for the release of Fermi? That on its face makes no sense, as ATI has its next generation out and that doesn't keep them from working on their drivers; the 9.12 drivers came out today with even more significant performance improvements.

This generation of GPUs now coming out (or shortly, in Nvidia's case) has been in the works for two-plus years, and Fermi is what it is, minor tweaks aside.

Don't forget that hardware capability always precedes the software that makes use of it, and that's logical; witness PCIe 2.0, DDR3, and USB 3.0, among a long list of others. Otherwise it would be like someone inventing the album before the turntable and leaving folks to sit around reading the record jackets, saying "damn! I wish someone would invent something to play this on!" I mean really, would you expect a software company to write a DX11 game before there was such a thing as DX11?

Nvidia has a problem, whether it be that they have rested on their laurels and are just that far behind, or that they are having trouble getting their architecture to work correctly. They have provided clues by issuing ongoing statements that downplay the importance of DX11 and say they are focusing more on GPGPU technology.

If indeed they were exercising the risky strategy of letting AMD 'show what they got', they would at least release some actual 300 series cards with the new architecture that they could price wherever in the market they wanted, rather than the feeble, re-branded, low-end 300 series volley they released a couple of weeks ago.

I contend again that there is no way this is an Nvidia choice or marketing strategy. They have watched the huge success of the 5xxx series and how AMD can't keep it in stock (albeit with help from TSMC), and this in an economy that hasn't been seen since the Jimmy Carter misery index.

Guest said:

It's good if Nvidia has the best card out, because then ATI will drop prices on its cards, and when that happens it's game over for Nvidia, just like with the Radeon 4000 series.

Relic, TechSpot Chancellor, said:

I agree with Red and Tom. What in the world is Nvidia supposed to say? "It was our intention to let AMD get the lead, give them a false sense of security then...BAM!"

Nvidia screwed up and AMD capitalized on it. Good for them and good for us; finally some real competition again between the two, which always leads to price wars and happy consumers.

BMfan said:

This is exactly what one would expect to hear from nVidia. Seriously, they're not going to come out with a, "we're really worried that we're getting our butts kicked right now and can't believe AMD beat us to the punch with DirectX 11."

Totally reminds me of a sports coach whose team is starting the season out poorly and is engaging in some damage control. "It's a long season, we're improving and these losses really don't matter right now."

Well said. I read the news and just kept thinking damage control.

Puiu said:

By the time Nvidia releases anything, AMD will have mature technology and enough market penetration to make Fermi look like it's just there to show how great AMD really is. Unless Fermi is actually really good with games and Nvidia is just hiding that right now.

Guest said:

Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, iirc.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP.

So ATI likely has a generational advantage over Nvidia for the foreseeable future, if not two. In addition, we still don't know the full specs of Fermi or how Nvidia is coping with the multiple issues of yields, power consumption and heat. Plus, I believe it's using software for tessellation; how much of an impact will that have on game performance? Fermi is a GPGPU, not a gaming chip; it's being shoe-horned into the gaming role, and that's going to have some kind of impact, both in optimising drivers for completely new hardware and in dealing with the first-generation issues that crop up.

Nvidia in many aspects seems to be betting the farm on GPGPU, talking about CUDA and its "advantages". With Nvidia's past, do you think any scientific body is willing to trust a multi-million contract to Nvidia? Plus, with AMD's and Intel's GPGPU-on-a-processor tech, it could be a technological niche or dead end. They see the writing on the wall and are desperate to survive.

Yes, this is a tick towards ATI, but for the first time in years there is a clear distinction between the two sets of hardware and their focus. It's taken ATI/AMD a long time to get to this position and they should be congratulated.

Guest said:

Not much to say:

AMD: selling products ($) and making people happy (DiRT 2 with Dx11).

nVidia: thinking about the future. Pity!

Cheers!

dividebyzero, trainee n00b, said:

Guest said:

Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, iirc.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP.

You see the tail wagging the dog often?

GPU manufacturers tailor graphics drivers for game code. Game developers don't wait for graphics multinationals to issue hardware and then code for it. While some games are optimised for certain graphics environments (AMD/ATI with the S.T.A.L.K.E.R. series, nVIDIA for Crysis, CoD etc.), that is more down to the funding allocated by a graphics company and the R&D invested. ATI might be investing in DirectX 11, but I'd personally prefer to see them optimise their drivers for the games already in release. CoD World at War, for instance: the game is a year old, my GTX280 (either single or SLI) flies through it, while my HD4890 turns the game into a series of picture postcards.

On a related note, it's hardly surprising that ATI was first out of the gate this time around; the HD5xxx series is, after all, pretty much just a refresh of the HD4xxx series on a smaller node, while the nVIDIA offering will be based on a new architecture. Last time we were in this position I believe nVIDIA came out with the G80, so I wouldn't be too quick to write them off.

Nice to see that HD5870/5970 stocks are now getting relatively plentiful... which puts a bit of a dampener on the TSMC/nVIDIA conspiracy theory... although I'm pretty sure I saw Jen-Hsun Huang lurking in the background in the Zapruder film...

ken777 said:

This is the kind of thing you say when your competitor has shipped 800k units before you've even been able to soft launch your own product. You can be sure Nvidia would be cranking out press releases trumpeting their success if the situation was flipped. That being said, unless Fermi is a total disaster, I think it's likely Nvidia will be the performance leader again in a few months.

compdata, TechSpot Paladin, said:

Yeah, just a bunch of damage control. Of course they care; if they weren't worried, why won't they let anyone review or benchmark their next-gen cards yet? Doesn't make sense.

Puiu said:

dividebyzero said:

Guest said:

Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, iirc.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP.

You see the tail wagging the dog often?

GPU manufacturers tailor graphics drivers for game code. Game developers don't wait for graphics multinationals to issue hardware and then code for it. While some games are optimised for certain graphics environments (AMD/ATI with the S.T.A.L.K.E.R. series, nVIDIA for Crysis, CoD etc.), that is more down to the funding allocated by a graphics company and the R&D invested. ATI might be investing in DirectX 11, but I'd personally prefer to see them optimise their drivers for the games already in release. CoD World at War, for instance: the game is a year old, my GTX280 (either single or SLI) flies through it, while my HD4890 turns the game into a series of picture postcards.

On a related note, it's hardly surprising that ATI was first out of the gate this time around; the HD5xxx series is, after all, pretty much just a refresh of the HD4xxx series on a smaller node, while the nVIDIA offering will be based on a new architecture. Last time we were in this position I believe nVIDIA came out with the G80, so I wouldn't be too quick to write them off.

Nice to see that HD5870/5970 stocks are now getting relatively plentiful... which puts a bit of a dampener on the TSMC/nVIDIA conspiracy theory... although I'm pretty sure I saw Jen-Hsun Huang lurking in the background in the Zapruder film...

It's Nvidia's fault for not doing anything special since the 8xxx series. They re-branded the cards two or three times.

Timonius said:

Some interesting points are being made. The reason why I'm still not concerned about DX11 is that it still has to mature, and NVidia still has time to put something competitive on the table that supports it. Yes, they are 'losing' some of the DX11 market share right now. But consider this: as a gamer I am still on Windows XP (DX9!). What I need is some solid evidence that I should switch up to Windows 7 and play ALL of my games properly (old and new). Only when I am desperate to play the latest Win7/DX11-only games will I need to upgrade. I won't see the need for DX11 for quite a while. So, no, DX11 is still not an issue (for me at least). In the meantime AMD can do the R&D (and you think Nvidia isn't?).

ET3D, TechSpot Paladin, said:

Whether or not I agree with NVIDIA sometimes, they're talking way too much. I'd rather they talked less and showed more new products. Or just talked less. It feels like they're trying to talk their way out of a corner.

Guest said:

Does anyone here know the difference between REALITY and DAY DREAMING?! Huh, huh?

Kibaruk, TechSpot Paladin, said:

The same thing I said in a post a couple weeks ago.

Fermi here, Fermi there, Fermi nowhere near to be seen yet, and Nvidia still puts on its "we are cool" face about it.

AMD is kicking butts out there, and as some said, gaming industry giants already have ATI's DX11 hardware to play with and to start developing DX11 games on, which brings TONS to the table. And still, DX11 or not, the ATI 5xxx series is the fastest out there.

To the one saying "I have a 4xxx and it looks like a postcard slideshow, against the 285 that flies through games": according to the benchmarks at 8x MSAA they run with like a 5 fps difference, not the 40 it would take to look like a postcard game.

Guest said:

nVidia has more to worry about than making a more powerful GPU; they need an answer to Eyefinity, which for a gamer is all the reason one needs to buy 3+ monitors. Very cool, and I would call it a killer app for the gamer.

Now if only monitor vendors would start making slim-bezel (<3mm) 1080p 30" screens!

carlosp72 said:

Nvidia just doesn't know what to say, if there's anything they can say...

The truth is that AMD made a huge advance; even without DX11 they are beating Nvidia's top graphics cards in performance and price, and don't forget about HD video and Eyefinity...

And as for what Nvidia says about AMD's DX11, just read what Nvidia told us about DX10 back in 2007, when DX10 was only meant to run on Vista and nobody even wanted to hear about Windows Vista... and of course not a single game was ready for Vista and DX10 when Nvidia released its DX10 graphics cards...

http://www.nvidia.com/object/IO_41448.html

Puiu said:

Guest said:

nVidia has more to worry about than making a more powerful GPU; they need an answer to Eyefinity, which for a gamer is all the reason one needs to buy 3+ monitors. Very cool, and I would call it a killer app for the gamer.

Now if only monitor vendors would start making slim-bezel (<3mm) 1080p 30" screens!

We'll see them very soon. I think Samsung said they'll release something like that, but I'm not sure.

dividebyzero, trainee n00b, said:

Kibaruk said:

To the one saying "I have a 4xxx and it looks like a postcard slideshow, against the 285 that flies through games": according to the benchmarks at 8x MSAA they run with like a 5 fps difference, not the 40 it would take to look like a postcard game.

First off, I have two GTX280s (not a GTX285).

Secondly...

[link]

As all the hoo-hah seems to be regarding the top-end cards, pay attention to the benchmarks at 1920 and 2560.

If I run card "A" in game for 5 minutes, and during that time it runs at a MINIMUM framerate of 15 fps for 30 seconds and at 200 fps the rest of the time, then the AVERAGE framerate is 181.5 fps.

If card "B" runs at 60 fps for the entire 5 minutes, then its average framerate is 60 fps.

By your logic card "A" gives you the better gameplay experience because the average would be three times higher.
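
If you want to check the arithmetic, here is a quick sketch in Python (the average_fps helper is just an illustration of the time-weighted math, not output from any benchmarking tool):

    # Time-weighted average vs. minimum framerate, using the
    # card "A" / card "B" numbers from the example above.
    def average_fps(segments):
        # segments: list of (duration_seconds, fps) pairs
        total_time = sum(t for t, _ in segments)
        total_frames = sum(t * fps for t, fps in segments)
        return total_frames / total_time

    card_a = [(30, 15), (270, 200)]  # 30 s at 15 fps, the other 270 s at 200 fps
    card_b = [(300, 60)]             # a steady 60 fps for the full 5 minutes

    print(average_fps(card_a))            # 181.5 -- looks great on paper
    print(min(fps for _, fps in card_a))  # 15    -- where the game actually stutters
    print(average_fps(card_b))            # 60.0  -- lower average, smoother play

The average rewards the long stretch at 200 fps and completely hides the 30 seconds spent at 15 fps, which is exactly why the minimum matters more.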

I run two systems that are broadly comparable in performance, one nVidia and one ATI, and while some games run better on one card or the other, I have found that the GTXs generally work with a new game out of the box; not so much with ATI's offerings (Saboteur and Resident Evil 5, for example). The minimum framerate is the telling factor for good gameplay.

Secondly...

Neither of my systems is primarily used for gaming. The bulk of their usage comes from video transcoding and graphics design. The new nVIDIA architecture (if and when it arrives) holds a great deal of promise, as opposed to the HD3xxx > HD4xxx > HD5xxx incremental tweaking of an architecture that is heading towards a dead end.

ATI selling cards? GREAT. Channel some of the funds into R&D! Because selling the next best thing based on the fact that it can push out 120 fps instead of a measly 100 fps seems shortsighted at best.

As for when the green team's offering is supposed to rear its head, I thought it had been established that the timeframe was Q1 2010.

As for nVidia losing out to AMD: if the HD5870/5970 and GF100 were released at the same time, they would be sharing revenue. As it stands, AMD gets nearly 100% of the enthusiast market for a few months, interest wanes after the initial gpu-fever, and nVidia launches GF100 with only AMD refreshes and AIB "special edition"/OC cards as competition, making its money in Q1 2010 instead of Q4 2009 (assuming GF100 outperforms the HD5870!). This scenario is just a continuation of the HD3xxx > G80/92 > HD4xxx > GTX200 theme. When was the last time AMD/ATI and nVIDIA went head-to-head by releasing a whole new product line each at the same time?

Guest said:

I recently pulled my nvidia card and put in a new AMD card, and I have to say the AMD card definitely has much more kick than the comparable nvidia. I was quite surprised by the results.

Guest said:

All I have to say is:

GO AMD!

Guest said:

Nvidia's comments are totally damage control. At this point they have dropped the ball, and they will pay for it for the time being. Will this always be the case? Well, we shall wait and see upon the arrival of Fermi. Fermi may help or hurt them depending on the prices of the cards in their next line-up. What they (Nvidia) are doing could turn into something big if it succeeds, but if Fermi fails it could hurt them big time. Nvidia has been the leader in video cards for a long time, but AMD seems to have knocked them off their perch (or at least rattled their cage) with the 4XXX series of ATi cards.

I have to give ATi/AMD a hand for the release of their DirectX 11 compatible cards (although it's a little early in the game for them), simply because more DirectX 11 games can be tested with them and thus get out on the market. Like I said earlier, it's way too early in the game to predict a winner between Nvidia's Fermi and ATi's DirectX 11 cards. I guess after the first half of 2010 we'll know for sure whose plan worked better.

***I have a 1 GB Radeon HD 5770 on the way and I'm curious to see what I can do with it.***

Guest said:

Put it this way: Nvidia wanted Fermi to be just another re-marked G80/G90 chip, but they slammed into DX11.

