Nvidia: AMD's DirectX 11 lead doesn't matter


Matthew DeCarlo

Staff

Nvidia has reportedly shrugged off AMD's current lead in the DirectX 11 graphics market, insisting that it's only a temporary advantage that won't have a long-term effect. The momentary edge of beating Nvidia will be eclipsed by larger changes in the graphics processing market, which is moving toward parallel computing, said Nvidia's Michael Hara last week.

"We go through revolutionary changes every three of four years, and that's exactly where we're at today," Hara said. He continued, "The next big evolution in the API world has come with DirectX 11 (DX11), but we believe that's a part of the experience," adding that technologies like 3D stereo and physics are also important.

Ultimately, Hara said being out of sync with the API for a couple of months isn't as important as what Nvidia is striving to accomplish in the big picture, over the next four or five years.

Fermi is just around the corner, and AMD's Radeon HD 5970 will finally have the opportunity to pick on something its own size. In the meantime, what do you think about AMD's lead? Is it truly as "insignificant" as Nvidia claims?


 
Makes sense. There aren't a huge number of DirectX 11 games coming out right now, and there probably won't be until Nvidia releases their compatible card and starts working with developers to make them. More than anything, I'm really hoping they figure out some way with Fermi to reduce the performance hit of enabling 3D in games.
 
How many DX11 games are there? Two? The third will be out in February, and Nvidia's cards should be out by then. Considering the TSMC issues have meant almost no stock worldwide for ATI, it's not that big a deal.
 
It is a tug of war. The lead only matters for bragging rights. AMD is deserving of it, and needs to stabilize this side of their portfolio.

Both companies help innovate in the field. AMD uses its resources to refine and streamline, while Nvidia pushes the market in a direction they think is forward by investing more into the applications of their new, less refined yet powerful tech.
 
This is exactly what one would expect to hear from nVidia. Seriously, they're not going to come out with a, "we're really worried that we're getting our butts kicked right now and can't believe AMD beat us to the punch with DirectX 11."

Totally reminds me of a sports coach whose team is starting the season out poorly and is engaging in some damage control. "It's a long season, we're improving and these losses really don't matter right now."
 
Usually I would just dismiss everything Nvidia claims, but on one count they are right. Just because DX11 is available with Windows 7 and there are a few games doesn't mean it's the new standard. It may get to that point eventually, but as far as I'm concerned AMD jumped the gun a bit. I'm all for technological advancement, but such a small fraction of PC users are on DX11 (and Windows 7) at the moment that it almost seems like a waste of money to me.

Nvidia is actually smart in waiting for developers to deliver DX11-enabled games and software. Don't get me wrong, I think the HD5000 series is great, but it's just too early to be releasing new stuff that so few people are going to use. This may be why there has been a shortage, but it's clear that AMD wants to be ahead in graphics innovation.

It wouldn't surprise me if Nvidia ends up doing better in the DX11-enabled GPU market than AMD, not only because more DX11 games will be released in the future, but because Nvidia has more time to develop solid solutions and driver support while the tech behind the cards is still in its infancy.
 
It's been said before and we'll say it again: a DX11 lead is quite pointless when hardly anyone is using it yet. Come back to us in six months about DX11 leads and then we'll have something worthwhile to discuss.
 
DirectX 11 aside, the 5800 series cards are fast and are selling well.

This is just typical company damage control nonsense.
 
I have to disagree with you, Tim, and EX.
I have been in marketing for 25 years now, and there is no way that Nvidia is willingly letting AMD have a six-month jump on the new-gen GPUs. Being first is an important marketing tool. Do you really think Nvidia is sitting around refining their drivers so they are perfect for the release of Fermi? That on its face makes no sense, as ATI has their next generation out and it doesn't keep them from working on their drivers; the 9.12 drivers came out today with even more significant performance improvements.

This generation of GPUs now coming out (or shortly, in Nvidia's case) has been in the works for two-plus years, and Fermi is what it is, other than minor tweaks.

Don't forget that hardware capability always precedes the software that makes use of it, and that's logical; witness PCIe 2.0, DDR3, and USB 3.0, among a long list of others. Otherwise it would be like someone inventing the album before the turntable, leaving folks to sit around reading the record jackets and saying, "Damn! I wish someone would invent something to play this on!" I mean really, would you expect a software company to write a DX11 game before there was such a thing as DX11?

Nvidia has a problem, whether it be that they have simply rested on their laurels and are just that far behind, or that they are having trouble getting their architecture to work correctly. They have provided clues by issuing ongoing statements that downplay the importance of DX11 and emphasize that they are focusing more on GPGPU technology.

If indeed they were exercising the risky strategy of letting AMD 'show what they got', they would at least release some actual 300 series cards with the new architecture, which they could price wherever in the market they wanted, rather than the feeble, re-branded, low-end 300 series volley they released a couple of weeks ago.

I contend again that there is no way this is an Nvidia choice or marketing strategy. They have watched the huge success of the 5xxx series and how they can't keep it in stock (albeit with help from TSMC), and this in an economy that hasn't been seen since the Jimmy Carter misery index.
 
It's good if Nvidia has the best card out, because then ATI will drop their prices, and when they do that it's game over for Nvidia, just like with the 4000 series of Radeon cards.
 
I agree with Red and Tom. What in the world is Nvidia supposed to say? "It was our intention to let AMD get the lead, give them a false sense of security then...BAM!"

Nvidia screwed up and AMD capitalized on it. Good for them and good for us, finally some real competition again between the two. Which always leads to price wars and happy consumers :) .
 
This is exactly what one would expect to hear from nVidia. Seriously, they're not going to come out with a, "we're really worried that we're getting our butts kicked right now and can't believe AMD beat us to the punch with DirectX 11."

Totally reminds me of a sports coach whose team is starting the season out poorly and is engaging in some damage control. "It's a long season, we're improving and these losses really don't matter right now."

Well said. I read the news and I just kept thinking: damage control.
 
By the time Nvidia releases anything, AMD will have a mature technology and enough market penetration to make Fermi look like it is just there to show how great AMD really is. Unless Fermi is actually really good with games and Nvidia is just hiding that right now.
 
Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, IIRC.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP program.

So ATI likely has a generational advantage over Nvidia for the foreseeable future, if not two generations. In addition, we still don't know the full specs of Fermi or how Nvidia is coping with the multiple issues of yields, power consumption and heat. Plus, I believe it's using software for tessellation; how much of an impact will that have on game performance? Fermi is a GPGPU, not a gaming chip; it's being shoe-horned into the gaming role, and that's going to have some kind of impact, both in optimising drivers for completely new hardware and in dealing with the first-generation issues that crop up.

Nvidia in many aspects seems to be betting the farm on GPGPU, talking about CUDA and its 'advantages'. With Nvidia's past, do you think any scientific body is willing to trust a multi-million-dollar contract to Nvidia? Plus, with AMD and Intel putting GPGPU tech on the processor, it could be a technological niche or dead end. They see the writing on the wall and are desperate to survive.

Yes, this is a tick towards ATI, but for the first time in years there is a clear distinction between the two sets of hardware and their focus. It's taken ATI/AMD a long time to get to this position, and they should be congratulated.
 
Not much to say:

AMD: selling products ($) and making people happy (DiRT 2 with DX11).

nVidia: thinking about the future. Pity!

Cheers!
 
Guest said:
Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, IIRC.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP program.

You see the tail wagging the dog often?
GPU manufacturers tailor graphics drivers to game code; game developers don't wait for graphics multinationals to issue hardware and then code for it. While some games are optimised for certain graphics environments (AMD/ATI with the S.T.A.L.K.E.R. series, nVIDIA for Crysis, CoD, etc.), that is more down to the funding allocated by a graphics company and the R&D invested. ATI might be investing in DirectX 11, but I'd personally prefer to see them optimise their drivers for the games already in release. CoD: World at War, for instance: the game is a year old; my GTX280 (either single or SLI) flies through it, while my HD4890 turns the game into a series of picture postcards.

On a related note, it's hardly surprising that ATI was first out of the gate this time around; the HD5xxx series is, after all, pretty much just a refresh of the HD4xxx series on a smaller node, while the nVIDIA offering will be based on a new architecture. Last time we were in this position, I believe nVIDIA came out with the G80, so I wouldn't be too quick to write them off.

Nice to see that HD5870/5970 stocks are now getting relatively plentiful... puts a bit of a dampener on the TSMC/nVIDIA conspiracy theory... although I'm pretty sure I saw Jen-Hsun Huang lurking in the background in the Zapruder film...
 
This is the kind of thing you say when your competitor has shipped 800k units before you've even been able to soft launch your own product. You can be sure Nvidia would be cranking out press releases trumpeting their success if the situation was flipped. That being said, unless Fermi is a total disaster, I think it's likely Nvidia will be the performance leader again in a few months.
 
Yeah, just a bunch of damage control. Of course they care; if they weren't worried, why haven't they let anyone review or benchmark their next-gen cards yet? Doesn't make sense.
 
dividebyzero said:
Guest said:
Another unspoken advantage: games coming out over the next year or two will be optimised for ATI GPUs. ATI has the hardware in developers' hands; where's Nvidia's? In fact, the Evergreen series has been in developers' hands since around April, IIRC.

That's a massive advantage in mind share, and it also negates much of Nvidia's TWIMTBP program.

You see the tail wagging the dog often?
GPU manufacturers tailor graphics drivers to game code; game developers don't wait for graphics multinationals to issue hardware and then code for it. While some games are optimised for certain graphics environments (AMD/ATI with the S.T.A.L.K.E.R. series, nVIDIA for Crysis, CoD, etc.), that is more down to the funding allocated by a graphics company and the R&D invested. ATI might be investing in DirectX 11, but I'd personally prefer to see them optimise their drivers for the games already in release. CoD: World at War, for instance: the game is a year old; my GTX280 (either single or SLI) flies through it, while my HD4890 turns the game into a series of picture postcards.

On a related note, it's hardly surprising that ATI was first out of the gate this time around; the HD5xxx series is, after all, pretty much just a refresh of the HD4xxx series on a smaller node, while the nVIDIA offering will be based on a new architecture. Last time we were in this position, I believe nVIDIA came out with the G80, so I wouldn't be too quick to write them off.

Nice to see that HD5870/5970 stocks are now getting relatively plentiful... puts a bit of a dampener on the TSMC/nVIDIA conspiracy theory... although I'm pretty sure I saw Jen-Hsun Huang lurking in the background in the Zapruder film...
It's Nvidia's fault for not doing anything special since the 8xxx series. They re-branded the cards two or three times.
 
Some interesting points are being made. The reason I'm still not concerned about DX11 is that it still has to mature, and Nvidia still has time to put something competitive on the table that supports it. Yes, they are 'losing' some of the DX11 market share right now. But consider this: as a gamer, I am still on Windows XP (DX9!). What I need is some solid evidence that I should switch to Windows 7 and be able to play ALL of my games properly (old and new). Only when I am desperate to play the latest Win7/DX11-only games will I need to upgrade. I won't see the need for DX11 for quite a while. So no, DX11 is still not an issue (for me at least). In the meantime AMD can do the R&D (and you think Nvidia isn't?).
 
I may agree with Nvidia sometimes, but they're talking way too much. I'd rather they talked less and showed more new products. Or just talked less. It feels like they're trying to talk their way out of a corner.
 
Does anyone here know the difference between REALITY and DAYDREAMING?! Huh, huh?
 
The same thing I said in a post a couple of weeks ago.

Fermi here, Fermi there, Fermi nowhere to be seen yet, and Nvidia still putting on the "we are cool" act about it.

AMD is kicking butts out there, and as some have said, gaming industry giants already have ATI's DX11 hardware to play with and can start developing games for DX11, which brings TONS to the table. And still, DX11 or not, the ATI 5xxx series is the fastest out there.

To the one saying his 4xxx makes games look like a postcard slideshow while the 285 flies through them: according to the benchmarks at 8x MSAA, they run with about a 5 fps difference, not the 40 it would take to make a game look like postcards.
 