AMD: DirectX "getting in the way" of PC gaming graphics

March 20, 2011, 1:50 PM
AMD believes there is no great disparity between PC gaming graphics and console gaming graphics, despite the huge advantage the PC has over consoles in terms of hardware. And despite the hardware giant's close relationship with Microsoft, Richard Huddy, worldwide developer relations manager of AMD's GPU division, blames the software giant.

"It's funny," Huddy told bit-tech. "We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good - DirectX is getting in the way." Huddy says that one of the most common requests he gets from game developers is to "make the API go away."

This quote comes hot on the heels of a related statement from id Software co-founder John Carmack. While Huddy offers one perspective from the hardware side of things, and says that game developers agree with him, Carmack recently gave a different opinion from the game development angle: despite id being an OpenGL house, he stated that DirectX is the better API.

Huddy said nothing about OpenGL, but he has clearly not been comparing notes with Carmack. "I certainly hear this in my conversations with games developers, and I guess it was actually the primary appeal of Larrabee to developers - not the hardware, which was hot and slow and unimpressive, but the software: being able to have total control over the machine, which is what the very best games developers want," Huddy said. "By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft, no doubt at all. Wrapping it up in a software layer gives you safety and security, but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate."





User Comments: 44

yRaz said:

don't look 10 times as good?

http://www.youtube.com/watch?v=1Kvl31g77Z8

Cryengine 3

http://www.youtube.com/watch?v=RSXyztq_0uM

Unreal engine 3

SDFU AMD

negroplasty said:

Can't cut it in the graphics market, get out and stfu.

Guest said:

Ignore the 2 comments above; the guy is spot on. It's the low-level stuff that would really make a difference.

AMD and Nvidia are on equal ground at the moment, so step out and shoot yourselves, fanbois of both sides - or revel in the awesome graphics of both companies, knowing that both their hardware would excel with more access than the current APIs allow.

Go Nvidia and go ATI

gwailo247, TechSpot Chancellor, said:

Nothing is stopping AMD from coming out with their own platform or OS for games.

This seems to be the era of OS divergence, might as well take a gamble.

Guest said:

AMD is right, we are waiting for solutions!

dividebyzero, trainee n00b, said:

I'm surprised Richard Huddy gets paid an actual salary at this point in time.

API's like DirectX are hardware abstraction layers that enable code to run on a variety of hardware. Now Mr Huddy seems to think that removing this abstraction layer is a good idea.

So if you're programming "direct-to-metal (hardware)", that means you are also programming for a specific architecture - AMD or Nvidia (or VIA for that matter). The bigger problem is that every new GPU architecture would automatically have to be backwards compatible with the hardware-specific game code implemented for previous generations of cards... unless of course everyone is happy buying a new card and having little or no access to the back catalogue of games.
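The trade-off being described can be sketched in a few lines of Python. This is purely illustrative: the class and function names (and the "driver" behaviour) are invented, but the shape is accurate - an API routes one common call to whichever backend is present, while direct-to-metal code bakes a specific architecture into the game itself.

```python
# Minimal sketch of why a hardware abstraction layer exists.
# Architecture names and "driver" behaviour here are invented.

class CaymanDriver:
    """Stand-in for an AMD VLIW4-era driver sitting behind the API."""
    def submit_triangles(self, n):
        return f"Cayman VLIW4 path: rasterized {n} triangles"

class FermiDriver:
    """Stand-in for an Nvidia Fermi-era driver sitting behind the API."""
    def submit_triangles(self, n):
        return f"Fermi scalar path: rasterized {n} triangles"

def draw(device, n):
    # Game code written against the abstraction: one call, any hardware.
    return device.submit_triangles(n)

# The same "game" code runs unmodified on either vendor's hardware.
for device in (CaymanDriver(), FermiDriver()):
    print(draw(device, 1000))

# Direct-to-metal equivalent: the game itself carries one code path
# per architecture, and every new GPU needs a new path (or no game).
def draw_to_metal(arch, n):
    if arch == "cayman":
        return f"Cayman VLIW4 path: rasterized {n} triangles"
    if arch == "fermi":
        return f"Fermi scalar path: rasterized {n} triangles"
    raise NotImplementedError(f"no hand-written code path for {arch!r}")
```

Real drivers obviously do this at a much lower level; the point is only that the single `draw` interface is what lets one game binary run across architectures, and that the bottom function is what "make the API go away" implies for developers.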

@gwailo247

Oh, it gets much better than that.

How about Gaming Evolved or TWIMTBP exclusive titles?

How about HD7000/HD8000/GTX600/GTX700 exclusive titles?

As the bit-tech article mentions, the "direct-to-metal" approach is already implemented in consoles: proprietary hardware running proprietary code. Now extrapolate that to include the release of PS4, 5, 6, 7 and Xbox whatever, whatever2 etc. every year. Even then it doesn't come close to the clusterf**k that PC gaming would turn into, given that game devs aren't exactly falling over themselves even when using industry-standard and well-understood API's.

Somehow, given the secrecy involved in developing new graphics architectures, I can't see AMD or Nvidia handing EA, THQ, Crytek et al. the keys to the kingdom so that the devs can code a game and have it ready for release for that particular hardware. Of course, if they wait until the hardware is released and then start coding, it would seem that the impending games have a very limited lifespan, since the next architecture is likely to be laid down at the same time.

Of course, I suppose if the graphics cards of the future become fully GPGPU capable (top to bottom) then that allows for some wiggle room with coding... but then this is a talking head of AMD talking...

And of course all we need is Nvidia, AMD, Intel and VIA to agree on a standard for homogeneous software coding.......bwahahahahahah

yRaz said:

The reason graphics aren't as good is that they optimize everything for a console and all we get is a crappy console port. All they can really do is increase texture size and AA. Graphics already look 10 times as good. With CryEngine 3 and the new Unreal Engine, graphics will be a few orders of magnitude better than consoles. Photorealistic graphics are right around the corner. I give it 5 years and we will be playing games that look like Avatar.

dividebyzero, trainee n00b, said:

Diminishing returns.

If one HD 6970 (for example) can run 7680x3200 (24.6 megapixels) in Eyefinity at, say, 4xAA, then wouldn't (using Mr Huddy's analogy) Crossfired HD 6990s be able to run 15360x6400 (98.3 megapixels) at 16xAA?
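For what it's worth, the megapixel figures in that comparison do check out; a quick sanity check (resolutions as quoted above):

```python
# Verify the megapixel figures quoted for the two Eyefinity setups.
single = 7680 * 3200      # one HD 6970: 24,576,000 pixels
quad   = 15360 * 6400     # Crossfired HD 6990s: 98,304,000 pixels

print(round(single / 1e6, 1))   # 24.6 megapixels
print(round(quad / 1e6, 1))     # 98.3 megapixels
print(quad // single)           # 4x the pixels
```

Whether four times the raw pixel throughput (plus the jump from 4xAA to 16xAA) actually scales that linearly in practice is, of course, exactly the diminishing-returns question being raised.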

Kibaruk, TechSpot Paladin, said:

If you actually read it, Carmack said "one is better than the other", not "they are the best out there and problem-free", which doesn't differ from Huddy's opinion.

And of course, yRaz, you may give one or two titles that differ... how about you give 10 examples of how much better it looks on a PC over consoles?

TeamworkGuy2 said:

dividebyzero said:

... So if you're programming "direct-to-metal (hardware)" that means you are also programming for a specific architecture - AMD or Nvidia (or VIA for that matter). The bigger problem seems to be that every new GPU architecture will automatically have to be backwards compatible

...

Of course , I suppose if the graphics cards of the future become fully GPGPU capable (top to bottom) then that allows for some wiggle room with coding... And of course all we need is Nvidia, AMD, Intel and VIA to agree on a standard for homogeneous software coding.......bwahahahahahah

Exactly: a programmer would have to program for a specific architecture.

Maybe in tomorrow's utopia of social peace and goodwill, with nanotechnology and the ability to create everything we want with a 3D printer, we could have the major GPU makers get together and decide on a universal architecture or coding standard, but until then, good luck abandoning an Application Programming Interface like DirectX.

Win7Dev said:

Well, when an entire game can be made in CUDA or DirectCompute there won't be as much of an issue. With GPU computing, things are rapidly speeding up. The only problem is that these technologies are hardware-specific, meaning games would have to be made for Nvidia or ATI/AMD systems. There would be no middle ground, and there would be little room for new technologies, because each new one would kill support for all past games.

Tanstar said:

The first thing I thought was: how are game makers going to tailor-make each of their games for every possible hardware combination? Secondly: who would want to give game makers - who regularly sneak in security software that screws up computers - that much access to their machine?

Guest said:

Back in the day we did all our coding in assembly language and ran our demos/loaders from DOS.

Coders then had direct access to the hardware and were able to unlock many hidden features.

I remember setting 320x240 and tweaking that to 640x400/480. Smooth scrolling of large fonts.

Even now you can't do that!

Check out the Futurecrew demos. I think they were early '90s...

Anyway, as soon as you moved to some other language and started using their libraries, you pretty much had your hands tied...

MrAnderson said:

OK, this is all smoke and mirrors... you really cannot blame Microsoft. DirectX is an option.

The graphics card manufacturers can, in the end, give lower-level access. The fact that they do not just shows that they should be supporting OpenGL better. Also... lower-level code will mean a lot of branching, because you can never count on the same config, which is why we need software layers. They are very performant and pretty powerful. MS could provide a lighter OS mode for gaming. That would be nice.

Guest said:

Writing games to the hardware layer may help gaming be less OS-specific and more hardware-specific, making more games easily compatible with alternative operating systems...

treetops said:

They have a good point; too bad there is no Firefox equivalent of DirectX.

Omnislip said:

yRaz said:

don't look 10 times as good?

http://www.youtube.com/watch?v=1Kvl31g77Z8

Cryengine 3

http://www.youtube.com/watch?v=RSXyztq_0uM

Unreal engine 3

SDFU AMD

Let's see a game like that coming out oooooh no there isn't one.

STFU noob.

Guest said:

I think the problem is that we have reached the limits of the human brain's hardware.

Guest said:

pfft comparing graphics is for nerds

Guest said:

If you sell hardware, you'd prefer to have as many programmers as possible writing for your hardware.

But not all programmers have the same skills, so we need a very skilled one like Microsoft to write an API like DirectX, so the rest can exist as programmers and have an opportunity for success.

That's the reason the evolution of the industry has chosen this path.

If you remove the engines from a plane, for a little while you will fly better, but a crash is waiting for you at the end.

Guest said:

We can't reinvent the wheel every time. If APIs didn't exist, the industry would have ended up with a few big, fat companies with good DLLs sitting on their money, and then people would start to go away because of the lack of innovation.

Then these companies would start selling their DLLs to rescue the market, and after a few years the champion DLL would be the new DirectX :)

Guest said:

A Bugatti Veyron has 10x the horsepower of a Ford, but it doesn't have 10x the speed. Not even 10x the speed of a single horse :)

Guest said:

I want AMD to make a cheap GPU at 1nm, right now, that can run the ray tracing algorithm fast. I don't care about the laws of physics, I just want to play my games with ray tracing on.

yRaz said:

Omnislip said:

yRaz said:

don't look 10 times as good?

http://www.youtube.com/watch?v=1Kvl31g77Z8

Cryengine 3

http://www.youtube.com/watch?v=RSXyztq_0uM

Unreal engine 3

SDFU AMD

Let's see a game like that coming out oooooh no there isn't one.

STFU noob.

Crysis 2 comes out....tomorrow? I'm sure that BF3 and ESV will have crap graphics too.

Vrmithrax, TechSpot Paladin, said:

I think @dividebyzero definitely summed up the issue best. It's really a bit of a "duh" statement that DirectX gets in the way in gaming. It's the middle man, giving a constant and simplified interface to a huge variety of (often very different) graphics hardware.

Would things run much faster without DirectX (or OpenGL) in the middle? Certainly, you can't deny that. But, that would push us back to the old early GPU gaming days, where a game only worked (or only worked well) on specific graphics hardware. Even the basic divergence between generations within AMD or nVidia would cause havoc with compatibility and support. It would basically come down to this: either game programmers coded to a specific GPU platform, and the users of other GPUs lost out (cutting off a potentially massive consumer base)... or game programmers have to code multiple variants to cover every GPU possibility, greatly increasing complexity, time and cost to produce the game.

If you really wanted to make the "PC gaming is dying" comments a reality, throw APIs like OpenGL and DirectX out, then see how fast game developers abandon the PC ship for the smoother waters of consoles.

Guest said:

At one point things were written directly to HW; however, that means you have to write to each and every configuration you wish to support. Abstraction of the HW layers made it easier to write programs for a broader audience.

Forget this argument for the moment; just create better games and people will be happy. You don't have to have all the latest whiz-bang graphics to make a great game. A great game has a good story line, is fun and engaging to play, is enjoyable, has repeatable elements, and if you play it a few years later it's still good.

You can take an OK movie and make it 3D and it doesn't make it any better; it's still an OK movie, and in 3D it may even be worse. The same goes for games: you can take an OK console game and port it to PC...

edison5do said:

People Please Read!!!

"To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way"

The guy gives all of you the answer!! He says most of the reasons for DirectX being in the way are GOOD!

Guest said:

We're already starting to see some of the problems of fragmentation; look at what happens when a title uses PhysX. If you're running an AMD card, PhysX goes to the CPU and gives you a massive performance hit, instead of going to the GPU like it does with nVidia.

The fundamental problem with "programming direct to metal" is that not every PC has the same metal to program to, which is the entire reason that the DirectX and OpenGL API's were created in the first place. Any developer who wants to go that route would either have to write different code for every piece of hardware out there - multiplying development time substantially - or else sacrifice at least half of their potential sales, probably more, as end users would have to do extensive research before purchasing anything, to check for incompatibilities.

Mizzou said:

And of course all we need is Nvidia, AMD, Intel and VIA to agree on a standard for homogeneous software coding.......bwahahahahahah

As I know you are well aware, it was and is the inability of hardware manufacturers to agree on common standards that led to the widespread adoption of API's in the first place, not only in graphics but across virtually all hardware components. Really don't think we want to return to the days of hardwiring games to a specific graphics card architecture. At least today you have a reasonable degree of assurance that your current games will probably run on the next generation of hardware.

Zilpha said:

edison5do said:

People Please Read!!!

"To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way"

The guy gives all of you the answer!! He says most of the reasons for DirectX being in the way are GOOD!

I agree - I read that part and was like, "huh?"

Gosh, I don't want them to ever get rid of the abstraction layers. How is this guy making the kind of living he is when he clearly understands nothing about what it takes to actually program something? Or maybe he does, but is trying to convince himself he is correct? Oi - guys like this should never have been promoted off the bench.

MrAnderson said:

I still think this is a sound bite to garner attention.

Without OpenGL and DirectX, console gaming would not even have gotten this far.

Even with DirectX you can get plenty out of the GPU. Even with Crysis... people made a big deal that they could not turn everything up. Oh boo hoo, was DirectX slowing it down? Please... you could run the game pretty well, and with better quality than on a console, at HD resolutions. All the extra bells and whistles are future-proofing. It only turned into negative media. Why fault a developer for building a system that you cannot fully use until the future, when even the basic settings were worlds away from what had been going on? I find it annoying...

And again... DirectX is not stopping the GPU manufacturers from giving direct access to the metal. But how many games would sell then? In order for GPUs to continue to improve, they need to be able to evolve, and the software layer allowed all this to happen... so it is like biting the hand that feeds.

Nothing is stopping developers from creating a unique project on hardware... but who is crazy enough to bet a business on it? Only academics and indie developers that are not just out for bucks would be able to experiment in this way.

lipe123 said:

You know what, the whole article is just wrong.

The reason PC graphics are not as far ahead as they could be is that the PC market share is nothing compared to consoles.

Almost every PC game released in the last 2 years is a direct console port.

Consoles make all the money and PCs just get a half-decent ported version of the same games. I personally hate consoles, because back in the day it was the reverse: PCs had the majority market share and we had the best of everything. I can't exactly fault the console guys, because their product is WAY easier to use and doesn't require an upgrade every year.

Would be nice to get some PC-exclusive games to try and grab back some market share, but then those games need to properly take advantage of the fact that a PC has more than 6 buttons and a joystick!

dividebyzero, trainee n00b, said:

I still think this is a sound bite to garner attention.

A given, I think. At this stage Richard Huddy seems to have descended to media-monkey status.

Only academics and indie developers that are not out for only bucks would be able to experiment in this way.

Academics... yes; indie... no, in my opinion.

I doubt there is an indie developer, or many game studios, with the resources to program directly for hardware.

Case in point using the current situation as example- big release- Battlefield 3:

Announced January 2011. Best-case scenario, they would need to cater for a minimum of around 20 different architectures, bearing in mind the myriad feature/instruction sets available (or not) to many main archs:

-Cayman (VLIW4) -HD6990/6970/6950

-Barts (VLIW5) - HD6870/6850

-Evergreen (previous iteration VLIW5): HD5970/58xx, along with possible minor tweaks for Juniper, Redwood and Cedar (HD57xx/56xx/55xx/54xx), as well as Mobility Radeon.

-R700 (HD4000)+ Mobility parts

-R600 (HD3000)+ Mobility parts

-Intel IGP and on-die

-Fermi scalar (GF100/110 etc.)

-Fermi Superscalar (GF104/114 etc.)+ mobile

-GT2xx + mobile

-G80/92/94/96/98/MCP7A/MCP78

-S3 Chrome 400/500 series

(basically, anything earlier than 2008 you could write off for compatibility, I think)

Add in -since the announcement:

-Turks/Caicos (tweaked VLIW5 architecture) HD64xx/65xx/66xx)

-Fusion (Zacate/Ontario) APU's

Launched at, or shortly after, the game's release:

-Kepler (GTX6xx)

-Southern Islands (HD7xxx)

-Llano APU

-Ivy Bridge

Personally, I would think that most game developers would be swallowed up whole. Big game studios would concentrate almost entirely on consoles because of the reduced overheads. AAA titles and graphics cards would share a degree of built-in redundancy... and since you are hardcoding for specific architectures, how long would it be before some bright spark decided that DRM should be part of the hardcoding, or hardwired into the GPU, if game studios become de facto subsidiaries of AMD/Nvidia/Intel...... a hundred milliseconds? Less?

Guest said:

Game consoles = MP3 compressed quality.

PC gaming = true HD quality.

If you have an eye for better graphics you'll go with the better graphics, but if lossy compressed video games don't bother you in terms of video quality, then that's your choice.

I prefer PC gaming since it offers a far more visually stunning experience.

If AMD, Nvidia or Microsoft could create an OS specific to gaming, then the market share they've lost to consoles could be won back over time.

Guest said:

I find it amusing that the Crytek demo was a console demo. The PC graphics capabilities of today far outperform the Xbox 360 and PS3.

Just as the guy from AMD was trying to convey: if the consoles can do that, then imagine what the PC hardware of today is capable of.

His point is quite valid, and it applies to Nvidia just as much as it applies to AMD.

Guest said:

Hey, it's 4k intro all over again! Too bad you don't have a GUS, I didn't include SB... sorry! But you know what, why stop at DirectX, let's take out Windows and boot into the game, shall we?

On the other hand, I must say, DirectX is like everything else the big brother produces. No further comment on this one.

Guest said:

Obviously, a gaming OS would open up a new world... I thought about that a long time ago. Maybe MS could come up with Windows Gaming Edition(R).... :)

Darth Shiv said:

Mr Huddy also pointed out that Intel's general purpose graphics performance was pathetic. So it's not only DirectX that is holding people back, but the replacement for it.

Anyway, for backwards compatibility, a GPGPU can run DirectX and OpenGL abstraction layers. Because they are general purpose!

Darth Shiv said:

Guest said:

Obviously, a gaming OS would open up a new world... I thought about that a long time ago. Maybe MS could come up with Windows Gaming Edition(R)....

Only if you are running homogeneous hardware would that beat the issue this article is talking about.

What you are asking for therefore already exists... consoles.

dividebyzero, trainee n00b, said:

His point is quite valid and it applies to Nvidia just as much as it applies to AMD.

No one is disputing that the point is valid. What is at issue is whether the present hardware abstraction layer can be done away with without incurring the obvious pitfalls of hardcoding a game for every graphics architecture in use.

All Mr Huddy is doing is pointing at a problem and saying things can be better, without giving any indication of how to solve the inherent problems of hardcoding... well, let's see just how hard it is to make a valid statement such as Mr Huddy's...

"World hunger and oppression are serious problems. Giving these people more to eat and affording them their civil rights should solve the problem"

How easy was that! Convinced? Then follow me and Richard on Twitter!

So while Richard Huddy may have a valid point, the way it is couched turns it into a vapid point - a puff piece with softball questions (not a first for the bit-tech/Huddy marriage) and very little, if anything, substantive. Of course, even when giving vague non-committal answers, PR usually manages to screw things up:

'Wrapping it up in a software layer gives you safety and security,' says Huddy, 'but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.'-Richard Huddy, AMD Developer Relations Manager 16th March, 2011

"The bottom line for us is that we support open standards, such as OpenCL and DirectCompute, we feel this to be a way to move the whole industry forward."- Neal Robison, AMD Senior Director of Content and Application Support -18th March,2011

So on one hand, API's (D3D, OpenGL/CL) are robbing - presumably - devs and AMD (along with VIA/Intel/Sony/MS etc.) of the opportunity to innovate, but AMD also feels that API's (OpenCL and DirectCompute) are the way forward... I see.

Guest said:

Why in the world would you want to run 640 x 480 now?

gingerbill said:

This article is kinda stating the obvious. It's like me saying "wouldn't cars be brilliant without petrol, god I'm a genius". I do think now, though, that PC graphics are far ahead of consoles again; for a few years after the consoles came out it was just a slight edge, but now it seems pretty big to me again. I'm an avid PC and console gamer, but the console is losing its appeal as the PC pulls far ahead again and all console games start to have the same washed-out look about them. It was impressive what they got out of the consoles, performance-wise.

Guest said:

Considering AMD can't even keep OpenGL support working properly for their own cards from one driver release to the next, I don't see what this guy is pissfarting about... Maybe if AMD worked with and supported game devs more, like Nvidia does, games would look better on their hardware.

T77 T77 said:

The games may not be ten times as good, but they are surely better than their console counterparts.

I hope everyone agrees.
