AMD: DirectX "getting in the way" of PC gaming graphics

I think @dividedbyzero definitely summed up the issue best. It's really a bit of a "duh" statement, that DirectX gets in the way in gaming. It's the middle man, giving a constant and simplified interface to a huge variety of (often very different) graphics hardware.

Would things run much faster without DirectX (or OpenGL) in the middle? Certainly, you can't deny that. But that would push us back to the early GPU gaming days, when a game only worked (or only worked well) on specific graphics hardware. Even the basic divergence between generations within AMD or nVidia would cause havoc with compatibility and support. It would basically come down to this: either game programmers code to a specific GPU platform and the users of other GPUs lose out (cutting off a potentially massive consumer base)... or they code multiple variants to cover every GPU possibility, greatly increasing the complexity, time and cost of producing the game.

If you really wanted to make the "PC gaming is dying" comments a reality, throw APIs like OpenGL and DirectX out, then see how fast game developers abandon the PC ship for the smoother waters of consoles.
 
At one point things were written directly to the HW, however that meant you had to write for each and every configuration you wished to support. Abstraction of the HW layers made it easier to write programs for a broader audience.
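The idea of abstracting the hardware layer can be sketched in a few lines. This is a hypothetical illustration, not real graphics code; all class and function names here are invented. The point is that the game ships one code path against a common interface (the role DirectX or OpenGL plays), while the per-vendor differences live behind that interface.

```python
class Renderer:
    """Common interface every backend must implement (the 'DirectX' role)."""
    def draw_triangle(self, verts):
        raise NotImplementedError

class VendorABackend(Renderer):
    """Hypothetical vendor-specific code path, hidden from the game."""
    def draw_triangle(self, verts):
        return f"vendor-A command stream for {len(verts)} verts"

class VendorBBackend(Renderer):
    def draw_triangle(self, verts):
        return f"vendor-B command stream for {len(verts)} verts"

def make_renderer(gpu_id):
    # The abstraction layer picks the backend at runtime;
    # the game code above this line never changes per GPU.
    backends = {"vendor_a": VendorABackend, "vendor_b": VendorBBackend}
    return backends[gpu_id]()

# Identical game code, whatever the hardware:
tri = [(0, 0), (1, 0), (0, 1)]
for gpu in ("vendor_a", "vendor_b"):
    print(make_renderer(gpu).draw_triangle(tri))
```

Without the common `Renderer` interface, every new architecture would mean another hand-written code path inside the game itself.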

Forget this argument for the moment; just create better games and people will be happy. You don't have to have all the latest whiz-bang graphics to make a great game. A great game has a good storyline, is fun and engaging to play, is enjoyable, has replayable elements, and is still good if you play it a few years later.

You can take an OK movie and make it 3D and it doesn't make it any better; it's still an OK movie, and in 3D it may even be worse. The same goes for games: you can take an OK console game and port it to PC…
 
People Please Read!!!

"To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way"

The guy gives all of you the reason!! He says most of the reasons for DirectX being in the way are GOOD!
 
We're already starting to see some of the problems of fragmentation; look at what happens when a title uses PhysX. If you're running an AMD card, PhysX goes to the CPU and gives you a massive performance hit, instead of going to the GPU like it does with nVidia.
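The PhysX situation described above is a capability-dispatch problem: the middleware checks what the detected hardware supports and falls back to a much slower path otherwise. A minimal sketch of that pattern, with invented names and invented cost numbers purely for illustration:

```python
def gpu_supports_physx(vendor):
    # Hypothetical: only one vendor's GPUs accelerate the middleware.
    return vendor == "nvidia"

def run_physics(vendor, n_bodies):
    """Return which path ran and a made-up relative cost."""
    if gpu_supports_physx(vendor):
        return "gpu", n_bodies * 1    # fast hardware-accelerated path
    return "cpu", n_bodies * 20       # fallback: the "massive performance hit"

print(run_physics("nvidia", 1000))
print(run_physics("amd", 1000))
```

The same simulation workload lands on wildly different code paths depending on the brand of card, which is exactly the fragmentation an open, vendor-neutral API is meant to avoid.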

The fundamental problem with "programming direct to the metal" is that not every PC has the same metal to program to, which is the entire reason the DirectX and OpenGL APIs were created in the first place. Any developer who wants to go that route would either have to write different code for every piece of hardware out there - multiplying development time substantially - or else sacrifice at least half of their potential sales, probably more, as end users would have to do extensive research before purchasing anything to check for incompatibilities.
 
And of course all we need is Nvidia, AMD, Intel and VIA to agree on a standard for homogeneous software coding.......bwahahahahahah

As I know you are well aware, it was and is the inability of hardware manufacturers to agree on common standards that led to the widespread adoption of APIs in the first place, not only in graphics but across virtually all hardware components. I really don't think we want to return to the days of hardwiring games to a specific graphics card architecture. At least today you have a reasonable degree of assurance that your current games will probably run on the next generation of hardware.
 
edison5do said:
People Please Read!!!

"To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way"

The guy gives all of you the reason!! He says most of the reasons for DirectX being in the way are GOOD!

I agree - I read that part and was like, "huh?"
Gosh, I don't want them to ever get rid of the abstraction layers - how is this guy making the kind of living he is when he clearly understands nothing about what it takes to actually program something? Or maybe he does, but is trying to convince himself he is correct? Oi - guys like this should have never been promoted off the bench.
 
I still think this is a sound bite to garner attention.

Without OpenGL and DirectX, console gaming would not even have gotten this far.

Even with DirectX you can get plenty out of the GPU. Even with Crysis... people made a big deal that they could not turn everything up. Oh boo hoo, was DirectX slowing it down? Please... you could run the game pretty well, and at HD resolutions with better quality than on a console. All the extra bells and whistles are future-proofing. It only turned into negative media coverage. Why fault a developer for building a system that you cannot fully use until the future, when even the basic settings were worlds away from what had come before? I find it annoying...

And again... DirectX is not stopping the GPU manufacturers from giving direct access to the metal. But how many games would sell then? In order for GPUs to continue to improve, they need to be able to evolve, and the software layer is what allowed all this to happen... so it is like biting the hand that feeds.

Nothing is stopping developers from creating a unique project tied to specific hardware... but who is crazy enough to bet a business on it? Only academics and indie developers that are not out only for bucks would be able to experiment in this way.
 
You know what, the whole article is just wrong.

The reason PC gfx is not as far ahead as it could be is because the PC market share is nothing compared to consoles.
Almost every PC game released in the last 2 years is a direct console port.

Consoles make all the money and PCs just get a half-decent working port of the same games. I personally hate consoles because back in the day it was the reverse: PCs had the majority market share and we had the best of everything. I can't exactly fault the console guys because their product is WAY easier to use and doesn't require an upgrade every year.

Would be nice to get some PC-exclusive games to try and grab back some market share, but then those games would need to properly take advantage of the fact that a PC has more than 6 buttons and a joystick!
 
I still think this is a sound bite to garner attention.
A given I think. At this stage Richard Huddy seems to have descended to media monkey status.
Only academics and indie developers that are not out only for bucks would be able to experiment in this way.
Academics...yes, Indie...no in my opinion.
I doubt there is an indie developer, or many game studios, with the resources to program directly for hardware.
Case in point using the current situation as example- big release- Battlefield 3:
Announced January 2011. Best case scenario, they need to cater for a minimum of around 20 different architectures - bearing in mind the myriad feature/instruction sets available (or not) to many of the main architectures:
-Cayman (VLIW4) -HD6990/6970/6950
-Barts (VLIW5) - HD6870/6850
-Evergreen (previous iteration VLIW5): HD5970/58xx, along with possible minor tweaks for Juniper, Redwood and Cedar (HD57xx/56xx/55xx/54xx) as well as Mobility Radeon.
-R700 (HD4000)+ Mobility parts
-R600 (HD3000)+ Mobility parts
-Intel IGP and on-die
-Fermi scalar (GF100/110 etc.)
-Fermi Superscalar (GF104/114 etc.)+ mobile
-GT2xx + mobile
-G80/92/94/96/98/MCP7A/MCP78
-S3 Chrome 400/500 series
(basically anything earlier than 2008 you could write off in compatibility I think)

Add in -since the announcement:
-Turks/Caicos (tweaked VLIW5 architecture): HD64xx/65xx/66xx
-Fusion (Zacate/Ontario) APUs

Launched at, or shortly after, the game's release:
-Kepler (GTX6xx)
-Southern Islands (HD7xxx)
-Llano APU
-Ivy Bridge

Personally, I would think that most game developers would be swallowed up whole. Big game studios would concentrate almost entirely on consoles because of the reduced overheads. AAA titles and graphics cards would share a degree of built-in redundancy... and since you are hardcoding for specific architectures, how long would it be before some bright spark decided that DRM should be part of the hardcoding, or hardwired into the GPU, if game studios become de facto subsidiaries of AMD/Nvidia/Intel? ...a hundred milliseconds? Less?
 
Game consoles = MP3 compressed quality.

PC gaming = true HD quality.

If you have an eye for better graphics you'll go with the better graphics, but if lossy compressed video games don't bother you in terms of video quality, then that's your choice.

I prefer PC gaming since it offers a far more visually stunning experience.

If AMD, Nvidia or Microsoft could create an OS specific to gaming, then the market share they've lost to consoles could be won back over time.
 
I find it amusing that the Crytek demo was a console demo. The PC graphics capabilities of today far outperform those of the Xbox 360 and PS3.

Just like the guy from AMD was trying to convey: if the consoles can do that, then imagine what the PC hardware of today is capable of doing.

His point is quite valid and it applies to Nvidia just as much as it applies to AMD.
 
Hey, it's 4k intro all over again! Too bad you don't have a GUS, I didn't include SB... sorry! But you know what, why stop at DirectX, let's take out Windows and boot into the game, shall we?

On the other hand, I must say, DirectX is like everything else the big brother produces. No further comment on this one.
 
Obviously, a gaming OS would open up a new world... I thought about that a long time ago. Maybe MS could come up with Windows Gaming Edition(R).... :)
 
Mr Huddy also pointed out that Intel's general purpose graphics performance was pathetic. So it's not only DirectX that is holding people back, but the replacement for it.

Anyway, for backwards compatibility, a GPGPU can run DirectX and OpenGL abstraction layers. Because they are general purpose!
 
Guest said:
Obviously, a gaming OS would open up a new world... I thought about that a long time ago. Maybe MS could come up with Windows Gaming Edition(R).... :)
Only if you are running homogeneous hardware would that beat the issue this article is talking about.

What you are asking for therefore already exists... consoles.
 
His point is quite valid and it applies to Nvidia just as much as it applies to AMD.
No one is disputing that the point is valid. What is at issue is whether the present hardware abstraction layer can be done away with without incurring the obvious pitfalls of hardcoding a game for every graphics architecture in use.
All Mr Huddy is doing is pointing at a problem and saying it can be better, without giving any indication of how to solve the inherent problems of hardcoding... well, let's see just how hard it is to make a valid statement such as Mr. Huddy's...
"World hunger and oppression are serious problems. Giving these people more to eat and affording them their civil rights should solve the problem"
How easy was that! Convinced? Then follow me and Richard on Twitter!

So while Richard Huddy may have a valid point, the way it is couched turns it into a vapid point - a puff piece with softball questions (not a first for the bit-tech/Huddy marriage) and very little, if anything, substantive. Of course, even when giving vague non-committal answers, PR usually manages to screw things up:
'Wrapping it up in a software layer gives you safety and security,' says Huddy, 'but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.'-Richard Huddy, AMD Developer Relations Manager 16th March, 2011
"The bottom line for us is that we support open standards, such as OpenCL and DirectCompute, we feel this to be a way to move the whole industry forward."- Neal Robison, AMD Senior Director of Content and Application Support -18th March,2011

So on one hand, APIs (D3D, OpenGL/CL) are robbing - presumably - devs and AMD (along with VIA/Intel/Sony/MS etc.) of the opportunities to innovate, but AMD also feels that APIs (OpenCL and D3D) are the way forward... I see.
 
This article is kinda stating the obvious. It's like me saying, "wouldn't cars be brilliant without petrol, god I'm a genius". I do think PC graphics are now far ahead of consoles again; for a few years after the consoles came out it was just a slight edge, but now it seems pretty big to me again. I'm an avid PC and console gamer, but the console is losing its appeal as the PC pulls far ahead again and all console games start to have the same washed-out look about them. It was impressive what they got out of the consoles performance-wise.
 
Considering AMD can't even keep OpenGL support working properly for their own cards from one driver release to the next, I don't see what this guy is pissfarting about... Maybe if you (AMD) worked with and supported game devs more, like nVidia does, their games would look better on your hardware.
 
The games may not be ten times as good, but they are surely better than their counterparts.
I hope everyone agrees.
 