Microsoft to unveil DirectX 12 at GDC

What I am wondering is: if AMD is going to be supporting DirectX 12, then they probably knew about this philosophy of getting closer to the 'metal' long before the general public did. And yet they already released an API of their own that does this. Why would they make such a dumb move? It almost sounds like DirectX 12 will make AMD's Mantle API obsolete in record time. Nobody is going to want to write code for two different APIs for their games.
I'm starting to view AMD as Mr. Bill. They just can't catch a break. They try and try, and then a long tall bus comes along and squashes them.


I see it differently. AMD has always been on the side of pushing the envelope. Look at integrated memory controllers: they were the first to come out with that, with the Athlon 64. Intel merely copied it and made a more successful version of it. AMD was also the very first to mass-produce an x86 processor built around a RISC-based core, before Intel went from multi-stage CISC to a CISC/RISC hybrid and, again, made a more successful version of it. AMD merely lacks follow-through and suffers from poor management. Without companies like AMD poking at the innovation boundaries, you're left with giant corps not thinking outside the box. I applaud AMD for all their innovations; not all saw fruition, but they nonetheless made progress for the general tech industry.
 
I see it differently. AMD has always been on the side of pushing the envelope. Look at integrated memory controllers: they were the first to come out with that, with the Athlon 64.

Since AMD is nowhere with their processors today, the graphics division is their last straw. And even there, they have had problems with the quality of the hardware and the drivers for years. Their latest offerings suffer from excessive noise and heat. The only way they can keep it afloat is by throwing in everything they've got in terms of technology, ready or not. And that's what we all see nowadays, whereas nVidia products look far more mature and refined.
 
Since AMD is nowhere with their processors today, the graphics division is their last straw. And even there, they have had problems with the quality of the hardware and the drivers for years. Their latest offerings suffer from excessive noise and heat. The only way they can keep it afloat is by throwing in everything they've got in terms of technology, ready or not. And that's what we all see nowadays, whereas nVidia products look far more mature and refined.

I happen to go back and forth between Nvidia and ATI and find that in terms of driver and heat issues, they both suck, which is why I always mod my own graphics card and overclock on my own. So at least on that front, I find both of them to be at the same level.
 
What I am wondering is: if AMD is going to be supporting DirectX 12, then they probably knew about this philosophy of getting closer to the 'metal' long before the general public did. And yet they already released an API of their own that does this. Why would they make such a dumb move? It almost sounds like DirectX 12 will make AMD's Mantle API obsolete in record time. Nobody is going to want to write code for two different APIs for their games.
I'm starting to view AMD as Mr. Bill. They just can't catch a break. They try and try, and then a long tall bus comes along and squashes them.

I have a feeling AMD didn't know about this, and that this is more or less a direct response to them, and to the mobile community with the addition of Qualcomm. Microsoft may not pay much attention to us PC gamers, but they do have a stranglehold over us with DirectX, and AMD is encroaching on that. Unfortunately for AMD, Mantle has been very slow going, and depending on Microsoft's timeline, this may kill off any real Mantle adoption before the majority of gamers even experience it.
 
Whoa, it took ages for games to support DX10 when it came out, and then DX11, and it seems only 20-30 games are using it. But the big question is how many are truly utilizing DX11, instead of just tacking it on or using one feature out of the many it has.
 
I guess this means I'll be replacing my GTX 770s sooner rather than later (it's a shame, too, since I really like the cards).

There were right around 25 games released for DX11 in its first 2 1/2 years, and only around 90 over its entire 4 1/2 years.

I think you will be fine with your 770s for quite some time yet.

Dave
 
Modern video cards could easily support updated versions of DirectX through a firmware and software update. The only reason they do not do that is to maximize profit from selling negligible updates over and over again. The famous trio of Microsoft + ATI + nVidia has been doing it for years, and it has worked out well for them so far, sucking money from your pockets. It is difficult for an untrained eye to see through this, because in that area they are a monopoly now, and they sing the same song everywhere.

EDITED: BTW, I am an owner of an nVidia GTX 780, which I purchased to keep for 4 years. Before that it was a Radeon 5870, which I used for 4 years. Although I do buy premium systems, I do it only once every 4 years, to spend on true updates rather than minor ones.
We know this, but it's sales that keep them in business. What we read and hear about phones, computers, tablets, etc. being obsolete and dated as soon as a new model has released is bull feces, but it generates billions in sales. Like yourself, I recently upgraded my rig (my wallet's still receiving CPR, btw); I work on a four-year cycle as well.
 
I hope we see ray tracing in DX12!
Ask AMD's Roy Taylor - he seems to have his finger on the pulse ;)

Ray Tracing would be great, but we'd need to see much more powerful GPUs, or the API would have to be far more efficient than DX11. Either way, PC gaming would instantly become what the NEXT-gen consoles MAY look like, about 5 years ahead of the curve.

Even if Ray Tracing isn't in the cards, it'd be great if midrange cards became useful again. I remember when a $300 video card was the king of the hill in performance, but you got pretty much the same graphics fidelity out of a $150-$200 card. Today you need to drop $250 just to play a game (1080p/60 is the minimum in my book), and if you want max details you're looking at $500. (This gen I've owned an AMD 7770, a 7850/R9 265, a GTX 680/770, a GTX 760, an AMD R9 290, and a GTX 780, so I've seen quite a few.) Since the Titan, things have gotten out of hand. Hopefully this brings price/performance back to Earth.
 
Ray Tracing would be great, but we'd need to see much more powerful GPUs, or the API would have to be far more efficient than DX11. Either way, PC gaming would instantly become what the NEXT-gen consoles MAY look like, about 5 years ahead of the curve.
I wouldn't get your hopes up just yet. There is a very good chance that DX12 might just be a software-orientated revision of the DX11 spec: maximising draw calls, minimising overhead, that sort of thing. With Qualcomm on board, it tends to point to software rather than hardware, as does the abbreviated timeframe. DirectX 10 had a gestation of 2+ years; DirectX 11 got underway around the same time DX10.1 came out and took a year and a half or so to reach primetime.
All in all it seems a knee-jerk reaction to the OpenGL extensions, Mantle, and MS's move into the mobile space, so it wouldn't surprise me if DX12 were compatible with DX11 (or even DX10) hardware, excepting some of the hardware-specific 11.2 features.

A lot depends upon whether the 20th marks the launch of the specification, or just some preliminary PR bumf/wishlist.
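For anyone wondering what "maximising draw calls, minimising overhead" looks like in practice, here is a minimal sketch in plain D3D11. The function names and scene size are made up for illustration, and it assumes the objects share one mesh with per-instance data in a second vertex buffer (setup omitted); it is not anything Microsoft has published about DX12.
[code]
#include <d3d11.h>

// Naive approach: one DrawIndexed per object. Every call crosses into the
// driver, which re-validates state and builds GPU commands -- the CPU-side
// overhead that Mantle and (presumably) DX12 aim to shrink.
void DrawNaive(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    for (UINT i = 0; i < objectCount; ++i)
    {
        // (per-object constant-buffer updates omitted)
        ctx->DrawIndexed(indexCount, 0, 0);   // objectCount driver round trips
    }
}

// DX11-era workaround: hardware instancing folds the loop into one call,
// so the driver overhead is paid once instead of objectCount times.
void DrawInstancedBatch(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    ctx->DrawIndexedInstanced(indexCount, objectCount, 0, 0, 0);
}
[/code]
A lower-overhead API attacks the same problem from the other side: instead of forcing you to batch, it makes each individual call cheap enough that tens of thousands of them per frame become viable.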
 
Ray Tracing would be great, but we'd need to see much more powerful GPUs, or the API would have to be far more efficient than DX11. Either way, PC gaming would instantly become what the NEXT-gen consoles MAY look like, about 5 years ahead of the curve.

Even if Ray Tracing isn't in the cards, it'd be great if midrange cards became useful again. I remember when a $300 video card was the king of the hill in performance, but you got pretty much the same graphics fidelity out of a $150-$200 card. Today you need to drop $250 just to play a game (1080p/60 is the minimum in my book), and if you want max details you're looking at $500. (This gen I've owned an AMD 7770, a 7850/R9 265, a GTX 680/770, a GTX 760, an AMD R9 290, and a GTX 780, so I've seen quite a few.) Since the Titan, things have gotten out of hand. Hopefully this brings price/performance back to Earth.
Currently you need a powerhouse GPU for ray tracing. nVidia was able to show off ray tracing, but with dual Tesla K20s.
 
Actually, a video game scene can be rendered with acceptable performance (720p at 30 FPS) via path tracing on today's high-end video cards: [Link]
 
Actually, a video game scene can be rendered with acceptable performance (720p at 30 FPS) via path tracing on today's high-end video cards: [Link]
The definition of acceptable performance differs from person to person. 720p on a GTX Titan with that level of grain (noise) still looks like a work in progress to me. Brigade (to my understanding) isn't looking past 30 fps, and isn't appreciably different from established ray/path tracing renderers like Octane (here's the same scene from your link rendered via Octane) and V-Ray RT.

EDIT: In regards to my previous post surmising that DX12 might be software-orientated rather than hardware, the DX12 Twitter feed now also tends to support that view.
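On the grain question: a path tracer builds each pixel as a Monte Carlo average over random light paths, and the noise only falls off as 1/sqrt(samples), which is why a real-time budget of a few samples per pixel looks grainy even on a Titan. A toy sketch of that averaging (the "radiance" here is a random stand-in; this is not Brigade's or Octane's actual code):
[code]
#include <cstdio>
#include <initializer_list>
#include <random>

// Stand-in for "trace one random light path and return the radiance it
// carries". A real renderer does ray/scene intersection and BRDF sampling.
double TraceOnePath(std::mt19937& rng)
{
    std::uniform_real_distribution<double> fakeRadiance(0.0, 2.0);
    return fakeRadiance(rng);
}

// Average N paths per pixel. The estimate's standard deviation shrinks as
// 1/sqrt(N): you need 4x the samples to halve the grain.
double EstimatePixel(std::mt19937& rng, int samplesPerPixel)
{
    double sum = 0.0;
    for (int s = 0; s < samplesPerPixel; ++s)
        sum += TraceOnePath(rng);
    return sum / samplesPerPixel;
}

int main()
{
    std::mt19937 rng(42);
    for (int n : {4, 16, 64, 256})
        std::printf("%3d spp -> pixel estimate %.3f\n", n, EstimatePixel(rng, n));
}
[/code]
Run it and the estimates visibly settle as the sample count climbs, which is exactly the convergence you are watching (slowly) happen in a noisy real-time path-traced frame.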
 