TechSpot is dedicated to computer enthusiasts and power users.

AMD shows off DirectX 11 graphics chip

By Jos · 16 replies
Jun 3, 2009
  1. This morning at Computex, Rick Bergman from AMD gave attendees a first peek at the world’s first graphics processor capable of supporting DirectX 11. The demonstration was not so much about the specific hardware the company plans to introduce later in 2009, but rather to show some of the improvements DX11 brings over its predecessor and to assure us all that it will deliver the technology first.

    Read the whole story
  2. Star*Dagger

    Star*Dagger TS Rookie

    DX11 will be the most important advance in PC graphics since the 3dfx cards.

    Note: Keep those Codemasters devs in MP3s; they should stay away from video interviews. The guys at Emergent looked decent, though.
  3. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,633   +695

    "DX11 will be the most important advance in PC graphics since the 3dfx cards."

    That's a pretty bold statement, Star*Dagger. What is it about DX11 that makes you believe that?
  4. Captain828

    Captain828 TS Guru Posts: 287   +10

    While no one can know for sure if DX11 will be the next "DX9" boom, IMHO DX11's success will rest heavily on Windows 7's adoption rate.

    Vista didn't do so well, so "true" DX10 games were few, even though the latest Steam survey data show DX10 hardware as being widely available. And let's not forget how taxing Crysis and WiC were when they first came out... and they still are.
    This made it difficult for a dev to make full use of DX10, and most tended to go the DX9 path instead, which makes sense.

    Though I agree with Tom that S*D's comment is a bold one, DX11 will finally bring features that can truly help with rendering.
    Those mentioned in the article are some of them. Others would be less taxing AA on the GPU and an even more programmable pipeline than before.
  5. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,633   +695

    That makes sense. DX10 just never really happened.....
  6. Rage_3K_Moiz

    Rage_3K_Moiz Sith Lord Posts: 5,443   +36

    Tessellation and multi-threading are the main reasons. Especially the fact that tessellation is being done without triangles for the first time ever, making it quite a leap. With the new method, massive amounts of detail are possible, and it can potentially eliminate the pop-up effect common in games with large open environments, instead allowing developers to have a continuous level of detail all over the environment, making games more immersive than ever before. It will allow developers to take games to the next level.
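The continuous level-of-detail idea above can be sketched outside of any graphics API. Here is a minimal, hypothetical Python illustration (function name and constants are invented, not part of Direct3D) of a distance-based tessellation factor, the kind of per-patch quantity a DX11 hull shader computes:

```python
# Illustrative sketch only, not actual DX11 API code: continuous
# level of detail via a distance-based tessellation factor, the idea
# behind DX11's new tessellation stages. All names/constants are
# hypothetical.

def tess_factor(distance, near=5.0, far=200.0, max_factor=64.0):
    """Return a tessellation factor that falls off smoothly with
    distance, so geometry detail scales continuously instead of
    popping between discrete LOD meshes."""
    # Normalize distance into [0, 1] over the near..far range
    t = min(max((distance - near) / (far - near), 0.0), 1.0)
    # Interpolate from maximum subdivision down to 1 (no subdivision)
    return max_factor * (1.0 - t) + 1.0 * t

# A patch right at the near plane is heavily subdivided...
print(tess_factor(5.0))    # 64.0
# ...while a patch at the far plane collapses to a single,
# untessellated primitive -- no sudden LOD "pop" in between.
print(tess_factor(200.0))  # 1.0
```

Because the factor varies smoothly with distance, detail ramps down continuously as objects recede, which is exactly how tessellation can eliminate the pop-up effect described above.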
  7. TBolt

    TBolt TS Rookie Posts: 65

    Looking forward to it... but time will tell whether DX11 is a "revolution" or "evolution".

    That being said, DxDiag in the Win7 RC is reporting DX11...I assume that if I had a DX11 card now or get one prior to the RC going inactive next year I'd be able to see what the whole skinny is? Or is the DX11 being reported in the RC just a tease and actually DX 10.1 or something (Nvidia's control panel - most current drivers - is reporting it as DX10)?
  8. Proofix

    Proofix TS Rookie Posts: 23

    @Captain828 Very True m8!!
  9. DX10 hasn't had much support, mainly due to Nvidia not adding it to their own cards. Nvidia has tried to control the gaming market, and they even forced one game company to remove DX10.1 support. It's Nvidia's fault.
  10. PanicX

    PanicX TechSpot Ambassador Posts: 669

    I'd tend to think it's Microsoft's fault. Windows XP is still the most widely used OS, especially by gamers, and Microsoft refused to give XP DirectX 10 support.

    Why would developers code games and purposely exclude their largest market?
    There's DX10 support, and there's DX10-exclusive development... DX10 support is everywhere, in both hardware and games - look at many of the latest games, and they have optimized settings for DX10, but they still support DX9 modes. What Microsoft tried to accomplish was to have developers program EXCLUSIVELY in DX10, thus absolutely requiring an upgrade to Vista. That heavy-handed attempt to shove a buggy operating system down gamers' throats made many developers backpedal fast and keep the DX9 support going, so they didn't alienate their customer base. There were only a small handful of titles that came out in pure DX10 format, like Shadowrun and one of the Halo series, I believe? Sales were pretty dismal on those titles, btw...

    If game developers support DX11, more power to them. At least DX11 will work on Vista as well, so those out there who bought a PC recently (or for some reason upgraded) won't be left out in the cold. But if the developers don't go DX9/DX11, the XP gaming community will be forced to upgrade. Unless, of course, the DX11 API is released to support XP as well... But we all know that won't happen, just as the DX10 API wasn't ever integrated into XP (and there was no reason it couldn't other than cutting into Vista sales profits).
  12. A little off there. DX10 support is in all the latest nVidia products. If you mean DX10.1, it's fully supported in all the ATi products, but really means very little - it was a caveat added to the DX10 standard, and has no real bonus effect or bearing on gaming in general (other than some basic optimization and parallel cube mapping, which is effectively ignored by most developers).

    nVidia has chosen to just concentrate on developing to the DX11 standard, since any DX11 cards will have to comply with the DX10.1 API along with all the new stuff. Why spend the time tinkering with a minor tweak that won't make any difference, when you can concentrate on making hardware and drivers that regularly squash the competition's entry? As a long-time ATi user, I always cringe seeing nVidia trounce comparable ATi card performance... *sigh*
  13. raybay

    raybay TS Evangelist Posts: 7,241   +10

    Microsoft continues to make the same mistake over and over... forcing users into upgrades that make money for Microsoft... rather than creating software so appealing that most users would want to leap to the latest advance...
    After dealing with hundreds upon hundreds of unhappy Vista users, and impossible downgrades to XP Pro because the drivers do not exist and I don't have the time to rewrite the existing drivers... I am beginning to resent Microsoft.
    As a stockholder, I see Microsoft becoming more stupid while costing itself lost profits.

    The video graphics issue with DX9 vs. DX10 vs. the possibilities in DX11 quickly and easily highlights how users and developers are being disrespected by the Microsoft folks... to the loss of everybody, including Microsoft.
  14. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,223   +163

    I don't agree with Nvidia "trouncing" ATI. Nvidia's answer thus far has been to hang on to the same architecture and bolt two cards together in the name of sheer horsepower, and while the GTX 295 is the current frame-rate champ, I prefer ATI's approach of getting more performance through efficiency and innovation. I have to wonder why Nvidia will not make use of GDDR5; I think it's logical to believe it's because of a limit in their architecture. I'm sure ATI will have an answer coming in response to the 285/295; meanwhile, Nvidia has run out of bandwidth. If I understand it correctly, the GTX 295 pushes 222 GB/s, and the max transfer rate of PCIe 2.0 is 250 GB/s. The GTX 300, due out soon, will have to exceed the PCIe 2.0 limit, and PCIe 3.0 with its 500 GB/s transfer rate is not due out for another 3 years. I have more faith that ATI will design chips that are more efficient and optimized, as the GPU has become as important as the CPU for the enthusiast and bumps up against the current limits of the PCIe bus.
    Okay, now the disclaimer: I have had so much fun with my triple-CrossFire machine, I'm fixing to build a quad-CrossFire monster. So if I'm all wet about this theory, someone set me straight. No fanboys please; I'm interested in facts and learning something.

  15. Rage_3K_Moiz

    Rage_3K_Moiz Sith Lord Posts: 5,443   +36

    I agree with red1776. NVIDIA's approach is akin to that of GM: chips that are blazing fast, but hot and power-hungry as well (compared to competing products). AMD has always been the innovator in the GPU and CPU industry, although they have been slow to catch up to their competitors of late (take the ATI HD 3000 series and the Phenom series, for example). But their new products are fiercely competitive, with the HD 4000 series and Phenom II chips offered at far more attractive price points than competing products.

    That being said however, I don't mind NVIDIA's approach; competition, even if it is only between two companies, ultimately benefits me, the consumer.

    @red1776, PCI-E 2.0 has a transfer speed of 500MB/s per lane; it isn't possible to push more than 8GB/s per GPU. Are you mixing up the rates?
  16. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,223   +163

    @ Rage:
    Oops, my bad, brain cramp. :)

    per lane:

    pcie 1.0 250MB/s
    pcie 2.0 500MB/s
    pcie 3.0 1 GB/s

    And yes, I think the competition is great as well.
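As a quick sanity check on the corrected figures, the per-lane rates above scale to full-link bandwidth like this (a rough Python sketch; the names are illustrative, and the PCIe 3.0 rate is the rounded ~1 GB/s figure, not the exact spec value):

```python
# Effective per-lane rates after encoding overhead, per the list above.
# PCIe 1.0/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b, so its
# ~985 MB/s per lane is rounded to 1000 here for simplicity.
PER_LANE_MB_S = {
    "PCIe 1.0": 250,
    "PCIe 2.0": 500,
    "PCIe 3.0": 1000,
}

def link_bandwidth_gb_s(gen, lanes=16):
    """Total one-way bandwidth of an x<lanes> link, in GB/s."""
    return PER_LANE_MB_S[gen] * lanes / 1000

# An x16 PCIe 2.0 slot tops out around 8 GB/s per direction -- the
# figure Rage_3K_Moiz quotes, and far below the ~222 GB/s mentioned
# earlier, which is the GTX 295's on-board memory bandwidth, not bus
# bandwidth. The two numbers measure different things.
print(link_bandwidth_gb_s("PCIe 2.0"))  # 8.0
```

This is why a GPU's memory-bandwidth figure can dwarf the PCIe limit without any contradiction: textures and framebuffers live in local VRAM, and only a fraction of that traffic ever crosses the bus.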
  17. T77

    T77 TS Enthusiast Posts: 300   +6

    DX10 has had support in many games. These games also have a DX9 mode so that they can run on XP. Nvidia cards have support for DX10, but not DX10.1; ATI cards have support for DX10.1.
    I don't think there's any fault on Nvidia's part.
    It's MS who restricted DX10 to Vista to boost its sales.
Topic Status:
Not open for further replies.
