The History of the Modern Graphics Processor, Part 1

By Julio Franco · 37 replies
Mar 27, 2013
  1. Scalibq

    Scalibq TS Rookie

    IBM actually did go beyond VGA: they had the 8514/A adapter and XGA. It's just that neither achieved the standard status of MDA/CGA/EGA/VGA, mainly because third-party SVGA solutions were much cheaper (and not compatible with 8514/A or XGA), so that's what went into most clones.

    Speaking of CGA, there seems to be an error in the article:
    "This became the basis for the IBM PC’s Monochrome and Color Display Adapter (MDA/CDA) cards of 1981"
    It's Color Graphics Adapter: CGA. Not CDA.

    Not at all. Not sure why people see it that way. I think the perception is severely skewed by the fact that early GeForce/Radeon cards benefited from their T&L only in OpenGL, because D3D was lower-level and T&L acceleration could not be integrated without an API overhaul.
    Microsoft already fixed that in DX7 though. And DX8/9 were mostly evolutionary updates from DX7 (DX8 adding programmable shaders and making windowed rendering easier to do, DX9 mainly updating shaders to SM2.0 and later 3.0).

    That is not stupid at all. What they update is the DirectX runtime. Microsoft updates these runtimes from time to time to fix some bugs, improve the compiler, things like that.
    Games have to make sure that the runtime on the system is at least as new as the one that the game is compiled against. The easiest way to do that is to run Microsoft's DirectX runtime installer, which will automatically update any components if necessary.
  2. Mbloof

    Mbloof TS Rookie Posts: 56   +7

    I call it a 'train wreck' because it kept changing: users had to match card, driver, and DX version on every new game release. Oddly, we are still stuck with games attempting to install DX.
  3. Scalibq

    Scalibq TS Rookie

    That doesn't make sense.
    Firstly, you did NOT have to match card, driver, and DX version. Up to and including DX7, all lower driver versions were supported (and obviously also video cards which didn't support all the latest features). After that, a minimum driver version was imposed, but aside from DX10, this version was always lower than the latest API version (e.g. a DDI6 driver for DX8, a DDI7 driver for DX9, and a DDI9 driver for DX11).

    Secondly, change is not necessarily a bad thing. Video cards evolved at an alarming rate, and Microsoft kept updating their API to incorporate the latest features. OpenGL also received tons of API updates, and things spun out of control with vendor-specific extension hell in an attempt to keep up, which is why OpenGL was abandoned in favour of D3D. If anything, OpenGL was the train wreck.

    And I already explained why games install the DX runtime. Next time, bother to read my WHOLE post. I even linked to the DX runtime so you can read in Microsoft's own words why it should be installed:
    "The DirectX redist installation includes all the latest and previous released DirectX runtime. This includes D3DX, XInput, and Managed DirectX components."

    Note: latest AND previous versions. For certain parts of the runtime, several versions are installed side-by-side. Not having the latest version installed will mean the game cannot start because it cannot resolve a dependency on certain DLLs. So, they HAVE to install it.
  4. Justin Powell

    Justin Powell TS Rookie

    I'm sure you're going to mention the N64 at some point, because 3dfx wasn't the game-changer. The N64 came out in Japan in June/July of 1996, a full six months before 3dfx came around. It was the first mainstream product that the average Joe could afford, so I believe it deserves more credit than 3dfx.
  5. ATI's Mach series was awesome. I had a Mach32 VLB card in my 486, along with a VLB I/O board too.

    I still have a Mach64 PCI card, and it's still in active use too. Only as a graphics card for an ESXi VMware box that rarely sees a monitor, but for access to the text console it's a great card. It hardly uses any power, and it only takes a PCI slot, leaving the PCIe slots free for other cards, etc.

    It's dated 1995/96. Not bad for a graphics card to still be in active service 17 or 18 years later.
  6. Awesome article, waiting for part two. :)
  7. digitalsyrup

    digitalsyrup TS Rookie

    OMFG. I really feel old now. I've seen and had pretty much all of the cards in this article in the PCs I've owned growing up. Thanks for the stroll down memory lane!
  8. Nice article
  9. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,224   +164

    Remember these?
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

  11. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,224   +164

    Was that ISA, or an even earlier interface?
  12. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,224   +164

  13. UniqueName2

    UniqueName2 TS Rookie

    The Intel 82720 was an OEM NEC 7220, which was used in a *lot* of different graphics platforms at the time.

    P.S. - This isn't the "history of the GPU". It's the "history of GPU technology that was adapted to the PC platform". It would have been much more instructive if you'd shown where those technologies actually came from.
