The History of the Modern Graphics Processor

Not bad; it jumped around quite a bit but missed some of the important milestones. The mainstream standards followed the IBM PC: CGA (4 colors), EGA (16 colors), and then VGA, which gave us (for the first time) 'photo-realistic' 320x200 in 256 colors. While IBM never actually released anything better than VGA, just about everybody attempted to improve on it.

IBM actually did go beyond VGA: they had the 8514/A adapter and XGA. It's just that neither achieved the standard status of MDA/CGA/EGA/VGA, mainly because third-party SVGA solutions were much cheaper (and not compatible with 8514/A or XGA), so that's what went into most clones.

Speaking of CGA, there seems to be an error in the article:
"This became the basis for the IBM PC?s Monochrome and Color Display Adapter (MDA/CDA) cards of 1981"
It's Color Graphics Adapter: CGA. Not CDA.

However, DX was its own 'train wreck' until DX9c.

Not at all. Not sure why people see it that way. I think that perception is severely skewed by the fact that early GeForce/Radeon cards benefited from their T&L only in OpenGL, because D3D was lower-level and T&L acceleration could not be integrated without an API overhaul.
Microsoft already fixed that in DX7, though. And DX8/9 were mostly evolutionary updates from DX7 (DX8 added programmable shaders and made windowed rendering easier to do; DX9 mainly updated shaders to SM2.0 and later SM3.0).

Stupidly, even today's games will attempt to install DX on Vista, W7, and W8 systems that come with DX already installed.

That is not stupid at all. What they update is the DirectX runtime. Microsoft updates these runtimes from time to time to fix some bugs, improve the compiler, things like that.
Games have to make sure that the runtime on the system is at least as new as the one that the game is compiled against. The easiest way to do that is to run Microsoft's DirectX runtime installer, which will automatically update any components if necessary:
http://www.microsoft.com/en-us/download/details.aspx?id=8109
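
For what it's worth, here's a minimal sketch (C, Windows) of what a game installer typically does with that package: launch DXSETUP.exe unattended and let it update whatever components are out of date. The redist\DirectX path is only an assumption about where a game might bundle the redistributable; /silent is the package's unattended-install switch.

```c
/* Minimal sketch: run the DirectX redistributable's setup unattended.
   "redist\DirectX" is an assumed location where the game ships the package;
   /silent performs the install without any UI. */
#include <windows.h>

int main(void)
{
    STARTUPINFOA si = { 0 };
    PROCESS_INFORMATION pi = { 0 };
    char cmd[] = "redist\\DirectX\\DXSETUP.exe /silent";  /* must be writable */

    si.cb = sizeof(si);
    if (!CreateProcessA(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
        return 1;                                /* couldn't start setup */

    WaitForSingleObject(pi.hProcess, INFINITE);  /* block until setup finishes */
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```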

I call it a 'train wreck' because it kept changing. Users would have to match card + driver + DX version on every new game release. Oddly, we are still stuck with games attempting to install DX.
 
That doesn't make sense.
Firstly, you did NOT have to match card, driver and DX version. Up to and including DX7, all lower driver versions were supported (and obviously also video cards which didn't support all the latest features). After that, a minimum driver version was imposed, but aside from DX10, this version was always lower than the latest API version (e.g. a DDI6 driver for DX8, a DDI7 driver for DX9, and a DDI9 driver for DX11).

Secondly, change is not necessarily a bad thing. Video cards evolved at an alarming rate. Microsoft kept updating their API to incorporate the latest features. OpenGL also received tons of API updates, and things spun out of control with vendor-specific extension hell in an attempt to keep up. Which is why OpenGL was abandoned in favour of D3D. If anything, OpenGL was the trainwreck.

And I already explained why games install the DX runtime. Next time, bother to read my WHOLE post. I even linked to the DX runtime so you can read in Microsoft's own words why it should be installed:
"The DirectX redist installation includes all the latest and previous released DirectX runtime. This includes D3DX, XInput, and Managed DirectX components."

Note: latest AND previous versions. For certain parts of the runtime, several versions are installed side-by-side. Not having the latest version installed will mean the game cannot start because it cannot resolve a dependency on certain DLLs. So, they HAVE to install it.
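
To make the dependency point concrete, here's a rough sketch of the kind of start-up check a launcher could do. The DLL name d3dx9_43.dll is just an example of one of the versioned D3DX DLLs a game might have been built against:

```c
/* Rough sketch: check whether the versioned D3DX DLL this hypothetical game
   links against is present. Each SDK release ships its own DLL
   (d3dx9_24.dll ... d3dx9_43.dll), and the redist installs them side by side. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HMODULE d3dx = LoadLibraryA("d3dx9_43.dll");  /* example version only */
    if (d3dx == NULL) {
        printf("d3dx9_43.dll missing - run the DirectX runtime installer.\n");
        return 1;
    }
    printf("Required D3DX runtime is present.\n");
    FreeLibrary(d3dx);
    return 0;
}
```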
 
I'm sure you're going to mention the N64 at some point, because 3dfx wasn't the game-changer. The N64 came out in Japan in June/July of 1996, a full six months before 3dfx came around. It was the first mainstream product that the average Joe could afford, so I believe it deserves more credit than 3dfx.
 
ATI's Mach series was awesome. I had a Mach32 VLB card in my 486, along with a VLB I/O board too.

I still have a Mach64 PCI card. It's still in active use too. Only as a graphics card for an ESXi VMware box that rarely sees a monitor, but for access to the text console it's a great card. It hardly uses any power, only takes up a PCI slot, leaving the PCIe slots free for other cards, etc.

It's dated 1995/96. Not bad for a graphics card to be in active service 17/18 years later.
 
OMFG. I really feel old now. I've seen and had pretty much all of the cards in this article in the PCs I've owned growing up. Thanks for the stroll down memory lane!
 
Yep. One of the better remembered Motorola MC6845 derivatives.
[Photo: Motorola MC6845P, bottom view]


Most of the others probably aren't even known as graphics manufacturers by many.
Hitachi's HD6845
UMC's UM6845/R
 
The Intel 82720 was an OEM NEC 7220, which was used in a *lot* of different graphics platforms at the time.

P.S. - This isn't the "history of the GPU". It's the "history of GPU technology that was adapted to the PC platform". It would have been much more instructive if you'd shown where those technologies actually came from.
 
People take current graphics in games for granted. Back then, the advent of 3D acceleration in games was a revelation and a revolution, a feeling that remains unsurpassed even today. Even the advent of ray tracing can't dampen the excitement of first seeing 3D acceleration on your own PC, in your own bedroom, back in those days.
 
A PDP-11 is a minicomputer, not a mainframe, even if there was quite a range in size for PDP-11 systems.
 
I'd suggest that you maybe read the relevant article before bagging it.

Mystique... pretty good 2D performance and middling 3D performance. The articles concern the rise of 3D graphics in general, where the Mystique found a better home as the 2D card companion for a 3D-only card like the Voodoo Graphics.

FWIW, the original Millennium probably deserves more words devoted to it than the Mystique, whose performance dated quite quickly (lack of bilinear filtering, near non-existent OpenGL support, stipple patterning instead of alpha blending, and a standard 2 MB framebuffer which basically killed any kind of texture support). Anyhow, here's a blast from the past for you:

[Chart: average FPS comparison]

I had a Mystique. It never exceeded expectations but it allowed basic 3D acceleration at a time when many machines otherwise had no such capability.

Fond memories.
 
I think the author is mixing 2D graphics cards with 3D. And even within 2D, not all cards were created equal: some of them just displayed a framebuffer, while others accelerated 2D graphics (which was needed to draw windows, lines, circles, ellipses and text).
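
As a rough illustration (not from the article), these are the kinds of Windows GDI primitives a 2D accelerator of that era would offload, whereas a plain framebuffer card left the CPU to push every pixel itself:

```c
/* Illustration only: typical GDI primitives that 2D accelerators sped up.
   Drawn into an off-screen bitmap so the example is self-contained. */
#include <windows.h>

int main(void)
{
    HDC screen = GetDC(NULL);                       /* screen-compatible DC   */
    HDC dc = CreateCompatibleDC(screen);
    HBITMAP bmp = CreateCompatibleBitmap(screen, 640, 480);
    HGDIOBJ old = SelectObject(dc, bmp);

    Rectangle(dc, 10, 10, 200, 120);                /* filled rectangle       */
    MoveToEx(dc, 0, 0, NULL);
    LineTo(dc, 639, 479);                           /* line draw              */
    Ellipse(dc, 300, 50, 500, 250);                 /* ellipse                */
    TextOutA(dc, 20, 300, "Hello, 2D accel", 15);   /* text output            */
    BitBlt(dc, 0, 0, 100, 100, dc, 200, 200, SRCCOPY); /* bit block transfer  */

    SelectObject(dc, old);
    DeleteObject(bmp);
    DeleteDC(dc);
    ReleaseDC(NULL, screen);
    return 0;
}
```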

Most S3 graphics cards were actually 2D graphics cards (until their last models, which haven't yet been mentioned in the article). S3 cards were sold as 2D graphics accelerators, but they were actually slower at drawing 2D primitives than a good CPU of the time (such as a Pentium).

For that reason S3 graphics cards (which were incredibly popular) were often called "graphics decelerators".
 
I miss DBZ. Very informative. I still have some of those old cards around: a 1 GHz Pentium III IBM Aptiva with a 64 MB All-In-Wonder and the Remote Wonder. Home video editing, anyone?
 
The article does a poor job of highlighting that the CT5's 3D capabilities were absolutely revolutionary.
I had no idea that true, complex 3D rendering and shading existed as early as 1981!
 