IBM did go better than VGA actually: they had the 8514/A adapter and XGA. It's just that neither achieved the standard status of MDA/CGA/EGA/VGA, mainly because third-party SVGA solutions were much cheaper (and not compatible with 8514/A or XGA), so that's what went into most clones.

Not bad, but it jumped around quite a bit and missed some of the important milestones. The 'mainstream' standards followed the IBM PC: CGA (4 colors), EGA (16 colors), and then VGA, which gave us (for the first time) 'photo-realistic' 320x200x256 colors. While IBM never actually released anything better than VGA, just about everybody attempted to improve on it.
Speaking of CGA, there seems to be an error in the article:
"[SIZE=14px]This became the basis for the IBM PC’s Monochrome and Color Display Adapter (MDA/CDA) cards of 1981"[/SIZE]
[SIZE=14px]It's Color Graphics Adapter: CGA. Not CDA.[/SIZE]
Not at all. Not sure why people see it that way. I think it is severely skewed by the fact that early GeForce/Radeon cards benefited from their T&L only in OpenGL, because D3D was lower-level and T&L acceleration could not be integrated without an API overhaul.

However, DX was its own 'train wreck' until DX9c.
Microsoft already fixed that in DX7 though. And DX8/9 were mostly evolutionary updates from DX7 (DX8 adding programmable shaders and making windowed rendering easier to do, DX9 mainly updating shaders to SM2.0 and later 3.0).
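To make that concrete, here is a minimal Direct3D 9 sketch of how hardware vertex processing (the way D3D exposes the GPU's T&L unit from DX7 onward) is requested at device creation. The bare-bones window setup and the lack of error handling are simplifications for illustration, not how a real game would do it.

[code]
// Minimal D3D9 sketch: request hardware T&L via D3DCREATE_HARDWARE_VERTEXPROCESSING.
// Link against d3d9.lib. Window setup is the bare minimum CreateDevice needs.
#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    WNDCLASSA wc = { 0, DefWindowProcA, 0, 0, hInst, 0, 0, 0, 0, "d3d9tnl" };
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("d3d9tnl", "T&L test", WS_OVERLAPPEDWINDOW,
                              0, 0, 640, 480, 0, 0, hInst, 0);

    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed   = TRUE;                     // windowed rendering, straightforward since DX8/9
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    // Ask the HAL device to do transform & lighting on the GPU; a title that
    // must also run on pre-T&L hardware would fall back to
    // D3DCREATE_SOFTWARE_VERTEXPROCESSING here.
    IDirect3DDevice9* dev = 0;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                   D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);

    if (dev) dev->Release();
    d3d->Release();
    return SUCCEEDED(hr) ? 0 : 1;
}
[/code]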
That is not stupid at all. What they update is the DirectX runtime. Microsoft updates these runtimes from time to time to fix some bugs, improve the compiler, things like that.

Stupidly, even today's games will attempt to install DX on Vista, W7, and W8 systems that come with DX already installed.
Games have to make sure that the runtime on the system is at least as new as the one that the game is compiled against. The easiest way to do that is to run Microsoft's DirectX runtime installer, which will automatically update any components if necessary.
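As a rough illustration of what a game's installer or launcher does at that point, here is a Win32/C++ sketch under a couple of stated assumptions: it checks for the specific D3DX DLL the build links against (d3dx9_43.dll is assumed here; the actual name depends on which SDK the game was compiled with) and, if it is missing, runs the bundled redistributable's DXSETUP.exe with its /silent switch from an assumed "redist" folder.

[code]
// Sketch: ensure the D3DX runtime the game was built against is present,
// otherwise run the DirectX redistributable installer silently.
// Assumptions: d3dx9_43.dll is the version this build needs, and the
// installer ships the redist files in a "redist" subfolder.
#include <windows.h>

bool EnsureD3DXRuntime()
{
    // The D3DX DLL name encodes the SDK version the game was compiled against.
    HMODULE d3dx = LoadLibraryW(L"d3dx9_43.dll");
    if (d3dx) {
        FreeLibrary(d3dx);
        return true;                                  // runtime already present
    }

    // Not found: launch the redist's DXSETUP.exe silently and wait for it.
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    wchar_t cmd[] = L"redist\\DXSETUP.exe /silent";   // assumed redist location
    if (!CreateProcessW(0, cmd, 0, 0, FALSE, 0, 0, 0, &si, &pi))
        return false;

    WaitForSingleObject(pi.hProcess, INFINITE);
    DWORD exitCode = 1;
    GetExitCodeProcess(pi.hProcess, &exitCode);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return exitCode == 0;
}
[/code]

DXSETUP itself compares what is already on the system against what is in the redist and only copies components that are missing or older, which is why re-running it on a Vista/W7/W8 install that already has everything is redundant but harmless.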