The evolution of the modern graphics processor began with the introduction of the first 3D add-in cards in 1995, a development that coincided with the widespread adoption of 32-bit operating systems and increasingly affordable personal computers. The graphics industry that existed before the PC was largely dominated by more prosaic 2D architectures, and graphics boards were known for their chips' alphanumeric naming conventions and huge price tags. 3D gaming and virtualization graphics on the PC emerged from a variety of sources, including arcade and console gaming, military applications, robotics, space simulators, and medical imaging.

The early days of 3D consumer graphics were a "Wild West" of competing ideas, ranging from hardware implementations to rendering techniques, application and data interfaces, and persistent naming hyperbole. These early graphics systems featured a fixed-function pipeline (FFP), an architecture that adhered to a very rigid processing path, and there were almost as many graphics APIs as there were 3D chip makers.

Through generations of innovative endeavor, 3D graphics transformed the somewhat dull PC industry into a spectacle of light and magic. This article is the first installment in a TechSpot Special series that extensively explores the history of the GPU. We revisit the early days of 3D consumer graphics, the game-changing impact of 3Dfx, and the industry's consolidation at the turn of the century. The series concludes (for now) with a look at today's general-purpose GPUs, which we have come to love for gaming and which are now transforming the entire industry.

TechSpot's History of the GPU

The modern graphics processor has become one of the largest, most complex, and most expensive components found in almost any computing device. From the early VGA days to the modern GPU, this is the history and evolution of the chip that came to dominate gaming, and later AI and compute.

Part 1: The Early Days of 3D Consumer Graphics
1976 - 1995
Part 2: 3Dfx Voodoo: The Game-changer
1995 - 1999
Part 3: The Nvidia vs. ATI Era Begins
2000 - 2006
Part 4: The Modern GPU: Stream Processing Units
2006 - 2013
Part 5: Pushing GPU Technology Into New Territory
2013 - 2020

1976 - 1995:

The Early Days of 3D Consumer Graphics

The first true 3D graphics originated with early display controllers, known as video shifters and video address generators. These devices acted as a pass-through between the main processor and the display, converting the incoming data stream into a serial bitmapped video output that included luminance, color, and vertical and horizontal composite sync. This synchronization was crucial for keeping each line of pixels aligned as the display was generated, ensuring the orderly progression of successive scan lines, and managing the blanking interval (the time between ending one scan line and starting the next).
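To make that job concrete, here is a minimal sketch in C of what a video shifter amounted to: marching through a bitmapped frame buffer one scan line at a time, shifting pixels out serially, and inserting sync pulses and blanking between lines. All names and timings below are illustrative stand-ins, not the registers or clocks of any particular chip.

```c
#include <stdint.h>

/* Illustrative timings only -- real shifters derived these from a dot clock. */
#define VISIBLE_W  256   /* visible pixels per scan line         */
#define VISIBLE_H  192   /* visible scan lines per frame         */
#define HBLANK_PX   64   /* pixels' worth of horizontal blanking */
#define VBLANK_LN   70   /* idle scan lines of vertical blanking */

/* One frame of 1-bit bitmapped video: eight pixels packed per byte. */
static uint8_t framebuffer[VISIBLE_H][VISIBLE_W / 8];

/* Stand-ins for the output signals a video shifter would drive. */
static void emit_pixel(int luminance) { (void)luminance; }
static void emit_hsync(void)          { /* horizontal sync pulse */ }
static void emit_vsync(void)          { /* vertical sync pulse   */ }

void scan_out_frame(void)
{
    for (int line = 0; line < VISIBLE_H; line++) {
        /* Shift the line's packed bits out serially, MSB first. */
        for (int x = 0; x < VISIBLE_W; x++) {
            uint8_t byte = framebuffer[line][x / 8];
            emit_pixel((byte >> (7 - (x % 8))) & 1);
        }
        emit_hsync();                        /* start the blanking interval */
        for (int x = 0; x < HBLANK_PX; x++)  /* beam retraces; no pixels    */
            emit_pixel(0);
    }
    emit_vsync();                            /* frame done: vertical retrace */
    for (int line = 0; line < VBLANK_LN; line++)
        emit_hsync();                        /* idle lines during vblank */
}
```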

A flurry of designs emerged in the latter half of the 1970s, laying the foundation for 3D graphics as we know them. One notable example was RCA's "Pixie" video chip (CDP1861), introduced in 1976. It was capable of outputting an NTSC-compatible video signal at a resolution of 64x128, or 64x32 for the short-lived RCA Studio II console.

The video chip was soon succeeded by the Television Interface Adapter (TIA) 1A in 1977. This chip, integral to the Atari 2600, was responsible for generating the screen display, creating sound effects, and reading input from controllers. Jay Miner, who later spearheaded the design of the custom chips for the Commodore Amiga computer, led the development of the TIA.

In 1978, Motorola unveiled the MC6845 video address generator. This device laid the foundation for the Monochrome Display Adapter and Color Graphics Adapter (MDA/CGA) cards used in the IBM PC in 1981 and also provided similar functionality for the Apple II. Later that year, Motorola introduced the MC6847 video display generator, which was incorporated into several first-generation personal computers, including the Tandy TRS-80.

A similar solution came from Commodore's MOS Technology subsidiary with the VIC (Video Interface Chip), which delivered graphics output for Commodore home computers produced between 1980 and 1983.

In November of the following year, LSI introduced the ANTIC (Alphanumeric Television Interface Controller) and the CTIA/GTIA co-processor (Color or Graphics Television Interface Adapter) in the Atari 400. The ANTIC processed 2D display instructions using Direct Memory Access (DMA).

Like many video co-processors of the time, it was capable of generating playfield graphics (such as background, title screens, and scoring displays), while the CTIA was responsible for generating colors and movable objects. Yamaha and Texas Instruments provided similar integrated circuits (ICs) to a range of early home computer manufacturers.

The next significant steps in the graphics evolution occurred primarily in professional fields. Intel's 82720 graphics chip served as the foundation for the $1,000 iSBX 275 Video Graphics Controller Multimode Board. This board was capable of displaying eight-color data at a resolution of 256x256 or monochrome at 512x512.

Its 32KB of display memory enabled it to draw lines, arcs, circles, rectangles, and character bitmaps. The chip also featured capabilities for zooming, screen partitioning, and scrolling.
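Drawing commands like these boil down to incremental algorithms that plot one pixel per step using only integer additions and comparisons, exactly the kind of repetitive work worth moving off the CPU. As a representative sketch, here is the classic Bresenham line algorithm; it is not a reconstruction of the 82720's actual logic, and the set_pixel helper is hypothetical:

```c
#include <stdlib.h>

/* Hypothetical helper: writes one pixel into display memory. */
extern void set_pixel(int x, int y);

/* Bresenham's line algorithm: integer-only, one pixel per iteration. */
void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
    int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
    int err = dx + dy;                      /* running error term */

    for (;;) {
        set_pixel(x0, y0);
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
    }
}
```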

Silicon Graphics (SGI) soon followed with its IRIS Graphics for workstations: the GR1.x graphics board, which accepted separate daughter boards for enhanced color options, geometry processing, Z-buffering, and Overlay/Underlay functionality.


Industrial and military 3D virtualization technologies were relatively advanced at this time. Companies like IBM, General Electric, and Martin Marietta (which would later acquire GE's aerospace division in 1992), along with a multitude of military contractors, technology institutes, and NASA, undertook various projects that required this technology for military and space simulations.

The U.S. Navy had developed a flight simulator as early as 1951 using 3D virtualization technology from MIT's Whirlwind computer. In addition to defense contractors, there were companies that bridged the gap between the military and professional graphics markets.

Evans & Sutherland, who would later provide professional graphics card series such as the Freedom and REALimage, also developed the graphics for the CT5 flight simulator, a $20 million package driven by a DEC PDP-11 minicomputer. Ivan Sutherland, the company's co-founder, had created a computer program in 1961 called Sketchpad, which allowed geometric shapes to be drawn and displayed on a CRT in real time using a light pen. This was the progenitor of the modern Graphical User Interface (GUI).

In the less esoteric field of personal computing, Chips and Technologies introduced the 82C43x series of Enhanced Graphics Adapter (EGA) chips. These provided much-needed competition to IBM's offerings and were commonly found in many PC/AT clones around 1985. That year was also noteworthy for the Commodore Amiga, which shipped with the OCS chipset. This chipset consisted of three main component chips – Agnus, Denise, and Paula – allowing some graphics and audio processing to be independent of the CPU.

In August of 1985, three Hong Kong immigrants, Kwok Yuan Ho, Lee Lau, and Benny Lau, established Array Technology Inc in Canada. By the end of the year, the company had been renamed ATI Technologies Inc.

ATI released their first product the following year: the OEM Color Emulation Card. It was designed for outputting monochrome green, amber, or white phosphor text against a black background to a TTL monitor via a 9-pin DE-9 connector. The card was equipped with a minimum of 16KB of memory and contributed significantly to ATI's CAD$10 million in sales during the company's first year, largely thanks to a contract to supply around 7,000 chips per week to Commodore Computers.


The advent of color monitors and the absence of a standard among the various competitors led to the formation of the Video Electronics Standards Association (VESA), of which ATI was a founding member alongside NEC and six other graphics adapter manufacturers.

In 1987, ATI expanded its product line for OEMs with the Graphics Solution Plus series, which used IBM's PC/XT ISA 8-bit bus for Intel 8086/8088-based IBM PCs. The chip supported MDA, CGA, and EGA graphics modes, selectable via DIP switches. It was essentially a clone of the Plantronics Colorplus board, but with capacity for 64KB of memory. Paradise Systems' PEGA1, 1a, and 2a (256KB), released in 1987, were also clones of Plantronics boards.

The EGA Wonder series 1 to 4, launched in March for $399, featured 256KB of DRAM and was compatible with CGA, EGA, and MDA emulation, supporting resolutions up to 640x350 with 16 colors. Extended EGA was available for series 2, 3, and 4.

The high-end segment included the EGA Wonder 800, which offered 16-color VGA emulation and supported a resolution of 800x600, and the VGA Improved Performance (VIP) card. The VIP card was essentially an EGA Wonder with a digital-to-analog converter (DAC) added to provide limited VGA compatibility. The VIP card was priced at $449, with an additional $99 for the Compaq expansion module.

ATI was not alone in capitalizing on the growing consumer interest in personal computing. Numerous companies and products emerged during that period. New entrants included Trident, SiS, Tamerack, Realtek, Oak Technology, LSI's G-2 Inc., Hualon, Cornerstone Imaging, and Winbond – all established between 1986 and 1987. Meanwhile, companies like AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini, and Genoa began producing their first graphics products around this time.

ATI's Wonder series continued to receive significant updates over the following years. In 1988, the Small Wonder Graphics Solution, featuring a game controller port and composite output options (for CGA and MDA emulation), was released, along with the EGA Wonder 480 and 800+ with Extended EGA and 16-bit VGA support. The VGA Wonder and Wonder 16, with added VGA and SVGA support, were also introduced.

A Wonder 16 equipped with 256KB of memory retailed for $499, while a variant with 512KB was priced at $699. In 1989, the VGA Wonder/Wonder 16 series was updated, including the lower-cost VGA Edge 16 (Wonder 1024 series). New features included a bus mouse port and support for the VESA Feature Connector, a gold-fingered connector similar to a shortened data bus slot connector, used to link via a ribbon cable to another video controller to bypass a congested data bus.

The Wonder series continued to move apace in 1991. The Wonder XL card added VESA 32K color compatibility and a Sierra RAMDAC, increasing the maximum display resolution to 640x480 @ 72Hz or 800x600 @ 60Hz. Prices varied from $249 (256KB) to $349 (512KB), with a 1MB RAM option available for $399. A lower-cost version, the VGA Charger, based on the previous year's Basic-16, was also released.

The Mach series launched with the Mach8 in May of that year. Sold either as a chip or a board, it allowed limited 2D drawing operations, such as line drawing, color fills, and bitmap combination (BitBLT), to be offloaded via an application programming interface (API).
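BitBLT (bit block transfer) is at heart a rectangle-to-rectangle copy through display memory, optionally combining source and destination pixels with a raster operation. Here is a minimal sketch of the idea, assuming a simple byte-per-pixel surface rather than the Mach8's actual memory layout:

```c
#include <stddef.h>
#include <stdint.h>

/* Raster operations: how source and destination pixels combine. */
typedef enum { ROP_COPY, ROP_AND, ROP_OR, ROP_XOR } rop_t;

/* Copy a w-by-h pixel block between two byte-per-pixel surfaces.
   'pitch' is the width of a surface in bytes. This illustrates the
   operation the Mach8 offloaded; the real chip supported more modes
   and raster operations than shown here. */
void bitblt(uint8_t *dst, size_t dst_pitch, int dx, int dy,
            const uint8_t *src, size_t src_pitch, int sx, int sy,
            int w, int h, rop_t rop)
{
    for (int y = 0; y < h; y++) {
        const uint8_t *s = src + (size_t)(sy + y) * src_pitch + sx;
        uint8_t *d = dst + (size_t)(dy + y) * dst_pitch + dx;
        for (int x = 0; x < w; x++) {
            switch (rop) {
            case ROP_COPY: d[x]  = s[x]; break;
            case ROP_AND:  d[x] &= s[x]; break;
            case ROP_OR:   d[x] |= s[x]; break;
            case ROP_XOR:  d[x] ^= s[x]; break;
            }
        }
    }
}
```

Offloading loops like this mattered because GUI environments such as Windows 3.0 spent much of their time moving rectangles of pixels around.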

ATI also introduced a variation of the Wonder XL that incorporated a Creative Sound Blaster 1.5 chip on an extended PCB. Known as the VGA Stereo-F/X, it was capable of simulating stereo sound from Sound Blaster mono files at a quality comparable to FM radio.

Graphics boards like the ATI VGAWonder GT offered a 2D + 3D option, combining the Mach8 with the graphics core (28800-2) of the VGA Wonder+ for 3D duties. The success of the Wonder and Mach8 series helped ATI surpass the CAD$100 million sales milestone for the year, largely driven by the adoption of Windows 3.0 and the increased 2D workloads it facilitated.

S3 Graphics was formed in early 1989 and produced its first 2D accelerator chip and a graphics card 18 months later, the S3 911 (or 86C911). Key specs for the latter included 1MB of VRAM and 16-bit color support.

The S3 911 was quickly succeeded by the 924 later that year – essentially a revised 911 with 24-bit color support – and was further updated the following year with the 928, which added 32-bit color support, and the 801 and 805 accelerators. The 801 utilized an ISA interface, while the 805 used VLB (VESA Local Bus). From the introduction of the 911 to the emergence of the 3D accelerator, the market was inundated with 2D GUI designs based on S3's original architecture, notably from Tseng Labs, Cirrus Logic, Trident, IIT, ATI's Mach32, and Matrox's MAGIC RGB.

In January 1992, Silicon Graphics (SGI) released OpenGL 1.0, a multi-platform, vendor-agnostic application programming interface (API) for both 2D and 3D graphics.


OpenGL evolved from SGI's proprietary API, IRIS GL (Integrated Raster Imaging System Graphics Library). The new API was an initiative to separate non-graphical functionality from IRIS GL and enable it to run on non-SGI systems, especially as rival vendors began to emerge with proprietary APIs of their own. Initially, OpenGL targeted the professional UNIX-based markets, but thanks to its developer-friendly support for extension implementation, it was quickly adopted for 3D gaming.
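Part of that developer appeal was how little code the fixed-function, immediate-mode style demanded: the application streamed vertices and attributes, and the pipeline handled transform, clipping, and rasterization. A minimal sketch of the era's idiom, using the GLUT toolkit for windowing (a common convenience of the time, assumed here for brevity):

```c
#include <GL/glut.h>

/* Draw one shaded triangle with OpenGL 1.x immediate-mode calls.
   The fixed-function pipeline does transform, clip, and raster. */
static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.8f, -0.8f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.8f, -0.8f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.8f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("OpenGL 1.x immediate mode");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```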

Meanwhile, Microsoft was developing its own rival API, Direct3D, and didn't exactly break a sweat making sure OpenGL ran as well as it could under the new Windows operating systems. Tensions peaked a few years later when John Carmack of id Software, whose release of Doom had revolutionized PC gaming, ported Quake to use OpenGL on Windows. He openly criticized Direct3D, highlighting the growing competition between these two graphics APIs.

Microsoft's intransigence intensified when they refused to license the Mini-Client Driver (MCD) for OpenGL on Windows 95. The MCD would have allowed vendors to select specific features for hardware acceleration. In response, SGI developed the Installable Client Driver (ICD), which not only provided the same capabilities but also improved upon them. Unlike the MCD, which only covered rasterization, the ICD added lighting and transform functionality (T&L).

During the rise of OpenGL, which first gained prominence in the workstation market, Microsoft was focusing on the burgeoning gaming market with plans for its proprietary API. In February 1995, they acquired RenderMorphics, whose Reality Lab API was gaining popularity among developers, and this became the foundation for Direct3D.

Around the same time, 3Dfx's Brian Hook was developing the Glide API, which would become the dominant gaming API. This was partly due to Microsoft's involvement in the Talisman project (a tile-based rendering ecosystem), which diverted resources from DirectX.

As Direct3D became widely available, bolstered by the widespread adoption of Windows, proprietary APIs like S3d (S3), Matrox Simple Interface, Creative Graphics Library, C Interface (ATI), SGL (PowerVR), NVLIB (Nvidia), RRedline (Rendition), and Glide started losing favor with developers.

This decline was exacerbated by the fact that some of these proprietary APIs were tied to board manufacturers who were under increasing pressure to enhance their rapidly expanding feature lists. These enhancements included higher screen resolutions, greater color depth (progressing from 16-bit to 24 and then 32 bits), and image quality improvements such as anti-aliasing. All these features demanded increased bandwidth, greater graphics efficiency, and faster product development cycles.
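The bandwidth pressure is easy to put numbers on: merely scanning the frame buffer out to the monitor costs width x height x bytes-per-pixel x refresh rate, before a single new pixel is drawn. A back-of-the-envelope sketch (the figures are illustrative arithmetic, not measurements of any particular card):

```c
#include <stdio.h>

/* Bytes per second needed just to refresh the display, ignoring
   rendering traffic entirely (illustrative arithmetic only). */
static double scanout_mb_per_s(int w, int h, int bits, int hz)
{
    return (double)w * h * (bits / 8.0) * hz / (1024.0 * 1024.0);
}

int main(void)
{
    /* 640x480, 16-bit, 60 Hz:  ~35 MB/s  */
    printf("%.0f MB/s\n", scanout_mb_per_s(640, 480, 16, 60));
    /* 1024x768, 32-bit, 75 Hz: ~225 MB/s */
    printf("%.0f MB/s\n", scanout_mb_per_s(1024, 768, 32, 75));
    return 0;
}
```

Doubling the color depth or stepping up the resolution multiplies that figure accordingly, before any rendering traffic is even counted.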


The year 1993 ushered in a flurry of new graphics competitors, most notably Nvidia, founded in January of that year by Jen-Hsun Huang, Curtis Priem, and Chris Malachowsky. Huang was previously the Director of Coreware at LSI, while Priem and Malachowsky both hailed from Sun Microsystems, where they had developed the SunSPARC-based GX graphics architecture.

Soon joining the fray were fellow newcomers Dynamic Pictures, ARK Logic, and Rendition. The market's volatility had already led to the exit of several graphics companies or their absorption by competitors. Among these were Tamerack, Gemini Technology, Genoa Systems, Hualon, Headland Technology (acquired by SPEA), Acer, Motorola, and Acumos (purchased by Cirrus Logic).

However, one company consistently gaining strength was ATI. As a forerunner to the All-In-Wonder series, ATI announced the 68890 PC TV decoder chip in late November, which debuted in the Video-It! card. The chip was capable of capturing video at 320x240 and 15 fps, or 160x120 and 30 fps. It also offered real-time compression/decompression thanks to the onboard Intel i750PD VCP (Video Compression Processor) and could communicate with the graphics board via the data bus, eliminating the need for dongles, ports, and ribbon cables. The Video-It! was priced at $399, while a less feature-rich model named Video-Basic rounded out the product line.

Five months later, in March, ATI belatedly introduced a 64-bit accelerator: the Mach64. The financial year proved challenging for ATI, with a CAD$2.7 million loss as the company struggled amid fierce competition. Rival boards included the S3 Vision968, which was picked up by many board vendors, and the Trio64, which secured OEM contracts with Dell (Dimension XPS), Compaq (Presario 7170/7180), AT&T (Globalyst), HP (Vectra VE 4), and DEC (Venturis/Celebris).

Released in 1995, the Mach64 achieved several notable firsts. It was the first graphics adapter available for both PC and Mac computers, as demonstrated by the Xclaim ($450 and $650, depending on onboard memory). Along with S3's Trio, it offered full-motion video playback acceleration.

The Mach64 also ushered in ATI's entry into professional graphics cards with the 3D Pro Turbo and 3D Pro Turbo+PC2TV, priced at $599 for the 2MB option and $899 for the 4MB variant.

The following month witnessed the emergence of a technology startup, 3DLabs, formed when DuPont's Pixel graphics division was bought out from its parent company. The new company brought to market the GLINT 300SX, a processor capable of OpenGL rendering, fragment processing, and rasterization.

Initial boards targeted the professional market thanks to their high prices: the Fujitsu Sapphire2SX 4MB retailed for $1,600-$2,000, while the 8MB ELSA GLoria 8 was priced between $2,600 and $2,850. A version of the 300SX intended for the gaming market followed.


The 1995 Gaming GLINT 300SX, featuring a reduced 2MB of memory (1MB for textures and Z-buffer, 1MB for the frame buffer), offered an optional VRAM upgrade for Direct3D compatibility at an additional $50 over the $349 base price. The card struggled in an already saturated market, but 3DLabs was already developing its successor, the Permedia series.

S3 seemed to be everywhere at that time. The high-end OEM market was dominated by the company's Trio64 chipsets, which integrated a DAC, a graphics controller, and a clock synthesizer into a single chip. They also utilized a unified frame buffer and supported hardware video overlay (a dedicated portion of graphics memory for rendering video as the application requires). The Trio64 and its 32-bit memory bus sibling, the Trio32, were available as OEM units and standalone cards from vendors such as Diamond, ELSA, Sparkle, STB, Orchid, Hercules, and Number Nine. Diamond Multimedia's prices ranged from $169 for a ViRGE-based card to $569 for a Trio64+ based Diamond Stealth64 Video with 4MB of VRAM.

The mainstream market also saw offerings from Trident, a long-time OEM supplier of basic 2D graphics adapters, who had recently added the 9680 chip to their lineup. This chip shared many features with the Trio64, with boards typically priced around $170-200. They offered acceptable 3D performance in that price range, with good video playback capability.

Other newcomers in the mainstream market included Weitek's Power Player 9130 and Alliance Semiconductor's ProMotion 6410 (often branded as the Alaris Matinee or FIS's OptiViewPro). Both showed excellent scaling with CPU speed, with the latter combining a strong scaling engine with anti-blocking circuitry for smoother video playback, outperforming previous chips like the ATI Mach64, Matrox MGA 2064W, and S3 Vision968.

Nvidia launched their first graphics chip, the NV1, in May, and it became the first commercial graphics processor capable of 3D rendering, video acceleration, and integrated GUI acceleration.

Nvidia partnered with SGS-Thomson (later STMicroelectronics) to produce the chip on its 500nm process, and SGS also promoted its own version of the chip, the STG2000. Although it wasn't a huge success, the NV1 did represent Nvidia's first financial return. Unfortunately for Nvidia, just as the first vendor boards began shipping (notably the Diamond Edge 3D) in September, Microsoft finalized and released DirectX 1.0.

The D3D graphics API was based on rendering triangular polygons, whereas the NV1 used quadratic texture mapping, rendering curved quadratic surfaces rather than triangles. Limited D3D compatibility was added via drivers that wrapped triangles as quadratic surfaces, but the lack of games tailored for the NV1 doomed the card as a jack of all trades, master of none.
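To see why the mismatch was awkward: a triangle is flat and fully defined by three vertices, while a quadratic surface is a curved patch defined by a grid of control points. A biquadratic Bezier patch is one common formulation of such surfaces, used here purely for illustration and not necessarily the NV1's exact scheme:

```c
/* Evaluate one point on a biquadratic Bezier patch defined by a
   3x3 grid of control points. Quadratic Bernstein basis:
   B0(t)=(1-t)^2, B1(t)=2t(1-t), B2(t)=t^2. */
typedef struct { float x, y, z; } vec3;

vec3 patch_point(const vec3 ctrl[3][3], float u, float v)
{
    float bu[3] = { (1-u)*(1-u), 2*u*(1-u), u*u };
    float bv[3] = { (1-v)*(1-v), 2*v*(1-v), v*v };
    vec3 p = { 0, 0, 0 };

    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            float w = bu[i] * bv[j];   /* weight of control point (i,j) */
            p.x += w * ctrl[i][j].x;
            p.y += w * ctrl[i][j].y;
            p.z += w * ctrl[i][j].z;
        }
    return p;
}
```

Faking a flat triangle in this representation means synthesizing a full nine-point control grid just to reproduce three vertices, which is workable in a driver but wasteful, and of no help to games authored for triangles.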

Most of the games for the NV1 were ported from the Sega Saturn. A 4MB NV1 with integrated Saturn ports (two per expansion bracket connected to the card via a ribbon cable) retailed for around $450 in September 1995.

Microsoft's late changes to the DirectX SDK around its launch left board manufacturers unable to directly access the hardware for digital video playback, resulting in functionality issues for virtually all discrete graphics cards under Windows 95. By contrast, drivers under Windows 3.1 from the various companies were generally faultless.

ATI announced their first 3D accelerator chip, the 3D Rage (also known as the Mach 64 GT), in November 1995. The first public demonstration of it was at the E3 video game conference held in Los Angeles in May the following year, and the card became available a month later. The 3D Rage combined the 2D core of the Mach64 with 3D capabilities.

Late revisions to the DirectX specification meant that the 3D Rage had compatibility problems with many games using the API, mainly due to its lack of depth buffering. With an onboard 2MB EDO RAM frame buffer, 3D output was limited to 640x480 at 16-bit color or 400x300 at 32-bit. Attempting 32-bit color at 640x480 generally resulted in on-screen color corruption, while 2D resolution peaked at 1280x1024. Gaming performance was mediocre, but the chip's full-screen MPEG playback ability at least somewhat balanced the feature set.
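Those mode limits fall straight out of the 2MB frame buffer. Assuming a double-buffered display plus a 16-bit Z-buffer (a typical configuration of the era, taken here as an assumption rather than ATI's documented layout), the arithmetic works out as follows:

```c
#include <stdio.h>

/* KB of video memory for double-buffered color plus a 16-bit Z-buffer.
   The buffer layout is an assumption for illustration, not ATI's spec. */
static int mode_kb(int w, int h, int color_bits)
{
    int color = 2 * w * h * (color_bits / 8);  /* front + back buffer */
    int zbuf  = w * h * 2;                     /* 16-bit depth        */
    return (color + zbuf) / 1024;
}

int main(void)
{
    printf("640x480x16: %d KB\n", mode_kb(640, 480, 16)); /* 1800 KB: fits in 2MB */
    printf("640x480x32: %d KB\n", mode_kb(640, 480, 32)); /* 3000 KB: does not    */
    printf("400x300x32: %d KB\n", mode_kb(400, 300, 32)); /* 1171 KB: fits        */
    return 0;
}
```

At 16-bit color, 640x480 squeezes in with room to spare; at 32-bit it overshoots 2MB by roughly half, hence the retreat to 400x300.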


ATI reworked the chip, and in September the Rage II launched. It rectified the Direct3D compatibility issues of the first chip and added MPEG2 playback support. Initial cards still shipped with 2MB of memory, however, which hampered performance and caused problems with perspective correction and geometry transforms. As the series expanded to include the Rage II+DVD and 3D Xpression+, memory capacity options grew to 8MB.

While ATI was first to market with a 3D graphics solution, it wasn't long before other competitors with different approaches to 3D implementation arrived on the scene, namely 3Dfx, Rendition, and VideoLogic.

In the race to get new products to market, 3Dfx beat both Rendition and VideoLogic. The performance race was over before it had started, with the 3Dfx Voodoo Graphics effectively annihilating all competition.

This is the first article in our History of the GPU series. If you enjoyed it, keep reading as we take a stroll down memory lane to the heyday of 3Dfx, Rendition, Matrox, and a young company called Nvidia.
