TechSpot History Articles

History of the Personal Computer, Part 5: Computing goes mainstream, mobile, ubiquitous

The new millennium brought a closer relationship between people and computers. Increasingly portable devices became the conduit for humans' basic need to connect, so it's no surprise that computers transitioned from productivity tool to indispensable companion as connectivity proliferated. This is the fifth and final installment in a series exploring the history of the microprocessor and personal computing.


History of the Personal Computer, Part 4: The mighty Wintel empire

Intel's existence traces back to the breakups of Shockley Semiconductor and Fairchild Semiconductor. Determined to avoid the same fate, Intel turned lawsuits into object lessons for employees, a means of protecting its IP, and a method of tying up competitors' financial resources. This is the fourth installment in a five-part series looking at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips.

History of the Personal Computer, Part 3: IBM PC Model 5150 and the attack of the clones

IBM's stature guaranteed that the PC would bring about the level of standardization required for a technology to attain widespread usage. That same stature also ensured competitors would have unfettered access to the technical specifications of the Model 5150. This is the third installment in a five-part series looking at the history of the microprocessor and personal computing, from the invention of the transistor to the modern day chips powering our connected devices.

History of the Personal Computer: Leading up to Intel's 4004, the first commercial microprocessor

The personal computing business as we know it owes its existence to an environment of enthusiasts, entrepreneurs and happenstance. The invention of the microprocessor, DRAM, and EPROM integrated circuits would help bring computing to the mainstream. This is the first in a five-part series exploring the history of the microprocessor and personal computing, from the invention of the transistor to the modern day chips powering our connected devices.

What the creators of Doom and Myst thought of each other's games in 1993

It's been more than 20 years since Doom and Myst changed the landscape of video games forever. One dramatically advanced the language and possibilities of the first-person shooter genre, and the other pulled players into a fictional world like never before. Despite speaking to vastly different audiences, both games were phenomenal hits. Not surprisingly, two of the men instrumental to each game's success initially hated the game the other had produced.

Iconic Hardware: The products that made a dent in the PC industry

What makes a product iconic? Design, functionality, styling, and innovation will get you part of the way there, but the true tests are how these products distinguished themselves from their competitors, how widely those traits were imitated by those competitors, and how history remembers their status. Here are some products that left their mark on the PC industry, whether in the form of full systems, CPUs, graphics cards, motherboards, cases or peripherals.

From 1982's E.T. to Present Day's Watch Dogs: How Much Does It Cost to Make a Video Game?

You pay $60 for many of the new games you play, but how much does a blockbuster game cost to make? Many in the industry don't even know; it's not unusual for a developer working on a big-budget title to have no idea of the game's budget. To answer the question, we've pulled scattered data from public sources in a first attempt to get a comprehensive sense of how much the world's biggest and most expensive games cost.


Looking Back at 2013: The Year's Top Tech Stories

Twenty-thirteen has been an eventful year marked by a changing landscape in the computing world, product refreshes in the mobile scene, a new generation of consoles, the rise of Bitcoin, and the harsh realization that pretty much nothing you do online (and sometimes offline) is private. Our yearly recap of the most relevant tech happenings awaits...


The History of the Modern Graphics Processor, Part 4: The Coming of General Purpose GPUs

With DX10's arrival, vertex and pixel shaders already shared a large degree of common functionality, so moving to a unified shader architecture eliminated a lot of unnecessary duplication of processing blocks. The first GPU to use this architecture was Nvidia's iconic G80.

Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- first as the 8800 GTX flagship and then with cards aimed at several segments. Aided by the new Coverage Sample anti-aliasing (CSAA) algorithm, Nvidia saw its GTX demolish every single competitor in outright performance.

The History of the Modern Graphics Processor, Part 3: The Nvidia vs. ATI era begins

With the turn of the century the graphics industry witnessed further consolidation. Where 3dfx had once been a byword for raw performance, its strength toward the end lay in its full-screen antialiasing image quality. By the time 2001 dawned, the PC graphics market had become a discrete card duopoly (Nvidia and ATI), with the two of them plus Intel supplying the vast majority of integrated graphics chipsets.

Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader).

The History of the Modern Graphics Processor, Part 2: 3Dfx Voodoo, the game-changer

Launched in November 1996, 3Dfx's Voodoo Graphics was a 3D-only card that required a VGA pass-through cable from a separate 2D card to the Voodoo, which then connected to the display. Voodoo Graphics revolutionized personal computer graphics nearly overnight, rendering many competing designs obsolete and consigning a vast swathe of 2D-only graphics producers to irrelevance.

The 3D landscape in 1996 favoured S3, with around 50% of the market, but that was soon to change. At the height of Voodoo's reign, 3Dfx was estimated to account for 80-85% of the 3D accelerator market. Nvidia would later rebound with the RIVA series and eventually land its greatest success with the first GeForce graphics card.

The History of the Modern Graphics Processor, Part 1

The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of 32-bit operating systems and affordable personal computers.

While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU.

Looking Back at 2012: The Year's Top Tech Stories

As the year comes to an end, it's time to look back at the most interesting and relevant tech stories of 2012. Numerous trends consolidated during 2012: Apple's dominance in the sector, mobile growth, fast-paced releases in the smartphone world, and the Windows 8 launch, to name only a few.

This year we have divided the stories into 12 heavily packed categories, with nearly 500 hand-picked headlines in total. Feel free to jump around between your favored topics, but try not to miss the tech culture section, where we revisit some of the most entertaining stories we covered this year. Here's our take on 2012…

Categories: Hardware, Apple, The Web, CPU/Graphics, Google, Tech Culture, Software, Microsoft, Gaming, Mobile, Patent Wars, Security

Looking Back at 2011: The Year's Most Relevant Tech Stories

Twenty-eleven is almost over, and as we conclude our end-of-year articles it's time to look back at some of the most relevant stories of 2011. We'll do a brief recap of tech happenings in six categories: Desktop CPUs and Graphics; Hardware Industry, Devices and Components; Software; Gaming; Mobile Computing; and The Web.

It's been quite an eventful year, that's for sure. We hope you enjoyed our daily dispatch of PC technology news and analysis as much as we've enjoyed bringing it to you. Without further ado, here's our take on 2011.

PARC: How Xerox contributed to the first laser printer, GUI, Ethernet and other mainstream technologies

Launched in 1970, Xerox's PARC has played an instrumental role in the engineering of laser printing and many of the technologies that make up the PC you're reading this on: Ethernet, the mouse, and the graphical user interface, among others.

However, despite its vast industry contributions, the group has been criticized for failing to capitalize on its many innovations. While some of our older readers might be familiar with the prolific Palo Alto Research Center, we think its accomplishments have largely escaped the younger tech crowd. We'd like to take a few minutes to give credit where credit's due.