The very first transistor -- the foundational building block upon which almost all of modern civilization was built -- was created at AT&T’s Bell Labs on December 23, 1947. As you can see above, this first transistor was huge and looked nothing like the millions...
You pay $60 for many of the new games you play, but how much does a blockbuster game cost to make? Many in the industry don't know: it is not unusual for a developer working on a big-budget title to have no idea of the game's budget. To answer the question, we've pulled together scattered data from public sources as a first attempt to get a comprehensive sense of how much the world's biggest and most expensive games cost.
Twenty-thirteen has been an eventful year marked by a changing landscape in the computing world, product refreshes in the mobile scene, a new generation of consoles, the rise of Bitcoin and the harsh realization that pretty much nothing you do online (and sometimes offline) is private. Our yearly recount of the most relevant tech happenings awaits...
With DX10's arrival, vertex and pixel shaders shared a large degree of common functionality, so moving to a unified shader architecture eliminated a lot of unnecessary duplication of processing blocks. The first GPU to utilize this architecture was Nvidia's iconic G80.
Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- first as the 8800 GTX flagship and then with cards aimed at several segments. Aided by the new Coverage Sampling Anti-Aliasing (CSAA) algorithm, Nvidia saw its GTX demolish every single competitor in outright performance.
With the turn of the century, the graphics industry bore witness to further consolidation. Where 3dfx was once a byword for raw performance, its strength before its demise lay in its full-screen anti-aliasing image quality. By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly (Nvidia and ATI), with both of them, in addition to Intel, supplying the vast majority of integrated graphics chipsets.
Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader).
Launched in November 1996, 3Dfx's Voodoo Graphics was a 3D-only card that required a VGA pass-through cable from a separate 2D card, with the Voodoo then connecting to the display. Voodoo Graphics revolutionized personal computer graphics nearly overnight and rendered many other designs obsolete, including those of a vast swathe of 2D-only graphics vendors.
The 3D landscape in 1996 favored S3, with around 50% of the market. That was soon to change, however: 3Dfx was estimated to account for 80-85% of the 3D accelerator market during the heyday of Voodoo’s reign. Nvidia would later rebound with the RIVA series and eventually land its greatest success with the first GeForce graphics card.
The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of 32-bit operating systems and the affordable personal computer.
While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavor. Over the next few weeks we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, through the industry's consolidation at the turn of the century, and on to today's modern GPGPU.
As the year comes to an end it's time to look back at the most interesting and relevant tech stories of 2012. Numerous trends consolidated during 2012: Apple’s dominance in the sector, mobile growth, fast-paced releases in the smartphone world, and the Windows 8 launch, to name only a few.
This year we have divided stories into 12 heavily packed categories, with nearly 500 hand-picked headlines in total. Feel free to jump around between your favored topics, but try not to miss the tech culture section where we revisit some of the most entertaining stories we covered this year. Here’s our take on 2012…
Twenty-eleven is almost over and as we conclude our end-of-year articles it's time to look back at some of the most relevant stories of 2011. We'll do a brief recount of tech happenings in seven categories: Desktop CPUs and Graphics, Hardware Industry, Devices and Components, Software, Gaming, Mobile Computing and The Web.
It's been quite an eventful year, that's for sure. We hope you enjoyed our daily dispatch of PC technology news and analysis as much as we’ve enjoyed bringing it to you. Without further ado, here’s our take on 2011.
PARC: How Xerox contributed to the first laser printer, GUI, Ethernet and other mainstream technologies
Founded in 1970, Xerox's PARC has played an instrumental role in the engineering of laser printing and many of the technologies that compose the PC you're reading this on: Ethernet, the mouse, and the graphical user interface, among others.
However, despite its vast industry contributions, the group has been criticized for failing to capitalize on its many innovations. While some of our older readers might be familiar with the prolific Palo Alto Research Center, we think its accomplishments have largely escaped the younger tech crowd. We'd like to take a few minutes to give credit where credit's due.