There have been many popular and important games included with operating systems over the years. But only one game can lay claim to having once been the most-used Windows application in the world, as Microsoft’s Chris Sells described Solitaire back in 2004. This is the story of Solitaire, which has been included with every copy of Windows since version 3.0.
#ThrowbackThursday Founded in 1970, Xerox's PARC has played an instrumental role in the engineering of laser printing and many of the technologies that compose the PC you're reading this on: the graphical user interface, Ethernet, and the mouse, among others. We'd like to take a few minutes and give credit where credit's due.
50 Years of Moore's Law: Fun facts, a timeline infographic and Gordon's own thoughts 5 decades later
Enthusiasts have been pushing the limits of silicon for as long as microprocessors have existed. Early overclocking endeavors involved soldering and replacing crystal clock oscillators, but evolving standards brought options for changing system bus speeds, while some of the most daring would gain boosts through hard modding. These are but a few of the landmark processors revered for their overclocking prowess.
The new millennium brought a closer relationship between people and computers. More portable devices became the conduit that enabled humans' basic need to connect. It's no surprise that computers transitioned from productivity tool to indispensable companion as connectivity proliferated. This is the fifth and final installment in a series exploring the history of the microprocessor and personal computing.
Intel's existence traces back to the breakup of Shockley Semiconductor and Fairchild Semiconductor. Determined to avoid the same fate, the company turned lawsuits into object lessons for employees, a means of protecting its IP, and a method of tying up competitors' financial resources. This is the fourth installment in a five-part series, where we look at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips.
IBM's stature guaranteed that the PC would initiate the level of standardization required for a technology to attain widespread usage. That same stature also ensured that competitors would have unfettered access to the technical specifications of the Model 5150. This is the third installment in a five-part series, where we look at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips powering our connected devices.
The personal computing business as we know it owes its existence to an environment of enthusiasts, entrepreneurs and happenstance. The invention of the microprocessor, DRAM, and EPROM integrated circuits would help bring computing to the mainstream. This is the first in a five-part series exploring the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips powering our connected devices.
It's been more than 20 years since Doom and Myst changed the landscape of video games forever. One dramatically advanced the language and possibilities of the first-person shooter genre, while the other pulled players into a fictional world like never before. Though they spoke to vastly different audiences, both games were huge, phenomenal hits. Not surprisingly, two of the men instrumental to each game's success initially hated the game the other produced.
What makes a product iconic? Design, functionality, styling, and innovation will get you part of the way there, but the true tests are how these products distinguished themselves from their competitors, how widely those traits were imitated by those competitors, and how history remembers their status. Here are some products that left their mark on the PC industry, whether in the form of full systems, CPUs, graphics cards, motherboards, cases or peripherals.
The very first transistor -- the foundational building block from which almost all of modern civilization was built -- was created at AT&T’s Bell Labs on December 23, 1947. As you can see above, this first transistor was huge and looked nothing like the millions...
You pay $60 for many of the new games you play, but how much does a blockbuster game cost to make? Even many in the industry don't know; it's not unusual for a developer working on a big-budget game to have no idea of its budget. To answer the question, we've pulled scattered data from public sources in a first attempt to get a comprehensive sense of what the world's biggest and most expensive games cost.
Twenty-thirteen has been an eventful year, marked by a changing landscape in the computing world, product refreshes in the mobile scene, a new generation of consoles, the rise of Bitcoin, and the harsh realization that pretty much nothing you do online (and sometimes offline) is private. Our yearly recap of the most relevant tech happenings awaits...
With DX10's arrival, vertex and pixel shaders retained a large degree of common functionality, so moving to a unified shader architecture eliminated a lot of unnecessary duplication of processing blocks. The first GPU to utilize this architecture was Nvidia's iconic G80.
Four years in development and $475 million produced a 681 million-transistor, 484mm² behemoth -- first as the 8800 GTX flagship and then with cards aimed at several segments. Aided by the new Coverage Sample anti-aliasing (CSAA) algorithm, Nvidia saw its GTX demolish every single competitor in outright performance.
With the turn of the century, the graphics industry bore witness to further consolidation. Where 3dfx was once a byword for raw performance, its strength before its demise lay in its full-screen anti-aliasing image quality. By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly (Nvidia and ATI), with both of them, in addition to Intel, supplying the vast majority of integrated graphics chipsets.
Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader).
Launched in November 1996, 3Dfx's Voodoo Graphics consisted of a 3D-only card that required a VGA pass-through cable from a separate 2D card to the Voodoo, which then connected to the display. Voodoo Graphics revolutionized personal computer graphics nearly overnight and rendered many other designs obsolete, driving a vast swathe of 2D-only graphics producers out of the market.
The 3D landscape in 1996 favoured S3, with around 50% of the market. That was soon to change, however: 3Dfx was estimated to account for 80-85% of the 3D accelerator market during the heyday of Voodoo’s reign. Nvidia would later recover with the RIVA series and eventually land its greatest success with the first GeForce graphics card.
The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of 32-bit operating systems and the affordable personal computer.
While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU.
As the year comes to an end, it's time to look back at the most interesting and relevant tech stories of 2012. Numerous trends consolidated during 2012: Apple’s dominance in the sector, mobile growth, fast-paced releases in the smartphone world, and the Windows 8 launch, to name only a few.
This year we have divided the stories into 12 heavily packed categories, with nearly 500 hand-picked headlines in total. Feel free to jump between your favored topics, but try not to miss the tech culture section, where we revisit some of the most entertaining stories we covered this year. Here’s our take on 2012…