Top 10 Most Significant AMD GPUs of All Time
There's plenty of graphics history and technology to unpack here. When AMD purchased ATI, it didn't just absorb the company; it carried on ATI's reputation as a graphics powerhouse for years to come.
Many of the breakthroughs made since the advent of the transistor were simply inconceivable a century ago, but what makes tech culture even more interesting are the anecdotes and fun facts that came along the way.
The Hitman series helped define the modern stealth action game, with its formula evolving and building a large following over 20 years. Here's a recap of the key titles featuring genetically engineered professional hitman Agent 47.
In our last installment of the history of the modern graphics processor, we had reached a point where the market consisted of just three competitors: AMD, Intel, and Nvidia. However, in the following years, graphics processors became one of the largest, most complex, and most expensive components that could be found in almost any computing device.
Who would have guessed that stealthily scaling a building with a trusty hidden blade would become an annual tradition? Back in 2007, the exploits of Altair in Assassin's Creed were practically revolutionary.
One of the greatest tools to be spun out of AOL was its instant messaging client, affectionately known as AIM. Released in the spring of 1997, it allowed users to register an online handle, create buddy lists, and chat with friends in near real-time.
What began life as a tech demo has endured for over 16 years, producing 14 games (including Far Cry 6) that have sold over 50 million copies and spawned an utterly atrocious movie. This is the story of Far Cry.
Few companies can brush off failure and come back stronger like the world's most famous gaming company. Nintendo has been synonymous with video games for decades, but things haven't always been rosy for the Japanese giant.
AMD has beaten Intel on performance before, but wins against the chip giant have been rare over the years. Furthermore, every time Intel looked inferior, it responded swiftly and effectively.
Call of Duty has made everlasting memories for millions of players around the world, including an entire generation of gamers who've spent their adolescent years with the franchise and continue to stick with it to this day.
Will history repeat itself? Intel is setting out to make a name for itself in the discrete GPU space with its upcoming Xe-HP GPU lineup. We look back at Project Larrabee, the last time Intel tried making a graphics card, to understand how things might turn out.
For many, the 1980s was the golden era of home computing. Among the new companies fighting for the market was Sinclair, which made cheap, basic machines but helped give rise to the world of bedroom programming and to game developers like Rare, Codemasters, and Rockstar North.
#ThrowBackThursday The PC business as we know it owes its existence to an environment of enthusiasts, entrepreneurs, and happenstance. The invention of the microprocessor, DRAM, and EPROM integrated circuits helped bring computing to the mainstream. This 5-part series explores the history of personal computing, from the invention of the transistor to the modern chips powering our connected devices.
When it comes to graphics cards, more is nearly always better. More shaders, more RAM, more bandwidth. So how about more GPUs? Here's a brief stroll through the story of multi-GPU graphics cards.