Intel has been beating AMD on every front but price for a couple of generations now, as the Bulldozer microarchitecture and its descendants have faced an unpleasant uphill climb. Power consumption and performance per clock have both taken their toll. However, we took a couple of AMD's most popular chips for a test drive and found that things aren't anywhere near as bad as benchmarks might lead you to believe. Quite the opposite, actually.
Enthusiasts have been pushing the limits of silicon for as long as microprocessors have existed. Early overclocking endeavors involved soldering and replacing crystal clock oscillators, but evolving standards brought options for changing system bus speeds, while some of the most daring would gain boosts through hard modding. These are but a few of the landmark processors revered for their overclocking prowess.
Older CPUs would simply fail if they started to overheat, but modern CPUs adjust their frequency based on temperature (among other factors) to prevent a dramatic failure. Because of this, it stands to reason that once you reach a certain temperature, you will no longer get maximum performance from your CPU because it will be busy protecting itself. But what is that temperature? And do you really need a high-end liquid-cooled system to get peak performance?
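The throttling behavior described above can be sketched as a toy model. Everything here is an assumption for illustration: the 100 °C throttle point, the 3.9 GHz boost and 3.5 GHz base clocks, and the linear 100 MHz-per-degree reduction are all hypothetical values, not any real CPU's firmware policy (which is far more complex).

```python
# Toy model of thermal throttling: below an assumed temperature limit the
# CPU runs at full boost; above it, the clock sheds 100 MHz per degree
# until it bottoms out at the guaranteed base clock. All values are
# hypothetical and chosen only to illustrate the shape of the behavior.

THROTTLE_TEMP_C = 100.0   # assumed junction temperature limit
BOOST_CLOCK_GHZ = 3.9     # assumed maximum boost frequency
BASE_CLOCK_GHZ = 3.5      # assumed guaranteed base frequency

def effective_clock(temp_c: float) -> float:
    """Return the modeled clock speed (GHz) for a given core temperature."""
    if temp_c < THROTTLE_TEMP_C:
        # Below the limit, the CPU is free to run at full boost.
        return BOOST_CLOCK_GHZ
    # Above the limit, reduce 0.1 GHz per degree, never below base clock.
    reduced = BOOST_CLOCK_GHZ - 0.1 * (temp_c - THROTTLE_TEMP_C)
    return max(reduced, BASE_CLOCK_GHZ)

for t in (70, 95, 100, 102, 110):
    print(f"{t} C -> {effective_clock(t):.1f} GHz")
```

The takeaway the model illustrates: past the limit, extra heat directly costs clock speed, which is why cooling headroom translates into sustained performance.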
Ubisoft's new action-driving MMO has 7,000 miles of roads, 15 cities and 15 million individual objects. The Crew has been designed for the latest generation consoles and PCs, though the former are capped at 30fps/1080p while our preferred platform ought to look and feel better with a 60fps cap and a higher res. Here's our full PC benchmark test.
Intel's existence traces back to the breakup of Shockley Semiconductor and Fairchild Semiconductor. Determined to avoid the same fate, the company turned lawsuits into object lessons for employees, a means of protecting its IP, and a method of tying up competitors' financial resources. This is the fourth installment in a five-part series where we look at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips.
IBM's stature guaranteed that the PC would bring about the level of standardization required for a technology to attain widespread usage. That same stature also ensured competitors would have unfettered access to the technical specifications of the Model 5150. This is the third installment in a five-part series where we look at the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips powering our connected devices.
The personal computing business as we know it owes its existence to an environment of enthusiasts, entrepreneurs and happenstance. The invention of the microprocessor, DRAM, and EPROM integrated circuits would help bring computing to the mainstream. This is the first in a five-part series exploring the history of the microprocessor and personal computing, from the invention of the transistor to modern day chips powering our connected devices.
Already one of the most iconic and atmospheric first-person shooters around, Metro has received some post-release polish that should present a greater challenge for today's GPUs. Metro Redux features improved versions of both Metro 2033 and Metro: Last Light, including completely remastered visuals.
Intel's Extreme Edition processor line is over a decade old now, starting way back in 2003 with the single-core Pentium 4 EE 3.4GHz. Fast forward to today, and the chip we'll be looking at boasts eight cores, a massive 20MB smart cache, and support for the latest DDR4 memory. It's accompanied by the new X99 chipset, which offers more SATA 6Gb/s ports (10 rather than just two) and finally brings native USB 3.0 to Intel's flagship platform.