AMD surprised everyone last month by delivering Titan-like performance for nearly half the price with the R9 290X. However, before Nvidia can strike back, it'll have to eat another blow in the form of the new Radeon R9 290. At $400, the R9 290 offers fantastic value when you consider it still packs much of what made the R9 290X a GeForce killer.
With roots that stretch back more than a decade and enough fans to justify new content every year, Battlefield is among the handful of franchises that need no introduction around here. Even if you hate EA's approach to modern military madness, you can typically expect Battlefield's graphics to raise the bar. This year's release is no different, of course, having been built with an updated version of the Frostbite engine.
The GeForce GTX Titan blew us all away eight months ago with its mindblowingly fast GPU. The catch, of course, was that Nvidia wanted $1,000 for it. In a sense, the Radeon R9 290X could be considered AMD's Titan, as it takes the Tahiti architecture and stuffs it with nearly two billion more transistors. It's the most complex GPU AMD has created and, by no coincidence, it's also one of the most expensive. But before you click away, know that it costs "only" $550, which is substantially cheaper than Nvidia's solution.
AMD announced its next-generation Volcanic Islands GPUs last month at its GPU14 Tech Day event in Hawaii. In previous years, a new GPU generation arrived annually, which makes the Radeon HD 7000 series' long shelf life surprising, even more so considering the majority of the new RX 200 series cards are rebadges of existing HD 7000 products.
The RX 200 series will consist of the Radeon R7 240, R7 250, R7 260X, R9 270X, R9 280X and, later this month, the R9 290 and R9 290X. Confused yet? Well, let us try to clear a few things up.
Weekend tech reading: AMD Eyefinity vs. Nvidia Surround framerate analysis, first Apple A7 die shot, looking ahead at 'post-post-PC' tech
In January 2013, I revealed a new testing methodology for graphics cards that I dubbed Frame Rating. At the time, I was only able to talk about the process, which uses capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months started...