The biggest news for Mantle since it was announced as a method of improving game performance by using your CPU and GPU more efficiently has been support from DICE's Frostbite 3 engine (and by extension, Battlefield 4). Recently that support expanded to Eidos' Thief, while Crytek revealed at GDC 2014 that CryEngine will support it too. AMD says its latest update is of "tremendous benefit to a large cross-section of the gaming public," so we're keen to check it out.
World of Warcraft is considered a massive success, yet it's dwarfed by World of Tanks' 1 million concurrent players and 75 million total users. You can also find more than half a million people playing Dota 2 on any given day and League of Legends has over 7.5 million players online during peak hours.
While you may not need a Radeon R9 290X or a GeForce GTX 780 Ti to get the most out of these games, we're curious to see how hard those titles can push today's hardware.
Despite being built with the aging Unreal Engine 3, Thief touts some cutting-edge rendering techniques that have put the game on our radar. Thief's built-in benchmark appears to do a good job of demonstrating a worst-case performance scenario, so if your system can average 60fps in the benchmark, you should enjoy perfectly smooth gameplay from start to finish.
Marking the introduction of its Maxwell architecture, Nvidia has targeted AMD's $150 Radeon R7 265 with the new GeForce GTX 750 Ti. By using fewer cores to deliver more performance, Maxwell consumes less power and improves on Kepler's performance per watt. Does that mean AMD's newly relaunched Radeon R7 265 could be in trouble, considering it's essentially a slightly overclocked and steeply discounted HD 7850?
If you're looking to treat virtual coin mining as a hobby, Litecoins are probably the best bet right now, and we'll show you how to get started with choosing and configuring the hardware and software you'll need. Also note that we're aiming this article at PC enthusiasts who likely have spare hardware lying around, which separates our project from dedicated milk crate builds and seems like the most logical way for us to get started.
Gigabyte Radeon R9 290X OC & R9 290 OC Review: Immense potential lost to GPU shortages and inflated prices
AMD's Radeon R9 290 and 290X made a strong case against Nvidia's GTX 780 and Titan late last year, but that position soon weakened with unexpectedly high prices and limited options from board partners. This time we'll revisit the cards with actual production units from Gigabyte so we can weigh in on third-party performance at actual market prices.
Welcome back to TechSpot's 2013 Holiday Gift Guide! All of this year's tech gift recommendations are coming individually from our staff members and editors, with their own picks on gifts they would like to receive or give. Julio is our founder and executive editor; he's usually the one pulling the strings to make a certain review or feature happen on time, while providing content direction around the site day in and day out.
AMD surprised everyone last month by delivering Titan-like performance for nearly half the price with the R9 290X. However, before Nvidia can strike back, it'll have to eat another blow in the form of the new Radeon R9 290. At $400, the R9 290 offers fantastic value when you consider it still packs much of what made the R9 290X a GeForce killer.
With roots that stretch back more than a decade and enough fans to justify new content every year, Battlefield is among the handful of franchises that need no introduction around here. Even if you hate EA's approach to modern military madness, you can typically expect Battlefield's graphics to raise the bar. This year's release is no different, of course, having been built with an updated version of the Frostbite engine.
The GeForce GTX Titan blew us all away eight months ago with its mind-blowingly fast GPU. The catch, of course, was that Nvidia wanted $1,000 for it. In a sense, the Radeon R9 290X could be considered AMD's Titan, as it takes the Tahiti architecture and stuffs it with nearly two billion more transistors. It's the most complex GPU AMD has created and, by no coincidence, it's also one of the most expensive. But before you click away, that's "only" $550, which is substantially cheaper than Nvidia's solution.