Wrap Up: Making Some Sense of the Results

Ashes of the Singularity looks to be an awesome RTS game and an exciting insight into what DX12 has to offer. Gamers should bear in mind that this is still just a single DX12 game and is unlikely to represent DX12 performance as a whole – no single game could. As is the case with DX11 gaming, it's likely that some DX12 games will favor AMD while others prefer Nvidia.

Putting the blinders on and looking specifically at Ashes of the Singularity, is Nvidia as doomed as AMD fanboys would have you believe? No, we wouldn't say so.

Although Nvidia does go backwards in DX12 as AMD goes forward, the margins are far from catastrophic. Take the GTX 980 Ti vs. Fury X battle at 1080p, for example: under DX11 the 980 Ti is 15% faster, while under DX12 that lead shrinks to just 2%. The same pattern was seen when comparing the GTX 980 and R9 390X.

There has been a lot of confusion surrounding Ashes of the Singularity's performance since the first benchmarks were released a few months ago by a select few tech sites. For example, Ars Technica's results paint a different picture than ours, showing that at 1440p the Radeon 290X gained over 40% performance when using DX12 instead of DX11, going from 28fps to 40fps.

Meanwhile, in our tests using a very similar GPU (the R9 390X), going from DX11 to DX12 resulted in only a 5% increase. The huge difference is explained by the fact that AMD has improved its DX11 performance in recent weeks and months, so the gap has been reduced considerably. Moving up to 4K, the margins become even narrower and the DX12 performance between the various GPUs looks very similar to what we've come to expect from a DX11 title.

It's worth noting that our GPU testing was conducted with the ultra-fast Core i7-6700K processor, which is why DX12 didn't show vast improvements over DX11 – there was plenty of processing power available. That said, when we turned to testing different CPUs, even with a Core i3 processor we had to reduce the graphics settings drastically to alleviate the GPU bottleneck.

The medium quality settings provided some rather shocking results for the AMD FX-8350, as it trailed the Core i3-6100 by a wide margin, somewhat akin to what we've seen in certain DX11 titles such as Thief. Also, the results seen with the R9 Fury X really highlighted AMD's DX11 driver overhead issue when paired with a weaker processor.

It's hard to determine exactly how much of an improvement DX12 makes over DX11 in Ashes of the Singularity.

Evidently, Nvidia graphics cards are of no help here as they go backwards. On the AMD side, the company hasn't necessarily optimized as heavily for DX11 in Ashes of the Singularity. That makes sense: AMD isn't pushing asynchronous shading to developers simply because its architecture is well suited for the task. It's doing so because it costs far less in driver research and development, as asynchronous shading allows the GCN architecture to approach full efficiency without heavy driver-side optimization.

Nvidia has heavily optimized its DX11 driver for Ashes of the Singularity, but there is little it can do to optimize its driver for DX12, as the game engine communicates almost directly with the GPU. The company is limited by its Maxwell architecture, which runs into a bottleneck when the game is programmed for heavy thread parallelism. Nvidia is dependent on game developers to make efficient use of Maxwell as best they can, so don't expect to see DirectX 12 driver improvements from team green.
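To make that "engine talks almost directly to the GPU" point more concrete, here's a minimal, hypothetical C++/Direct3D 12 sketch (not Oxide's actual engine code, and assuming a Windows build linked against d3d12.lib). It shows the mechanism behind asynchronous shading: the application itself creates a compute queue alongside the usual graphics queue and decides how work is split between them, leaving the driver far less room to reorganize things on the game's behalf.

```cpp
// Minimal sketch: an application creating its own graphics and compute queues in D3D12.
// This is the API mechanism behind "asynchronous shading", not any specific engine's code.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <stdexcept>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter; feature level 11_0 is enough for this sketch.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        throw std::runtime_error("D3D12CreateDevice failed");

    // Graphics ("direct") queue: accepts draw, copy and compute work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute queue: work submitted here may run concurrently with graphics work
    // on hardware that schedules it that way, which is where GCN's gains come from.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // From here the engine records command lists on its own worker threads, submits them
    // with ExecuteCommandLists() on each queue, and synchronizes using ID3D12Fence objects.
    // How the work is batched and interleaved is the engine's decision, not the driver's.
    return 0;
}
```

In DX11, by contrast, that scheduling largely happens inside the driver, which is exactly where per-game driver optimization used to pay off.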

For now, it seems DX12 is doing a decent job of leveling the playing field, and we can't wait to see more titles tapping the tech in the near future.