DirectX 11 vs. DirectX 12 performance: Ashes of the Singularity benchmarked

Steve

Staff member

Windows 10 has been out for a few months now and with it came a number of improvements that make it Microsoft's best OS yet. However, one of the operating system's biggest features has yet to be seen in action: DirectX 12, a key ingredient for PC gamers. The long wait, though, is coming to an end.

Finally, we have what looks to be a very accurate means of gauging DX12 performance to see just what it means for the future of PC gaming. Recently Stardock provided gamers with Steam Early Access to Ashes of the Singularity, one of the first games to use DX12.

Ashes of the Singularity could be described as a war across an entire world without abstraction. Thousands or even tens of thousands of individual actors can engage in dozens of battles simultaneously. The built-in benchmark tool that we used was initially designed as a developer tool for internal testing and therefore does an excellent job of reproducing the conditions gamers can expect to find when playing Ashes.

Read the complete article.

 
Great review! For now DX12 seems more hype than a real performance improvement over DX11. I'm quite disappointed that Nvidia did not include hardware-based asynchronous compute despite knowing that DX12 was coming.
 
Great review! Thanks Techspot!
I'm curious when big game titles are going to support DX12 with their next releases. Do you guys know when we can expect AAA titles released with DX12 support?
I don't remember how long this process was with DX11...
 
Weird and unexpected results. Once again TechSpot gets results that do not line up with anyone else's...
next to no difference between the DX11 and DX12 results... ±3 fps at best... ~5%


1.
http://www.techpowerup.com/forums/t...shes-of-singularity-benchmark-r9-280x.217209/
he tests with completely different hardware (Xeon and 280X) ... and resolution (1680x1050)... but...
high settings DX11 => DX12 gain 41.3%
low settings DX11 => DX12 gain 86.6%


2.
he tests i5-4690K + Nano and gets:
DX11: average 32.6 fps
DX12: average 47.5 fps
gain: 45.7%

A10-7870K + something... Nano?
DX11: average 22.5fps
DX12: average 42.9fps
gain: 90.6%


So can anyone explain to me what they do differently that makes TechSpot's DX11 => DX12 gain ten times smaller??
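For what it's worth, the percentage gains quoted above follow from the usual formula, gain = (DX12 − DX11) / DX11 × 100. A quick sketch (the fps figures are simply the ones quoted in this thread):

```python
def dx12_gain(dx11_fps: float, dx12_fps: float) -> float:
    """Percentage improvement of the DX12 average over the DX11 average."""
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

# i5-4690K + Nano figures quoted above
print(round(dx12_gain(32.6, 47.5), 1))  # -> 45.7

# A10-7870K figures quoted above
print(round(dx12_gain(22.5, 42.9), 1))  # -> 90.7 (the post rounds to 90.6)
```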
 
Great review! Thanks Techspot!
I'm curious when big game titles are going to support DX12 with their next releases. Do you guys know when we can expect AAA titles released with DX12 support?
I don't remember how long this process was with DX11...

There are plenty of great looking DX12 games on the horizon. How well they will take advantage of DX12 compared to titles that will follow later is something only time will tell.

Weird and unexpected results. Once again TechSpot gets results that do not line up with anyone else's...
next to no difference between the DX11 and DX12 results... ±3 fps at best... ~5%

1. http://www.techpowerup.com/forums/t...shes-of-singularity-benchmark-r9-280x.217209/
he tests with completely different hardware (Xeon and 280X) ... and resolution (1680x1050)... but...
high settings DX11 => DX12 gain 41.3%
low settings DX11 => DX12 gain 86.6%

2.
he tests i5-4690K + Nano and gets:
DX11: average 32.6 fps
DX12: average 47.5 fps
gain: 45.7%

A10-7870K + something... Nano?
DX11: average 22.5fps
DX12: average 42.9fps
gain: 90.6%

So can anyone explain to me what they do differently that makes TechSpot's DX11 => DX12 gain ten times smaller??

Your trolling is getting boring now. You obviously haven't looked at any of the other professional Ashes of the Singularity benchmarks and you are struggling to understand ours.

Anandtech: GTX 980 Ti @ 1440p [DX12] = 45.6fps
TechSpot: GTX 980 Ti @ 1440p [DX12] = 46fps <- rounded up from 45.8fps

Anandtech: Fury X @ 1440p [DX12] = 42.4fps
TechSpot: Fury X @ 1440p [DX12] = 43fps <- rounded up from 42.5fps

FYI we saw almost a 40% gain on the Intel side when going from DX11 to DX12 using a Radeon graphics card and almost 60% for AMD. This was shown in the article, go to page 5 and remove your goggles.

The DX11 vs. DX12 results depend very heavily on the in-game quality settings being used as well as the CPU and GPU combo. Where we are seeing big gains is with lower quality settings, not the crazy quality settings that we did all our primary GPU testing with.

Moreover, unless you are using the latest Catalyst drivers, AMD's DX11 performance in Ashes of the Singularity is very weak, as we explained in our article. So with weak AMD DX11 performance and strong DX12 performance it is obvious... ahh, what the hell, I am just repeating the article. Read it rather than just trying to troll!
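The point about quality settings can be sketched with a toy frame-time model: the delivered frame rate is limited by whichever of the CPU or GPU takes longer per frame, so trimming driver overhead only shows up when the CPU is the bottleneck. All timings below are invented purely for illustration.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when the slower of the CPU or GPU gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Crazy quality settings: GPU-bound, so halving CPU time barely helps.
print(round(fps(cpu_ms=12.0, gpu_ms=25.0), 1))  # "DX11" -> 40.0
print(round(fps(cpu_ms=6.0,  gpu_ms=25.0), 1))  # "DX12" -> 40.0

# Low quality settings: CPU-bound, so the same CPU saving nearly doubles fps.
print(round(fps(cpu_ms=12.0, gpu_ms=5.0), 1))   # "DX11" -> 83.3
print(round(fps(cpu_ms=6.0,  gpu_ms=5.0), 1))   # "DX12" -> 166.7
```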
 
Adding fuel to the fire Oxide claims that Nvidia pressured them not to include the asynchronous compute feature in their benchmark, removing the GeForce 900 series disadvantage from the equation when competing against AMD's DirectX 12 compliant GCN architecture.

That's the part that got my attention. If I were Oxide I would have included it anyway just to pressure nVidia to fix that ****.
 
Adding fuel to the fire Oxide claims that Nvidia pressured them not to include the asynchronous compute feature in their benchmark, removing the GeForce 900 series disadvantage from the equation when competing against AMD's DirectX 12 compliant GCN architecture.

That's the part that got my attention. If I were Oxide I would have included it anyway just to pressure nVidia to fix that ****.

Well they did and Nvidia (with their help) did ;)
 
If I'm understanding this correctly, and please correct me if I'm not, DX12 games are less driver-dependent and communicate in a more direct way to the GPU?
 
If I'm understanding this correctly, and please correct me if I'm not, DX12 games are less driver-dependent and communicate in a more direct way to the GPU?

Exactly, that is correct. There are other advantages as well, such as new rendering technologies/methods and better, more flexible multi-GPU support. That said, the low-level API design is the biggest thing DX12 brings to the table for PC gamers.
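For readers curious what "less driver-dependent" means in practice, here is a very loose conceptual sketch in plain Python (not real Direct3D code; every class and method name is invented for illustration): under DX11 the driver does hidden work on every draw call, while under DX12 the application records command lists itself and submits them explicitly to a queue.

```python
class DX11Context:
    """DX11-style: each draw call goes through the driver immediately."""
    def __init__(self):
        self.gpu_work = []

    def draw(self, mesh):
        # The driver validates state and schedules submission on every call.
        self.gpu_work.append(("validate", mesh))
        self.gpu_work.append(("submit", mesh))

class DX12CommandList:
    """DX12-style: the app records commands; nothing runs yet."""
    def __init__(self):
        self.commands = []

    def draw(self, mesh):
        self.commands.append(("draw", mesh))  # merely recorded

class DX12Queue:
    def __init__(self):
        self.gpu_work = []

    def execute(self, command_list):
        # One explicit submission for the whole batch of recorded work.
        self.gpu_work.extend(command_list.commands)

# DX12-style usage: record everything up front, then submit once.
cl = DX12CommandList()
for mesh in ["tank", "terrain", "dreadnought"]:
    cl.draw(mesh)
queue = DX12Queue()
queue.execute(cl)
print(len(queue.gpu_work))  # -> 3
```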
 
The results are so close, and somehow in favor of Nvidia, because:
1. As Steven honestly disclosed, no asynchronous computing " Adding fuel to the fire Oxide claims that Nvidia pressured them not to include the asynchronous compute feature in their benchmark, removing the GeForce 900 series disadvantage from the equation when competing against AMD's DirectX 12 compliant GCN architecture."
2. For testing on a 64-bit operating system, 8GB seems a little low. By comparison, other tests were conducted with 24 and 32GB of RAM respectively.
Besides that, I really would like to see some mid-range cards in CrossFire or SLI configurations, and some testing that covers a wider range, from low resolutions to multi-monitor setups.
 
The results are so close, and somehow in favor of Nvidia, because:
1. As Steven honestly disclosed, no asynchronous computing " Adding fuel to the fire Oxide claims that Nvidia pressured them not to include the asynchronous compute feature in their benchmark, removing the GeForce 900 series disadvantage from the equation when competing against AMD's DirectX 12 compliant GCN architecture."
2. For testing on a 64-bit operating system, 8GB seems a little low. By comparison, other tests were conducted with 24 and 32GB of RAM respectively.
Besides that, I really would like to see some mid-range cards in CrossFire or SLI configurations, and some testing that covers a wider range, from low resolutions to multi-monitor setups.

1. Sorry if that was misleading but the game and benchmark do feature asynchronous computing. Nvidia pressured Oxide to remove it from the benchmark months ago and they refused. Therefore, Nvidia was forced to actively work with Oxide to implement DirectX 12 Async Compute support for the GeForce 900 series GPUs in Ashes of the Singularity.
2. That was a typo; we used a 16GB kit. FYI, for 99% of the benchmark the system memory usage sat between 4 and 5GB, and only spiked to 7GB for a brief second when loading the benchmark for the first time.

We probably won't do any more in-depth testing until the game is released. This was certainly just a preview from us.
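The async compute point above can be put in back-of-envelope terms (made-up timings, purely illustrative): when a GPU can run compute jobs alongside graphics work, a frame costs roughly max(graphics, compute) instead of graphics + compute.

```python
# Hypothetical per-frame workloads, in milliseconds.
graphics_ms, compute_ms = 20.0, 6.0

serial_ms = graphics_ms + compute_ms     # no async compute: work runs back to back
overlap_ms = max(graphics_ms, compute_ms)  # async compute: work runs concurrently

print(round(1000.0 / serial_ms, 1))   # -> 38.5 fps
print(round(1000.0 / overlap_ms, 1))  # -> 50.0 fps
```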
 
Interesting article. Got a bit confused when the colours for DX12 and DX11 were switched around in the charts (I think they flipped between pages 3 and 4).

Looking at these results, I expected the console-like performance gains to amount to more than a couple of frames.
 
Good little preview of things to come, looks like Nvidia has some work ahead of them to get ready for DX12, probably in the Pascal batch of GPUs, but that's another 6-9 months away right now. Not like it's terribly important at the moment seeing how Ashes is the only title currently with DX12 support and it's just an early access build, not to mention the forced switch to Windows 10 to take advantage of it.

As has been said, we need a full comparison across multiple games with more cards tested in various configurations. What I'm really looking forward to seeing is how DX12 allows both brands to be used simultaneously; that's truly the game-changing moment.
 
Great preview, thanks!
The best news, at least for me, is that we won't need to wait for Nvidia and AMD to launch their drivers since it's all done by the game developers, am I wrong? That way they won't be able to cheat the benchmarks...
 
Great write-up, and excellent explanation of where each company (Nvidia and AMD) has had its areas of focus. Maybe I'm mistaken: isn't DX12 supposed to help support higher-core-count (6 and 8 core) CPUs? I am pretty surprised to see a Skylake Core i3 come out on top of AMD's 6- and 8-core offerings.
 
So I guess that until Zen launches, the new best buy CPU would be the i3 6100. Too bad the RAM and MOBO are still crazy expensive.
 
Well that's fair, let's talk about how disappointing the FX series CPUs are in relation to the newest Intel offerings lol. I give this article a 5/10: some valuable information, and a little stupidity. Sounds like the majority of our population. #yolo
 
Lol, yikes, you don't get it either. Well, off to work I go. Good day lads.
Oh I get it. The same argument can be used each and every time a new product is released and reviewed. It has to be compared to something, and it is not the reviewer's fault if there is nothing recent to compare it with. As for the negativity toward a 3-year-old AMD processor, why do you think that is a new development? The reviews weren't much different 3 years ago.
 
Fantastic job! This kind of article makes me miss the days when you actually had to install DX and try out many, MANY Detonator drivers to get perfect performance. Now everything comes automated and boring.

Good Read!
 