DirectX 11 vs. DirectX 12 performance: Ashes of the Singularity benchmarked

Haven't there been articles showing bias toward AMD cards with this particular game?

Don't get me wrong, giving AMD an edge in DX12 is only going to push NVIDIA to get better, but having only ONE game with ONE benchmark is hardly enough information to declare a winner right now.

I'll check back when we have at least 5 titles that can use DX12.
 
Well that's fair, let's talk about how disappointing the FX series CPUs are in relation to the newest Intel offerings lol. I give this article a 5/10: some valuable information and a little stupidity. Sounds like the majority of our population. #yolo
If I may, please allow me to remove the rock you have been living under so I can bring you up to speed with what has been happening in the real world.

Yes, 3 years ago AMD released their latest and greatest Piledriver-based architecture with the FX-8350, some 7 months after Intel's Ivy Bridge Core i7-3770K, which, I might add, clock-for-clock isn't a great deal slower than the current 6700K.

In the 3 years that AMD has been touting its '8-core' four-module Piledriver FX processors we have tested dozens of AAA titles. Any title that has been remotely CPU dependent has seen the AMD FX series get lit up by Intel's Core i5, Core i7 and even Core i3 processors dating all the way back to 2011.

Apparent AMD fanboys have been claiming for years that this is all down to poor programming and that those DX11 games couldn't, or at least didn't, take full advantage of the FX processors' eight cores. They claimed with tightly clenched fists that DX12 would be Intel's undoing and that the FX series would finally reach its true potential.

The truth is exactly what we expected: even with greater core utilization, the weak IPC of the FX processors sees them still getting demolished by considerably more efficient Core i3s, though DX12 has helped to reduce the damage.

It certainly wasn’t unreasonable of us to look into the FX vs. Core battle and then comment on the results.

Finally, if we made the comparison between Piledriver and the older Sandy Bridge architecture the outcome would still be the same. Here's hoping Zen puts AMD back in contention, or, if we are allowed to dream a little, back out in front like in the good old days.
 
Moreover, unless you are using the latest Catalyst drivers, AMD's DX11 performance is very weak in Ashes of the Singularity, which we explained in our article. So with weak AMD DX11 performance and strong DX12 performance, it is obvious where the big DX11-to-DX12 gains come from.

So you are suggesting the other reviewers used old AMD drivers for testing their DX11, therefore their DX11 perf was so low, and therefore they had huge gain.... whereas your DX11 perf was much better and thus less gain? ok. sure... why not.

.... ahh what the hell I am just repeating the article, read it rather than just trying to troll!

tl;dr obviously. and I haven't even mastered the art of reading... I just watch graphs ... like a proper troll.
 

Yes, that is exactly what I am saying, and logic would also dictate that is the case. Like I said, other professional reviews, such as Anandtech's, have produced similar results, making your "Once again Techspot gets results that do not line up with anyone else..." comment stupid and somewhat trolly.

Three months ago PCper found that the Nvidia cards produced similar performance under DX11 and DX12, just as we did, though DX11 was faster. Back then AMD's DX11 performance was horrible and its DX12 performance was often on par with or better than Nvidia's (GTX 980 vs. R9 390X).

So their DX11 vs. DX12 testing using an Nvidia GPU matches ours, and our DX12 results match Anandtech's exactly using both AMD and Nvidia GPUs. Hmm, yes, that is strange.

Also, as I pointed out, there were cases where we saw a 40-60% performance boost when going from DX11 to DX12. It is very important that you note the quality settings used as well as the CPU and GPU combo. Handpicking some random guy's results, where he has tested just one setup ONCE, isn't the smartest way to make a case against a tech site that has done thorough testing across a range of GPUs, platforms and quality settings.
 
The AMD processors are comically slow.
Why do the fanboys throw themselves in front of any negative comment towards AMD? Like lemmings from a cliff!
 
Sorry, but this benchmark has one big problem.

The game and its optimization aren't finished. From what I heard, the game is far too resource hungry for what it delivers in graphical fidelity, and benchmarking a game that is still in development hardly tells us anything about what DX11 or DX12 is capable of.

It's the same with other games. You can benchmark Rome 2 all you want; the GPU performance is so weak because the game is such a mess.

"Finally, we have what looks to be a very accurate means of gauging DX12 performance to see just what it means for the future of PC gaming."

An early access game being a very accurate means of gauging performance? I might be a tech noob, but it doesn't take more than a tech noob to understand that something here is going awry.
 

Ashes of the Singularity has been in development for a very long time now and the developer has been working very closely with AMD and Nvidia on a daily basis. In its current form I would suggest the game is more polished than 70% of the AAA titles released in the past few years.

“From what I heard, the game is far too resource hungry for what it delivers in graphical fidelity”

It seems pretty impressive to me given the extreme volume of units; I can't think of a single game that runs that well with so much going on.
 
Which is why I wouldn't trust the state of the game one bit. This game is being pushed and marketed as the showcase of what DX12 can do and how much better it is than DX11. You can say the game runs better with DX12 than with DX11, but it cuts the other way as well: the worse the game runs under DX11, the better it makes DX12 look.

In game development, performance optimization is a very late-stage task.

All that said, I have no doubt that DX12 will bring a significant performance boost.

As you can see here, the game starts with a disclaimer.

Other games running well with massive unit counts? Hmm... Supreme Commander is the obvious one with its DX9, but a more recent example with DX11... Wargame, perhaps.

AFAIK the whole "thousands of units" thing is just a promise atm. I couldn't find footage of tens of thousands of units. Here it is from their FAQ:

"
How is the player expected to manage thousands of units?
Just as a military general must manage entire armies, the Ashes user interface allows players to easily take units and build "meta-units" that act together as a single, coherent, massive unit."

Why someone would want more than 1,000 units on screen is beyond me too. Anything more massive than what Supreme Commander has done is hardly possible for a single player to comprehend and manage, so they morph huge groups of units into a single one.
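
For what it's worth, what the FAQ describes is essentially the composite pattern. Here's a purely illustrative C++ sketch of how such a meta-unit grouping might be structured; none of these names (Unit, MetaUnit, MoveTo) come from the actual game code:

```cpp
#include <string>
#include <vector>

// Illustrative sketch of the "meta-unit" idea from the FAQ: thousands of
// individual units grouped behind a single handle, so the player issues
// one order and it fans out to every member.
struct Vec2 { float x = 0.0f, y = 0.0f; };

struct Unit {
    Vec2 position;
    void MoveTo(Vec2 target) { position = target; } // placeholder "pathing"
};

struct MetaUnit {
    std::string name;
    std::vector<Unit*> members; // non-owning; units live elsewhere

    // One command controls the whole group, however large it is.
    void MoveTo(Vec2 target) {
        for (Unit* u : members)
            u->MoveTo(target);
    }
};
```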
 
Same deal, different day. I bought a new video card every 8 months from 1994 to 2004. When they tripled the price of cards to gain 5% of visible difference, I bailed out. So that's eleven years of lost sales so far. Now I come to see what's what and it is exactly the same: talk of 5% performance gains, ad nauseam, virtually invisible when playing the game. Whole articles devoted to ambiguity and obfuscation in yet another pathetic attempt to get people to shell out a few hundred just to play the latest marginally better game. Truth is, every one of these tech companies is led by people unqualified to develop high-quality products that deliver a significant performance change.
 

</rant> :)

Which is why I wouldn't trust the state of the game one bit. This game is being pushed and marketed as the showcase of what DX12 can do and how much better it is than DX11. ...

I don't disagree with you; the game is pre-beta and I said this is a preview. So far it is the best example of DX12 performance that we have.
 
Two trends:

1: With DX12, GPU manufacturers can no longer improve performance via driver updates the way they could under DX11. As a result, both NVIDIA and AMD are going to be VERY skittish about making architecture changes going forward, since any major change could tank performance in legacy titles.

2: DX12 isn't a magic bullet for weak CPU performance.
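
To make trend #1 concrete: under DX12 the application records and submits GPU work itself, leaving the driver far less room to quietly rearrange things after a game ships. A minimal sketch of that explicit submission path, assuming the queue, allocator and command list have already been created (error handling omitted):

```cpp
#include <d3d12.h>

// Under DX11 the driver batches and reorders work behind the scenes; under
// DX12 the application records and submits command lists explicitly, which
// is why driver updates have far less room to "fix" performance afterwards.
void SubmitFrame(ID3D12CommandQueue* queue,
                 ID3D12CommandAllocator* allocator,
                 ID3D12GraphicsCommandList* cmdList)
{
    allocator->Reset();                   // reclaim memory from a finished frame
    cmdList->Reset(allocator, nullptr);   // begin recording (no initial PSO)
    // ... record draw calls, resource barriers, etc. here ...
    cmdList->Close();                     // finish recording
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists); // the app, not the driver, decides when
}
```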
 
Same deal, different day. I bought a new video card every 8 months from 1994 to 2004. When they tripled the price of cards to gain 5% of visible difference, I bailed out. ...

I blame the Devs and Consoles. In my opinion there hasn't been a proper breakthrough in PC graphics since Crysis, and that was in 2007. I have seen many good looking games in the meantime but none of them made my jaw drop in that regard.
 
There is a lot more to DX12 than just async compute (unless you are talking to/about AMD that is). It will be interesting to see what happens with other styles of games.

There is, but async offers a serious performance boost, so lots of people care about it. The thing about this game is that most of the gains come from simply reducing driver overhead for AMD; async is an afterthought. So if we are seeing Nvidia's performance drop and AMD's most popular DX12 feature not having a major impact, one would think these numbers could change drastically later on.
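
For anyone wondering what "async compute" actually means at the API level, it is little more than a second command queue of type COMPUTE whose work can overlap the graphics queue on hardware that supports concurrent execution. A minimal sketch (error handling omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// "Async compute" in a nutshell: a second queue of type COMPUTE, separate
// from the graphics (DIRECT) queue, so compute work can run alongside
// rendering instead of serializing behind it.
ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;      // not DIRECT
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}
```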
 
Really good article.
Maybe, just maybe, it would be interesting to add some Vulkan to the mixture once it's out.
 
Great article Steve! Like you said, the game is in pre-beta and this article is a preview, but it gives us an idea of what to expect. Maybe, hopefully, when the game is finally released out of beta we will see even better DX12 performance.

I am not going to get into the AMD vs. Nvidia debate. It seems a hopeless debate between the two sides...

What I would like to know is what happened to the idea (and rage) that DX12 was going to enable GPU RAM stacking? Did that idea drop out of a window somewhere, or has that myth been debunked?
 
Well that's fair, let's talk about how disappointing the FX series CPUs are in relation to the newest Intel offerings lol. ...

Considering how old the FX-8350 is, it does well to compete with the latest Skylake Core i3 processors.
I see that in DX12 at 1440p the old FX is only a couple of fps behind the newer, more expensive i3.
I pray that AMD will release a driver to finally take better advantage of its CPU and GPU combos.
Have a nice day.
 
1. I am really happy to see the Nvidia GTX 980 Ti winning, as that is the GPU I purchased for my tower a couple of months ago.

2. I noticed you did not do any tests using SLI or CrossFire. I know there are very few if any systems out there with 4-way or 3-way SLI, but there are a lot with 2-way SLI, including 2-way SLI using 980 Ti's. Considering you indicate DirectX 12 is supposed to be doing low-level talking to the GPUs, and it is strong on parallelism, it seems there might be significant gains using SLI or CrossFire.

3. Is anyone doing tests for DX12 using multi-processor GPUs or multiple GPUs? I would really like to see those numbers. (A sketch of how DX12 exposes multiple GPUs follows after this post.)

I am still on Windows 7 because Windows Media Center is not supported in Windows 10. I know DX12 will not be available for Windows 7, but if it supports multiple GPUs with a significant improvement, that would be a reason to consider migrating. (I am starting to test out Kodi and NextPVR. I have 4 TV tuners so I can record and watch all the shows I want, even when 2 or more air at the same time. It is much cheaper than a DVR, especially since there are no subscriptions to keep paying, and I can add storage as I please: currently at 13TB and planning on buying a couple of 8TB drives for even more.)
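
On points 2 and 3: DX12 replaces driver-managed SLI/CrossFire with what Microsoft calls explicit multi-adapter, where the game itself enumerates the GPUs and divides work between them, which is exactly why nobody can promise yet how much it will help. A minimal sketch of the enumeration step using the standard DXGI calls (error handling omitted):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

// DX12's "explicit multi-adapter" starts here: the application enumerates
// every GPU itself and decides how to divide work between them, instead of
// relying on a driver-managed SLI/CrossFire profile.
std::vector<ComPtr<IDXGIAdapter1>> EnumerateGpus()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<IDXGIAdapter1>> adapters;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        adapters.push_back(adapter); // includes the software (WARP) adapter
    }
    return adapters;
}
```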
 
</rant> :)

I was hoping for a more technical response :D. One striking difference from the video-card-buying, DirectX-anticipating, new-and-improved-game-awaiting hardware/software purchases of the 1990s (you know, the ones that lifted the entire US economy) was the expectation that there would be a meaningful performance increase. These articles attempt to dissect microscopic gains to see if they are even REAL. The next card that the millions of us who bailed out in the mid-2000s will buy will drive 3D holographic headsets on all three axes at 20,000 x 20,000 dpi with integrated 7.1 sound, and cost $89 for the whole setup. The reason this is not happening is that the bungling engineer-operated GPU/CPU card makers are so full of tiny thinking that it has made its way to a smaller and smaller portion of customers. Only an engineer would nickel-and-dime their product out of business while simultaneously overcharging for insignificant gains.
 
In my opinion there hasn't been a proper breakthrough in PC graphics since Crysis, and that was in 2007. ...
Three years earlier they drastically increased the price of video cards, and developers knew there was virtually no one who would overpay for a card just to play whatever they developed. It is all down to AMD and Nvidia's awful engineering-based "management" decisions.
 