Early Fable Legends' DirectX 12 performance is similar to current games

Scorpus


Many people are hoping that DirectX 12 will bring better performance in PC games regardless of the circumstances, thanks to its low-level, low-overhead capabilities. Unfortunately, this isn't always going to be the case, as the latest DirectX 12 benchmark has revealed.

Our friends over at The Tech Report recently got their hands on an early preview of Lionhead Studios' upcoming game Fable Legends, which is built using Unreal Engine 4 and supports DirectX 12. This early preview build includes a graphically impressive benchmark that tests the engine and DirectX 12's performance in a 3D-heavy role-playing game scenario.

The Tech Report tested a variety of cards from AMD and Nvidia, as well as a variety of CPU configurations, and found that the game running in DirectX 12 mode (the only mode available at this stage) produced performance figures that look just like any other current DirectX 11 game.

For starters, the Nvidia GeForce GTX 980 Ti delivered the best performance at both 1080p and 4K, which differs from an earlier DirectX 12 benchmark of Oxide Games' Ashes of the Singularity that showed better-than-usual performance for AMD's hardware.

In Fable Legends, AMD's Hawaii-based cards actually delivered performance relatively close to the newer and supposedly more powerful Fury cards. The R9 390X in particular looks like a very attractive option in this game when placed against the GTX 980, considering the price of both cards.

When the benchmark was performed on a CPU-limited platform, performance increased as more cores and threads were added, up until the system reached four cores and four threads, after which performance plateaued. However, at 1080p with ultra settings enabled (settings that gamers would typically use), the game became GPU-limited, which resulted in less dependence on CPU cores and made less use of DirectX 12's strengths.

In general, performance in Fable Legends using DirectX 12 looks very similar to what you'd get from current non-DX12 games, although at this stage it's not clear what advantage, if any, DirectX 12 provides over DirectX 11.

If you're interested in a closer look at Fable Legends' DirectX 12 performance, check out the full report from The Tech Report here.


 
This is to be expected. It's still early days for DX12 and the game coding is probably still crappy. It'll improve as DX12 matures and game coders get used to it.
 
Computer technology-related expectations claim yet another victim...
Part of me is just LOLing... hard.
 
This is to be expected. It's still early days for DX12 and the game coding is probably still crappy. It'll improve as DX12 matures and game coders get used to it.

I'm a little bit more on @Skidmarksdeluxe's side. If you think this is a fail, maybe you don't know much about programming. If you think a compiler will do the magic for you simply because it supports an API level, you're delusional. If DX12 allows more bare-metal instructions, it would be similar to using __asm__ in C for those routines. That's on the programmer's side, not the compiler's.

Low-level code is not trivial, and not everyone wants to touch it, or use it everywhere. If these developers aren't using the low-level features often enough to make a difference in performance, it won't magically happen.
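
To put the __asm__ analogy in concrete terms, here's a minimal C sketch (assuming GCC or Clang on x86-64; the example is purely illustrative and nothing in it is DX12-specific). The compiler happily builds the plain version on its own, but the hand-written instruction in the second routine only exists because the programmer wrote it:

#include <stdint.h>
#include <stdio.h>

/* Sketch of the __asm__ analogy above (hypothetical, assumes GCC/Clang on
   x86-64). The plain-C routine lets the compiler choose the instructions;
   the second routine only goes "bare metal" because the programmer spelled
   the instruction out by hand. */

static uint64_t add_plain(uint64_t a, uint64_t b)
{
    return a + b;                      /* compiler picks the code */
}

static uint64_t add_asm(uint64_t a, uint64_t b)
{
    uint64_t result = a;
    __asm__ volatile ("addq %1, %0"    /* programmer picks the instruction */
                      : "+r" (result)
                      : "r" (b));
    return result;
}

int main(void)
{
    printf("%llu %llu\n",
           (unsigned long long) add_plain(2, 3),
           (unsigned long long) add_asm(2, 3));
    return 0;
}

Supporting a low-level path and actually writing one are two different things, which is the whole point with DX12 adoption.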
 
Why 4K? The 980 Ti ran it under 100fps at 1080p and was the fastest, so there's no need for 4K. And 37fps at 4K? I play Ark at HD-ready resolution to get 60fps instead of 40+fps at 1080p, and if that weren't enough I'd drop to however low I needed to hit a minimum of 60. You need at least two of any GPU to play new games at 4K. Good thing there was a link to the 1080p results, so we got some data on actual performance.
 
At first everyone magically expected improved performance regardless, but a lot of people have finally come to understand that performance increases will usually only appear when a game is CPU-heavy/bound. I don't think the Fable performance findings came as much of a surprise to anyone.

We are just now seeing DX11 become standard for PCs. Consoles are still stuck in 2009.
Yeah... but consoles have had their own low-level API standards for a few generations or more, so they beat us to widespread use of that a while ago. Not to mention, the XB1 uses DX11.2 (or 11.3?) and the PS4 uses its own equivalent on the OpenGL side of things, so bringing up the previous generation is irrelevant. Before now, the PC had only a few handfuls of games using DX11, which is not much of a bragging right after 4 years.
 
It's most likely because it's a console port, and like all console ports the game runs like crap on PC. We'll have to see if the final version works better.

BTW, from the benchmarks it seems the 290X is the best one you can buy right now. Performance is very close to the 980 at around $250.
 
Where are all those people who claimed that AMD was going to be king because they perform better on DX12? Still too early to tell, but at least now we see that it might actually be the opposite...
 
Where are all those people who claimed that AMD was going to be king because they perform better on DX12? Still too early to tell, but at least now we see that it might actually be the opposite...

If a game isn't made to use specific new features of DX12, there's not much difference. However, if you actually use and optimize your game to take advantage of the new near-metal stuff in the DX12 API, the story is different.

Also, at this point we have no clue how this same game would perform in DX11 - maybe it's like 30fps at 1080p and 15fps at 4K when run on a 980 Ti. We do not know yet.

Also, DX12 gives you the option to put much more stuff on your screen => if this has been used in this demo, then you can't even get a similar look in DX11.

Why 4K? The 980 Ti ran it under 100fps at 1080p and was the fastest, so there's no need for 4K. And 37fps at 4K? I play Ark at HD-ready resolution to get 60fps instead of 40+fps at 1080p, and if that weren't enough I'd drop to however low I needed to hit a minimum of 60. You need at least two of any GPU to play new games at 4K. Good thing there was a link to the 1080p results, so we got some data on actual performance.

+1. Techspot is obsessed with 4K for some unknown reason... I don't understand it either. It's like being obsessed with colonizing Europa (Jupiter's moon) when we haven't even reached Mars yet.
 
Not sure why you quoted my post... it has nothing to do with what you replied to... and 4K benchmarks are important - 4K will be the new standard for high-end systems, and there are already people (me included) who enjoy gaming at 4K.

When looking at how well a $600+ card performs, you need to be testing the highest resolutions! That's why I was actually more confused by the testing of this benchmark at low resolutions - if you buy a Fury or 980 Ti, you'd better be gaming at AT LEAST 1080p, otherwise you wasted your money!
 