Deus Ex: Mankind Divided Benchmarked, Performance Review

I knew the 390X would be as strong as the 980 Ti eventually. I find it absolutely hilarious that this article would look at the benchmark and say "Yeah, get 2 x Titans for 4K". Why? So they can lose to a Fury X in a year? smh...


The sad reality is that because AMD hasn't launched Vega yet, there is no card out now that can run this game as well as we would like.

P.S. Before people freak out about AMD bias, keep in mind the 1080 has about the same TFLOPS as the Fury X, so no one should be surprised they are performing so closely now that companies are finally taking advantage of AMD's shader power. LOL, they haven't even added DX12 yet. Bloodbath incoming when the RX 480 matches the 1070. Welcome to the new normal, people :D
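
For anyone who wants to check that claim, here's the back-of-the-envelope maths (peak FP32 = shaders × 2 FMA ops per clock × clock speed; I'm using the rated reference/boost clocks, so real-world boost behaviour will differ):

```python
# Theoretical peak FP32 throughput in TFLOPS
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(f"Fury X   (4096 SPs @ 1.05 GHz):   {tflops(4096, 1.050):.1f} TFLOPS")  # ~8.6
print(f"GTX 1080 (2560 cores @ 1.73 GHz): {tflops(2560, 1.733):.1f} TFLOPS")  # ~8.9
```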

Uh-huh. The old AMD fanboy mantra: "Just You Wait"... it's already been a bloodbath for quite a few years on one side of the fence - I guess that's why they're called the Red Team.
 
You can probably tweak some settings to improve performance while still maintaining quality.
 
Uh-huh. The old AMD fanboy mantra: "Just You Wait"... it's already been a bloodbath for quite a few years on one side of the fence - I guess that's why they're called the Red Team.

Erhhh what? I can honestly say that Maxwell was the first time in like 4 years that Nvidia has truly had a stronger flagship than AMD (besides the short-lived OG Titan). But even then it took, what, like 1 year for the Fury X to predictably emerge as superior? Even at launch they tied at 4K, and it took overclocking for the 980 Ti to win.
 
Erhhh what? I can honestly say that Maxwell was the first time in like 4 years that Nvidia has truly had a stronger flagship than AMD (besides the short-lived OG Titan). But even then it took, what, like 1 year for the Fury X to predictably emerge as superior? Even at launch they tied at 4K, and it took overclocking for the 980 Ti to win.
Well I suppose "quite a few years" sounds longer than four, but since then and into the near future they will still be the king (I'm pretty sure the Kepler based 780 Ti also edged out the R9 290X for most games, btw). When Vega arrives, we'll see... by that time though, I'd imagine the 1080 Ti will be out too. The bar will be set VERY high.

Hey, if AMD comes out with something as good or better, I'll happily consider looking their way for my next upgrade... unless it runs at 90 degrees C and sucks 450 watts. I'm just saying that right now, AMD fans are chasing the carrot, while the Nv gang is already at the dinner table, having one hell of a meal. So your "bloodbath" comment required a reply. Cheers!
 
@Steve

"Overclocking your Intel processor for better frame rates in Deus Ex: Mankind Divided is pointless."

Not according to the testing I've done:

http://steamcommunity.com/app/337000/discussions/1/352792037314627483/

Your testing certainly seems more conclusive ;)

No idea how you didn't see an improvement even with the low quality preset; that is an insane CPU bottleneck.

Use AfterBurner to provide an OSD for the system statistics; it will be easier to log data and possibly more accurate than your Logitech keyboard.
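
If a log file is more useful than an overlay, something as simple as this would also do the trick (Python with the psutil package; the one-second interval and file name are just arbitrary choices on my part):

```python
import csv, time
import psutil  # pip install psutil

# Append overall CPU and RAM usage to a CSV once per second; Ctrl+C to stop.
with open("usage_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "cpu_percent", "ram_used_gb"])
    while True:
        cpu = psutil.cpu_percent(interval=1.0)        # averaged over the last second
        ram = psutil.virtual_memory().used / 1024**3  # bytes -> GiB
        writer.writerow([time.strftime("%H:%M:%S"), round(cpu, 1), round(ram, 2)])
        f.flush()
```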
 
Why don't you test overclocking of the G3258? :(
Overclocking the i7 made no gains, but the G3258 could be very interesting.
 
@Steve

"Overclocking your Intel processor for better frame rates in Deus Ex: Mankind Divided is pointless."

Not according to the testing I've done:

http://steamcommunity.com/app/337000/discussions/1/352792037314627483/

Your testing certainly seems more conclusive ;)

No idea how you didn't see an improvement even with the low quality preset; that is an insane CPU bottleneck.

Use AfterBurner to provide an OSD for the system statistics; it will be easier to log data and possibly more accurate than your Logitech keyboard.

Yeah, I couldn't believe it either. This is the first time I've ever felt the need to overclock this processor. The last time I actually needed to overclock a processor to get playable framerates in a game was back when Crysis came out. The first one.

As for the accuracy of my keyboard resource monitor, you'd be surprised how accurate it is. It's not as detailed as the OSD types out there, but it is accurate and keeps the clutter off my screen. It used to give me per-core usage rates, but that was when I was using a quad-core. With hyperthreading making it appear as if I have 8 cores, there's not enough room for all 8 per-core readouts on the screen, so it just gives me overall CPU usage now.

I actually find the constant keyboard readouts to be an invaluable tool (kind of like a tachometer on a car). Say I start a game for the first time and it hangs for a long time: the CPU usage can give me an idea of what is happening behind the scenes. No increase in CPU usage over a protracted time (but the game is still listed in Task Manager)? The game has probably crashed, and I can feel free to kill the process and try again. Nothing visibly happening, but a definite increase in CPU usage? Something is happening that I just can't see, so I should wait and not jump the gun by killing the process and restarting it. In the latter scenario it turned out my anti-virus was scanning the executable on first run to make sure the program was clean before allowing it to run; the AV didn't give me any popup to advise me that's what was happening, but the increased CPU usage gave me enough information to be patient and give it more time.

The memory indicator lets me see memory populate as it happens, which has proven invaluable as well: seeing movement during a long load on a black screen shows me the game is actually doing something, which makes me more patient while waiting. I've also been able to spot what would eventually turn out to be memory leaks that would crash the game, allowing me to report issues with more information.
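
For anyone without an LCD keyboard, that same "is it hung or still working?" check can be scripted. A rough sketch using Python's psutil (the process name and the thresholds here are just examples I've picked, not anything official):

```python
import psutil  # pip install psutil

def busy_or_hung(process_name="DXMD.exe", samples=10):
    """Poll a process's CPU usage for a few seconds: near-zero readings
    suggest it has hung or crashed, sustained usage means it's still working."""
    matches = [p for p in psutil.process_iter(["name"]) if p.info["name"] == process_name]
    if not matches:
        return "not running"
    proc = matches[0]
    readings = [proc.cpu_percent(interval=1.0) for _ in range(samples)]
    return "probably hung/crashed" if max(readings) < 1.0 else "still working"

print(busy_or_hung())
```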

I've found the constant readouts to be an invaluable tool: not only do they make me more aware of what's happening in the background (things you don't normally see as a regular user), they also keep me in check and make me more patient when things aren't as instantaneous as I'm expecting, and they give me points of reference to more accurately diagnose and troubleshoot when things go wrong.

High CPU usage is not uncommon in games. Most games on my system fluctuate between roughly 30-50% CPU usage in game. Many of the newer, more demanding games lately sit around 35-70%, sometimes spiking (rarely) to 80%. But DXMD is running at roughly 40-90% usage in game (the Prague train station cut scene is particularly rough, maintaining 60-90% CPU usage) and sometimes (infrequently, but often enough) maxing out the processor completely in short spikes, resulting in fairly frequent momentary hangs/pauses. And that's on my Core i7. Some people on lower-end CPUs are reporting higher sustained average CPU loads over the whole game, resulting in worse performance than even I'm getting.
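
For context, a rough way to read those overall percentages on a hyperthreaded chip (this is just arithmetic, not a measurement):

```python
logical_threads = 8      # an i7 with Hyper-Threading exposes 8 logical cores
overall_usage = 0.90     # a 90% overall CPU reading
busy_equivalent = overall_usage * logical_threads
print(f"Roughly {busy_equivalent:.1f} of 8 threads fully loaded")  # ~7.2
```

The same workload on a CPU with fewer threads leaves far less headroom, which would fit the reports from people on lower-end chips.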

I've managed to tweak my settings a little further to smooth out my framerates, and I'm now maintaining a fairly consistent 60fps in game with the occasional momentary hangs/dips as reported earlier, and I've been able to take my overclock back off. But the in-game benchmark situation is still the same. As reported in the thread I linked, I get the exact same performance in that benchmark regardless of settings. Even on lowest, I still get only 34.3fps at stock clocks, and as mentioned in the thread, I'm running two R9 290Xs in CrossFire. The only thing that currently scales that benchmark score up is overclocking my CPU; with a 4.5GHz overclock, I can hit up to 43fps. I'm still in Prague just now, so I won't know how representative that benchmark is until I reach that particular section of the game, in Golem, which is where I think that scene is set. Then I'll have a better idea of where the benchmark fits in.
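
To put a number on that, quick arithmetic on the figures above shows the CPU overclock alone is worth roughly a 25% gain in that benchmark:

```python
stock_fps = 34.3  # in-game benchmark, lowest settings, stock CPU clocks
oc_fps = 43.0     # same benchmark with the CPU overclocked to 4.5 GHz
gain = (oc_fps / stock_fps - 1) * 100
print(f"Benchmark gain from the CPU overclock: {gain:.1f}%")  # ~25%
```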

Hope this helps.

P.S. Wow, a writer that actually responds to and participates in his articles comments sections. I actually didn't expect that. Thank you for the response.
 
Why don't you test overclocking of the G3258? :(
Overclocking the i7 made no gains, but the G3258 could be very interesting.
Because this review is about the game's GPU performance, not CPU performance. That's why 38 GPUs were tested using one CPU.

I think you may want to read the review again. Page 5 in particular.

https://www.techspot.com/review/1235-deus-ex-mankind-divided-benchmarks/page5.html
Wow, how did I miss that?? Please disregard my post; I don't see a way to delete it.
 
I also recommend the AfterBurner plugin for the Logitech LCDs. I use it with a G13 and can customise what is shown and modify the font size to fit more on the screen. I monitor CPU/GPU usage, temps and fan usage as well as memory usage and FPS.
I'm averaging 30-35 FPS in-game; I'm going to fully disable MSAA tonight and see what improvements I get.
I also regularly see over 10GB of system RAM used (out of 16), but only 2GB of my 3GB of GPU RAM (on High settings). I'd like to get the FPS a little higher but it's not affecting my game experience at the moment.
 
Seems like AMD cards are having a much easier time with this game, even in DX11... fishy. I'd understand in DX12, but not in DX11. I expect the gap to widen even further in DX12 between the 2 vendors.
 
Considering my i5-3550 drops into the mid-40s at times in Prague at medium settings at 900p (yes, it's sub-1080p, but I don't have a 1080p screen), I'm not sure how a 3470 is faring this well.
 
Well I suppose "quite a few years" sounds longer than four, but since then and into the near future they will still be the king (I'm pretty sure the Kepler based 780 Ti also edged out the R9 290X for most games, btw). When Vega arrives, we'll see... by that time though, I'd imagine the 1080 Ti will be out too. The bar will be set VERY high.

Hey, if AMD comes out with something as good or better, I'll happily consider looking their way for my next upgrade... unless it runs at 90 degrees C and sucks 450 watts. I'm just saying that right now, AMD fans are chasing the carrot, while the Nv gang is already at the dinner table, having one hell of a meal. So your "bloodbath" comment required a reply. Cheers!

Erhh in my opinion no one had much to celebrate in 2016 besides those willing to pay $1200 for a Titan. Everything else was marginally stronger than the previous gen. Even now the Fury series is ahead of the 1070, and the 1080 is only like 15% stronger than the Fury X. 15% for double the price is pretty pathetic progress if you ask me...
 
Erhh in my opinion no one had much to celebrate in 2016 besides those willing to pay $1200 for a Titan. Everything else was marginally stronger than the previous gen. Even now the Fury series is ahead of the 1070, and the 1080 is only like 15% stronger than the Fury X. 15% for double the price is pretty pathetic progress if you ask me...

2016 was a massively exciting year for GPUs; don't you get WiFi reception under that rock? :D

A few fact checks: the Fury X is not faster than the GTX 1070 and is still inferior to the GTX 980 Ti in terms of performance, just as it was upon release. There are some low-level API titles where the Fury X looks amazing, such as Doom with Vulkan, but they represent a very small percentage of the market right now.

The GTX 1080 is considerably more than 15% stronger than the Fury X, try doubling that figure and then you are getting closer to reality.
 