Batman: Arkham Knight Benchmarked, Performance Review

Steve



Arkham Knight is based on the same Unreal Engine 3 as its predecessor, but you can expect noticeably improved graphics, in part because the PS4 and Xbox One have replaced the PS3 and 360 as the lowest common denominator.

We had high expectations for the game's graphics leading up to launch, ever since Rocksteady Studios shared its recommended system specs. Being an Nvidia-sponsored title, the spec list failed to mention any AMD GPUs, but we were surprised to see the GeForce GTX 980 recommended for maximum quality graphics, with no less than a GTX 760 suggested for standard play.

Favoring one GPU camp is a contentious way to launch a new game, though it seems this release would have rattled cages regardless. Countless PC gamers have expressed anger over Arkham Knight's performance, with reports of constant stuttering ruining gameplay. The issue affects both Nvidia and AMD users, though AMD owners seem to be having the most trouble.

Read the complete review.

 
This is terrible. It's a good thing I'm not stupid enough to preorder anything. If I had preordered the game I would've been foaming at the mouth. I'll wait for a sale, then check whether things have improved.
 
Good review. My conclusion, as the article itself suggests: any game Nvidia touches ends up broken. It's incredible they haven't said a word, even though just days ago they were making a lot of noise about how this game uses their exclusive crappy technology.
 
"The AMD FX-9590 looks plenty strong at 4.5GHz" But it still can't match Intel at 2.5GHz, amazing, that is all.

I'll test the PhysX implementation on my GTX 970/580 setup tonight. I still haven't launched the game due to the complaints about optimization. Maybe I'll be better off with the dedicated PhysX GPU?
 
Everyone is complaining about Gameworks, yet every single game on PC I've ever played with AMD trashware behind it needed some tweaking of INIs or game files to run without microstutter or with proper mouse acceleration. So shame on both companies for pulling this crap, and shame on the devs who pick a camp and crap on the other one.
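To give a concrete idea of the kind of INI edit being described, here is a minimal sketch built around the widely reported Arkham Knight framerate-cap tweak. The file path and the MaxFPS key are community-reported rather than official, so treat both as assumptions and keep a backup:

```python
# Sketch of a typical config-file tweak. BmSystemSettings.ini and the
# MaxFPS key follow community reports for Arkham Knight; both are
# assumptions, so we back up the file before touching it.
import re
import shutil
from pathlib import Path

ini = Path("BMGame/Config/BmSystemSettings.ini")  # assumed, relative to the install folder
shutil.copy(ini, ini.with_suffix(".bak"))         # keep an untouched copy

text = ini.read_text()
text, hits = re.subn(r"(?m)^MaxFPS=.*$", "MaxFPS=120", text)  # lift the 30fps cap

if hits:
    ini.write_text(text)
    print(f"Rewrote {hits} MaxFPS line(s) to 120")
else:
    print("No MaxFPS key found; the file layout may differ")
```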
 
Almost every game that has shipped with Gameworks has had issues. You take a company that is pressed for time and resources, then make it waste time coding proprietary eye-candy crap because some graphics card company threw a bit of money at its parent company, and this is what you get. Was it worth slapping all that NV logo crap in the game so you could have pretty rain? The money they got up front would be piddly compared to the release sales, and the ongoing sales they COULD have had.
 
Will you guys re-benchmark once Rocksteady re-releases the game? With all visual effects where needed, patches applied, and better performance (maybe better visuals?)?
 
Wait, is that guy in the picture above wearing a Confederate flag? When is Rocksteady coming out with an update to remove it?
 
Good review. My conclusion, as the article itself suggests: any game Nvidia touches ends up broken. It's incredible they haven't said a word, even though just days ago they were making a lot of noise about how this game uses their exclusive crappy technology.
Since I don't own a console, let me say the same about console exclusives. You are calling it crappy not for what it is, but for how it was released.
 
I think the CPU benchmark results suggest the game is GPU-limited at 1080p. That's why the i3 can match the i5 and i7 so easily. To exaggerate the differences in CPU performance, you have to benchmark at a lower (but unrealistic) resolution. Perhaps try 720p?
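That reasoning boils down to a simple check. A rough sketch with invented sample numbers:

```python
# If fps barely moves when the resolution (pixel load) drops, the CPU
# was the limit; if fps scales up, the GPU was. Sample numbers are made up.
def likely_bottleneck(fps_1080p: float, fps_720p: float, tol: float = 0.10) -> str:
    if fps_720p <= fps_1080p * (1 + tol):
        return "CPU-bound: fps did not scale when the pixel load dropped"
    return "GPU-bound: fps rose once the pixel load dropped"

print(likely_bottleneck(fps_1080p=62, fps_720p=64))  # barely moved
print(likely_bottleneck(fps_1080p=62, fps_720p=95))  # scaled up
```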
 
Ah, the Gameworks debate again.

As PC gamers we are always tinkering with image quality settings to find the right balance that gives us a smooth and/or playable framerate. Do we turn anti-aliasing all the way up, or do we need to turn it off completely? Can we enable HBAO+, or do we have to settle for SSAO? Do we move the draw distance slider all the way up, or just halfway? Post processing on or off? We use guides and even software to help us optimize our settings, but when we see Gameworks in the options we rage like The Incredible Hulk?! WTF???

What were the reactions to 4K? Did we write it off completely, or wait for GPUs that could push the required number of pixels to make it enjoyable? Gameworks has been rocking the headlines for the past few months, starting with Far Cry 4 I believe, and people are already calling for its head?! Did the Crytek guys gimp Crysis 2 because the original Crysis was melting too many GPUs? No, because the next generation of GPUs got better, and those are the ones we upgraded to in order to play it. It took me TWO GPU upgrades AFTER the original Crysis before I could even play it on High. So how the freaking heck are we complaining about a ~20fps performance hit from a couple of OPTIONAL Gameworks features while the game is still playable?
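To put a flat fps hit in frame-time terms (the numbers below are illustrative, not measured):

```python
# The same "20fps hit" costs far more of the frame budget at low fps
# than at high fps, which is why the starting point matters.
def added_frame_ms(fps_before: float, fps_after: float) -> float:
    return 1000.0 / fps_after - 1000.0 / fps_before

print(f"60 -> 40 fps:   +{added_frame_ms(60, 40):.1f} ms per frame")    # +8.3 ms
print(f"120 -> 100 fps: +{added_frame_ms(120, 100):.1f} ms per frame")  # +1.7 ms
```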

I'm going to come right out and say it: people are jealous of nVIDIA when it comes to Gameworks. Haters. There is NO other sensible reason you would be hanging Gameworks out to dry. Have you seen the demo showing the Gameworks effects?! OMG! Or the fur in Far Cry 4 and The Witcher 3?! The fur does look a little weird, but it's miles ahead of what that dog looks like in Fallout 4. I could understand the hate if Gameworks had been out for years without getting any better, but all this hate started within the last six months. Heck, probably even before it was released, just because nVIDIA's name was attached to it.

I have an overclocked GTX 970, and even GFE turned Gameworks off when it optimized The Witcher 3. Hairworks is an option, not a requirement to play the game, and I can 100% understand that it [currently] requires an insane amount of tessellation, which was most likely a big reason tessellation was amped up in the Maxwell architecture: 3x the tessellation performance over past gens, according to nVIDIA. Pascal will do even better. It's just that obvious. And if Gameworks features are causing your frame rate to tank, don't enable them!!!
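The tessellation point has a simple scaling intuition behind it. A rough sketch (the base patch count is made up, and real pipelines are more nuanced than this):

```python
# Uniformly tessellating a triangle patch with factor N yields roughly
# N^2 sub-triangles, so high Hairworks factors explode the workload.
base_triangles = 10_000  # hypothetical hair patch count
for factor in (1, 4, 8, 16, 64):
    print(f"tessellation factor {factor:>2}: ~{base_triangles * factor ** 2:,} triangles")
```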

Sometimes the software is ahead of the hardware...
 
Ah, the Gameworks debate again.

As PC gamers we are always tinkering with image quality settings to find the right balance that gives us a smooth and/or playable framerate. Do we turn anti-aliasing all the way up, or do we need to turn it off completely? Can we enable HBAO+, or do we have to settle for SSAO? Do we move the draw distance slider all the way up, or just halfway? Post processing on or off? We use guides and even software to help us optimize our settings, but when we see Gameworks in the options we rage like The Incredible Hulk?! WTF???

What were the reactions to 4K? Did we write it off completely, or wait for GPUs that could push the required number of pixels to make it enjoyable? Gameworks has been rocking the headlines for the past few months, starting with Far Cry 4 I believe, and people are already calling for its head?! Did the Crytek guys gimp Crysis 2 because the original Crysis was melting too many GPUs? No, because the next generation of GPUs got better, and those are the ones we upgraded to in order to play it. It took me TWO GPU upgrades AFTER the original Crysis before I could even play it on High. So how the freaking heck are we complaining about a ~20fps performance hit from a couple of OPTIONAL Gameworks features while the game is still playable?

I'm going to come right out and say it: people are jealous of nVIDIA when it comes to Gameworks. Haters. There is NO other sensible reason you would be hanging Gameworks out to dry. Have you seen the demo showing the Gameworks effects?! OMG! Or the fur in Far Cry 4 and The Witcher 3?! The fur does look a little weird, but it's miles ahead of what that dog looks like in Fallout 4. I could understand the hate if Gameworks had been out for years without getting any better, but all this hate started within the last six months. Heck, probably even before it was released, just because nVIDIA's name was attached to it.

I have an overclocked GTX 970, and even GFE turned Gameworks off when it optimized The Witcher 3. Hairworks is an option, not a requirement to play the game, and I can 100% understand that it [currently] requires an insane amount of tessellation, which was most likely a big reason tessellation was amped up in the Maxwell architecture: 3x the tessellation performance over past gens, according to nVIDIA. Pascal will do even better. It's just that obvious. And if Gameworks features are causing your frame rate to tank, don't enable them!!!

Sometimes the software is ahead of the hardware...

This ad brought to you by Nvidia. Please drink responsibly.
 
So I read that "the Batmobile takes up an entire DVD disc in size". If that's true, it might explain why the car tanks every video card in existence.
 
This is the first responsible, unbiased article I have read so far. It states the facts, not all the trolling the rest of the internet has been spewing. Thank YOU!
 
Ah, the Gameworks debate again.

As PC gamers we are always tinkering with image quality settings to find the right balance that gives us a smooth and/or playable framerate. Do we turn anti-aliasing all the way up, or do we need to turn it off completely? Can we enable HBAO+, or do we have to settle for SSAO? Do we move the draw distance slider all the way up, or just halfway? Post processing on or off? We use guides and even software to help us optimize our settings, but when we see Gameworks in the options we rage like The Incredible Hulk?! WTF???

What were the reactions to 4K? Did we write it off completely, or wait for GPUs that could push the required number of pixels to make it enjoyable? Gameworks has been rocking the headlines for the past few months, starting with Far Cry 4 I believe, and people are already calling for its head?! Did the Crytek guys gimp Crysis 2 because the original Crysis was melting too many GPUs? No, because the next generation of GPUs got better, and those are the ones we upgraded to in order to play it. It took me TWO GPU upgrades AFTER the original Crysis before I could even play it on High. So how the freaking heck are we complaining about a ~20fps performance hit from a couple of OPTIONAL Gameworks features while the game is still playable?

I'm going to come right out and say it: people are jealous of nVIDIA when it comes to Gameworks. Haters. There is NO other sensible reason you would be hanging Gameworks out to dry. Have you seen the demo showing the Gameworks effects?! OMG! Or the fur in Far Cry 4 and The Witcher 3?! The fur does look a little weird, but it's miles ahead of what that dog looks like in Fallout 4. I could understand the hate if Gameworks had been out for years without getting any better, but all this hate started within the last six months. Heck, probably even before it was released, just because nVIDIA's name was attached to it.

I have an overclocked GTX 970, and even GFE turned Gameworks off when it optimized The Witcher 3. Hairworks is an option, not a requirement to play the game, and I can 100% understand that it [currently] requires an insane amount of tessellation, which was most likely a big reason tessellation was amped up in the Maxwell architecture: 3x the tessellation performance over past gens, according to nVIDIA. Pascal will do even better. It's just that obvious. And if Gameworks features are causing your frame rate to tank, don't enable them!!!

Sometimes the software is ahead of the hardware...

If GameWorks features were something you could just turn off completely and get on with it, there wouldn't be a problem. GameWorks is like a cancer in most games that feature it; you can't get rid of its performance-robbing properties entirely.

I love how you brought up Crysis. You know what Nvidia did there, right? It was shameful.

Not sure you are on to anything there. I am certainly not jealous of Nvidia, but I am not a fan of GameWorks features.

If Nvidia's primary goal was to make games look better, be more fun to play, and of course play better, then GameWorks would be amazing. Instead, the goal of GameWorks seems to be to introduce proprietary features that only Nvidia can optimize for, and those features are designed to be as taxing as possible.

Let me be clear: I have nothing at all against Nvidia. They make amazing GPUs, as does AMD. BUT I am not a fan of GameWorks, or at least of how it has been used in recent titles.
 
If GameWorks features were something you could just turn off completely and get on with it, there wouldn't be a problem. GameWorks is like a cancer in most games that feature it; you can't get rid of its performance-robbing properties entirely.

I pray you're not talking about Crysis 2.

You can't turn Gameworks features off? That's funny, because you seem to have been successful in doing so in your Arkham Knight performance review. A review I was surprised to see in light of the recall, I might add. It seems you have a vendetta, because every Gameworks performance review has a dedicated section for your rants.

If Nvidia's primary goal was to make games look better, be more fun to play, and of course play better, then GameWorks would be amazing. Instead, the goal of GameWorks seems to be to introduce proprietary features that only Nvidia can optimize for, and those features are designed to be as taxing as possible.

Ah, that's what I was waiting for. So you just called Gameworks a cancer, and then you do a 180 and want it optimized for AMD hardware?! Maybe, just maybe, AMD should come up with its own solutions, like they did with Raptr, FreeSync, and closed betas like Mantle that locked everyone else out entirely. If nVIDIA were to spend their money just to give it away to AMD, what would be the point of buying nVIDIA cards when AMD cards are cheaper? What would drive nVIDIA to innovate if they just gave it away? Should Mercedes give BMW access to their AMG engines? Maybe energy drink companies should share their recipes with Coca-Cola because soft drink sales are down.

Fact: AMD does best with source code access and open standards. nVIDIA can do the majority of its optimizations via drivers, and some devs prefer using libraries because that means less work for them. AMD is at a disadvantage, but that has nothing to do with nVIDIA; it's their own [lack of] resources. Asking nVIDIA to "go easy" on AMD is foolish talk. If the playing field were even, heck yeah, gamers would benefit, but if nVIDIA gives everything away, again, what would be the point of nVIDIA continuing to put money into R&D to improve visuals? And if nVIDIA doesn't do it, who will? AMD? Ha! AMD doesn't have the resources, money, time, or R&D to do so. We're talking about the same AMD that still doesn't have Crossfire support for games that have been out for months. They still don't have Crossfire support for FreeSync, which they said was coming in APRIL, and that's really sad considering FreeSync's high minimum refresh. The same AMD that has one WHQL driver in 2015 versus eight from nVIDIA.

nVIDIA owning 76% of the dGPU market didn't just happen by accident, so they are doing something right. Something AMD could learn from, rather than looking for free rides simply because they are having a tough time staying out of the red. It seems the word competition is thrown around so much that people have forgotten its meaning. Next up is the word monopoly, and fear-mongers claiming cards will rise to $1000 overnight as a result of a deceased AMD. So show me on the doll where nVIDIA touched you. It's okay, no one else is around.

I guess while I'm here I should strongly suggest that if you're going to continue to list the GTX 970 as having two memory pools (you're the only site I know of that does this - shocker!), then maybe you should include memory usage in your performance reviews. Just a thought. :)
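For what it's worth, logging VRAM use during a benchmark pass is straightforward, and on a GTX 970 it would show whether usage crosses the ~3.5GB boundary of the faster pool. A sketch that polls nvidia-smi (standard query flags; the parsing assumes a single GPU and nvidia-smi's default CSV output):

```python
# Sample GPU memory use once per second for ten seconds via nvidia-smi.
import subprocess
import time

for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.split(", "))
    print(f"{used_mib} MiB used of {total_mib} MiB")
    time.sleep(1)
```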
 
@hahahanoobs
Yeah, sure, Nvidia is doing a good job gaining market share by making good graphics cards at good prices, but they are doing almost nothing to make PC gaming any better. In fact, recent Gameworks games have been a problem even for their own GPUs. I really hope they get rid of their Gameworks stupidity sooner rather than later, but seeing how Nvidia fans die in the flames defending it, that's not exactly comforting.
 