Gears of War 4 Benchmarked: Graphics & CPU Performance Tested

Steve


Besides being one of the year's biggest blockbusters, Gears of War 4, which arrived a few days ago, is especially exciting for a number of reasons: Windows/Xbox cross-platform multiplayer, the exclusion of older DirectX APIs, the use of Unreal Engine 4 instead of UE3, and the simple fact that this is the first GoW title developed outside of Epic Games.

Gears of War 4 is DX12-only, meaning the game has been built from the ground up to leverage this low-level API on both the PC and Xbox versions, and considering developer 'The Coalition' is a subsidiary of Microsoft, it should come as no surprise that they worked together on incorporating the technology, nor that Epic Games assisted in allowing Gears of War 4 to take full advantage of UE4's DX12 support.

To find out how the game performs, we've thrown not 20 or even 30 graphics cards at this title, but 40 -- 41 to be exact. As the cherry on top, we've also tested a number of Intel and AMD chips to see the impact processors have on performance.

Read the complete article.

 
I don't like how the game adjusts settings based on your VRAM. That should be optional. Imagine if this kind of thing happened, but AMD cards looked worse than Nvidia ones.
 
Well, it looks like I can still skip a video card upgrade with this one.
Please also benchmark Forza Horizon 3 on PC!

I couldn't find a proper benchmark for it anywhere on the internet!
 
Does anybody know why AMD's first-generation GCN GPUs are behind Nvidia's Kepler GPUs (7950 < 760, 7970 GHz < 680) in this new DX12 title? Weren't AMD GPUs supposed to be more future-proof compared to Nvidia's?
 
Just want to check, because I think this setting might be why you're getting drastically different textures based on the VRAM each card has, and I'd like to clarify for all reading that it's something you can disable. Of course, if I'm wrong and you made sure this was disabled, just ignore me :D

 
More evidence to never ever buy a GTX 1060 3GB...

Would have been interesting to test the image quality of the 4GB Fury X against a 6GB and/or 8GB card from both AMD and Nvidia, to see if there's any difference at any of the resolutions and determine how much of a problem the 4GB is.

I also would have liked to see an image quality comparison between AMD and Nvidia when both have sufficient memory.

Still it must be said... One of the best articles yet. Good job.
 
Does anybody know why AMD's first-generation GCN GPUs are behind Nvidia's Kepler GPUs (7950 < 760, 7970 GHz < 680) in this new DX12 title? Weren't AMD GPUs supposed to be more future-proof compared to Nvidia's?
Because all Unreal Engine versions have generally been favorable to Nvidia hardware.
 
It's actually quite simple as to why the i7 is outperforming the i5: the game was optimized to take advantage of Hyper-Threading and more cores. Slowly, more and more games are doing it; the new Deus Ex would be another example. People used to joke about getting an i7 over an i5, but that's slowly changing...

http://www.dsogaming.com/pc-performance-analyses/gears-of-war-4-pc-performance-analysis/

Gears of War 4 scaled incredibly well on our hexa-core CPU, and its in-game benchmark tool used all six of our CPU cores to their fullest. We do have to note that the benchmark scene stresses the CPU more than the GPU, which is why we used it only to find out how the game scales across multiple CPU configurations.

As always, we simulated a dual-core and a quad-core system. Our simulated dual-core was able to run the in-game benchmark (at 1080p on Ultra settings) with an average of 39fps; however, we experienced a lot of stuttering that made the game unplayable. Our simulated quad-core, on the other hand, was able to push an average of 85fps, and our hexa-core ran it at an average of 115fps.

Things got really interesting when we enabled Hyper Threading. With Hyper Threading enabled, our simulated dual-core system was able to run the benchmark with an average of 61fps, and with minimal stuttering. Our simulated quad-core system pushed an average of 103fps, while our hexa-core system did not see any performance improvements.
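
For anyone wanting to reproduce that kind of test: "simulating" a dual- or quad-core system is usually done by restricting the game's CPU affinity. A minimal sketch, assuming psutil is installed; the process name here is a placeholder, not necessarily the game's actual executable name:

import psutil

def limit_to_cores(process_name, core_ids):
    """Pin every process with the given name to the listed logical CPUs."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(core_ids)

# On a Hyper-Threaded chip, logical CPUs 0 and 1 usually share one physical
# core, so skip every other index to simulate cores without HT:
limit_to_cores("GearGame.exe", [0, 2])        # simulated dual-core, no HT
limit_to_cores("GearGame.exe", [0, 1, 2, 3])  # simulated dual-core with HT
limit_to_cores("GearGame.exe", [0, 2, 4, 6])  # simulated quad-core, no HT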
 
Does anybody know why AMD's first-generation GCN GPUs are behind Nvidia's Kepler GPUs (7950 < 760, 7970 GHz < 680) in this new DX12 title? Weren't AMD GPUs supposed to be more future-proof compared to Nvidia's?

Nvidia worked with the devs the whole time on this game, so it's optimized for their hardware.
 
It just astounds me that an Intel i3 can keep up with and sometimes surpass AMD's high-end FX-8350 CPU. It must smack them in the face every time they see numbers like these. They'd better hope Zen is everything we think it will be.
 
There must be a mistake on the graphs... the AMD cards are future-proof and should be well ahead of their Nvidia counterparts... I'm looking forward to reading some of our AMD fanboys' rationales for these results... I'm assuming it's because Nvidia has bribed the developers to make their cards look better...
 
It's actually quite simple as to why the i7 is outperforming the i5: the game was optimized to take advantage of Hyper-Threading and more cores. Slowly, more and more games are doing it; the new Deus Ex would be another example. People used to joke about getting an i7 over an i5, but that's slowly changing...
Games have been taking advantage of Hyper-Threading for 5+ years.
i7s have been the king CPUs for gaming forever, minus a few rare exceptions.
i7s have always been at the top of 99% of gaming charts, and that will not change.

i5s are a great compromise between price and performance.
HT is also why many i3s match i5s.
 
There must be a mistake on the graphs... the AMD cards are future-proof and should be well ahead of their Nvidia counterparts... I'm looking forward to reading some of our AMD fanboys' rationales for these results... I'm assuming it's because Nvidia has bribed the developers to make their cards look better...

Well, actually, Nvidia did work closely with the devs on this one, but the results are pretty consistent with what we have been seeing from DX12 gains, so there's no "foul play" of the sort you seem to be trumping up.

It's funny that you openly call for trolls to come; you really don't hide how biased you are.

If anything, the most disappointing card in this test was the 980 Ti; it should be performing closer to the 1070.
 
I got the game for free but haven't played it yet because I'm using my i7-3770's iGPU. I just purchased a PNY GTX 1060 (VCGGTX10606XGPB-OC) and it looks like I'll be able to enjoy this game on Ultra. Can't wait :D
 
I got the game for free but haven't played it yet because I'm using my i7-3770's iGPU. I just purchased a PNY GTX 1060 (VCGGTX10606XGPB-OC) and it looks like I'll be able to enjoy this game on Ultra. Can't wait :D

That's a great card; you should have an awesome experience playing. The 3770 is the perfect CPU for the job as well.
 
It's actually quite simple as to why the i7 is outperforming the i5: the game was optimized to take advantage of Hyper-Threading and more cores. Slowly, more and more games are doing it; the new Deus Ex would be another example. People used to joke about getting an i7 over an i5, but that's slowly changing...
Games have been taking advantage of Hyper-Threading for 5+ years.
i7s have been the king CPUs for gaming forever, minus a few rare exceptions.
i7s have always been at the top of 99% of gaming charts, and that will not change.

i5s are a great compromise between price and performance.
HT is also why many i3s match i5s.

That isn't true; other than 3-5 modern games, there hasn't been a real difference between an i5 and an i7. Most of the time the i7 actually performed worse, because 90% of the games currently out don't support HT and perform worse with it on.

Clock for clock, i5 = i7. If the game actually supports HT then the i7 is king, but that's rare.
 
I don't like how the game adjusts settings based on your VRAM. That should be optional. Imagine if this kind of thing happened, but AMD cards looked worse than Nvidia ones.

I think there's an option you can disable (it's enabled by default). It prioritizes the use of available resources according to its own criteria; I don't know what the option is called in English.
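
Since nobody seems to know the exact name, here's a hypothetical sketch of the kind of VRAM-based auto-tuning being described. The tier names, thresholds, and override parameter are all made up for illustration; they're not pulled from the game:

def pick_texture_tier(vram_mb, user_override=None):
    """Cap texture quality by available VRAM unless the user overrides it."""
    if user_override is not None:  # the disable toggle mentioned above
        return user_override
    if vram_mb >= 6144:
        return "ultra"
    if vram_mb >= 4096:
        return "high"
    if vram_mb >= 3072:
        return "medium"
    return "low"

print(pick_texture_tier(3072))           # GTX 1060 3GB -> "medium"
print(pick_texture_tier(3072, "ultra"))  # override wins; any stutter is on you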
 
It just astounds me that an Intel i3 can keep up with and sometimes surpass AMD's high-end FX-8350 CPU. It must smack them in the face every time they see numbers like these. They'd better hope Zen is everything we think it will be.

Probably not really surprising to AMD, since the two chips are priced the same. You're also seeing the most performance you're going to get out of the i3, whereas the FX owner can gain some more if they want.
 
That isn't true; other than 3-5 modern games
I know of about 12 or more.

there hasn't been a real difference between an i5 and an i7. Most of the time the i7 actually performed worse, because 90% of the games currently out don't support HT and perform worse with it on.

i7s are always the best gaming CPU if you have the money, and HT being on rarely affects anything. I saw maybe 2 or 3 games where it cost a few FPS, maybe. You show me 2-3 games where an i5 sneaks ahead by a few FPS, and I'll show you 4-5 where HT makes a big difference.
Performing worse is extremely rare; an i7 is the best-performing gaming CPU you can buy 99% of the time, and it's been like this forever. They have more muscle and cache.
They also handle background tasks more effectively during gaming, and I believe they push multi-GPU setups better, although I'm not sure about that.
i5s are a better purchase for the money, but if you want the best gaming CPU you get an i7, especially now.

I have charts like this from 10+ games going back to 2010.
Skyrim, Tribes, and others.

 
That isn't true; other than 3-5 modern games
I know of about 12 or more.

there hasn't been a real difference between an i5 and an i7. Most of the time the i7 actually performed worse, because 90% of the games currently out don't support HT and perform worse with it on.

i7s are always the best gaming CPU if you have the money, and HT being on rarely affects anything. I saw maybe 2 or 3 games where it cost a few FPS, maybe. You show me 2-3 games where an i5 sneaks ahead by a few FPS, and I'll show you 4-5 where HT makes a big difference.
Performing worse is extremely rare; an i7 is the best-performing gaming CPU you can buy 99% of the time, and it's been like this forever. They have more muscle and cache.
They also handle background tasks more effectively during gaming, and I believe they push multi-GPU setups better, although I'm not sure about that.
i5s are a better purchase for the money, but if you want the best gaming CPU you get an i7, especially now.

I have charts like this from 10+ games going back to 2010.
Skyrim, Tribes, and others.


The issue isn't so much HT, but the availability of more cores. The i3 is the best example: as a two-core CPU it's downright weak, but thanks to HT it's able to beat the FX-8350.

DX12 allows more fine-grained control of the GPU pipeline via more threads, and having more cores reduces the chance of any one particular core becoming a bottleneck, resulting in more consistent FPS (higher minimums). I expect this trend will continue going forward.

The reason the FX series still does so poorly in comparison is that its individual CPU cores are so weak; even with more threads being balanced across the cores, at least one core still ends up completely saturated, resulting in a significant performance loss. Throw in relatively weak IPC, and that sums up AMD's current CPU lineup.
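
Here's a toy model of that last point, with made-up numbers: frame time is gated by the single busiest core, so a pile of weak cores can still lose to fewer strong ones:

def avg_fps(per_thread_work_ms, core_speed):
    """Frame rate when the busiest core sets the frame time."""
    frame_ms = max(work / core_speed for work in per_thread_work_ms)
    return 1000.0 / frame_ms

# Uneven per-thread work in one frame, in milliseconds:
work = [9, 5, 3, 2]

print(avg_fps(work, core_speed=1.0))      # four fast cores: ~111 fps
print(avg_fps(work + [0, 0, 0, 0], 0.6))  # eight slow cores, same threads:
                                          # ~67 fps; the 9ms thread still
                                          # saturates one weak core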
 
God, I am so tired of these BS Xbox games implementing failbox auto-tuning settings. Program a real game, why don't you guys?!

Also, the AMD CPU performance is quite odd considering the Xbox One runs on an 8-core AMD chip.
 
Well, it looks like I can still skip a video card upgrade with this one.
Please also benchmark Forza Horizon 3 on PC!

I couldn't find a proper benchmark for it anywhere on the internet!

That's because the game still doesn't work well. Apparently the problem is that you get lower framerates the faster your storage device is. So someone with a PCIe SSD is screwed, but if you have an HDD it runs fine.
 
It's actually quite simple as to why the i7 is outperforming the i5: the game was optimized to take advantage of Hyper-Threading and more cores. Slowly, more and more games are doing it; the new Deus Ex would be another example. People used to joke about getting an i7 over an i5, but that's slowly changing...
Games have been taking advantage of Hyper-Threading for 5+ years.
i7s have been the king CPUs for gaming forever, minus a few rare exceptions.
i7s have always been at the top of 99% of gaming charts, and that will not change.

i5s are a great compromise between price and performance.
HT is also why many i3s match i5s.

That isn't true; other than 3-5 modern games, there hasn't been a real difference between an i5 and an i7. Most of the time the i7 actually performed worse, because 90% of the games currently out don't support HT and perform worse with it on.

Clock for clock, i5 = i7. If the game actually supports HT then the i7 is king, but that's rare.

All games support Hyper-Threading, lol. The i7 just doesn't start utilizing it unless it needs more than four cores' worth of performance.
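
If you want to see how much HT headroom a chip actually exposes, comparing physical and logical core counts is enough (assuming psutil again):

import psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical threads")
# An i7 of this era reports 4 physical / 8 logical; those extra four threads
# only pay off once a game has more than four cores' worth of work.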
 