Battlefield 1 Benchmarked: Graphics & CPU Performance Tested

Steve

Staff member

Battlefield 1 marks the fifteenth installment in EA's multiplayer military shooter franchise and the first to be inspired by the historical events of World War I, opening the door to classic warfare elements we've yet to see, such as trench clubs and a cavalry class.

Similar to previous entries in the series, Battlefield 1 has a fun but brief single-player campaign that serves more as a tutorial for the game's online multiplayer, which should be particularly polished this time around considering a multi-platform open beta was available for weeks ahead of release and drew over 13 million players.

The BF1 beta made a strong first impression with great graphics that weren't overly demanding. After testing 41 graphics cards and 20 processors in our Gears of War 4 benchmark feature, we wanted to do the same for Battlefield 1 and for the most part we succeeded.

Read the complete article.

 
Thanks for your efforts! I wish you had featured your very useful regular CPU overclocking graphs. No other site I know of benchmarks an overclocked and *underclocked* CPU; a very nice metric to have.
 
Thanks for your efforts! I wish you had featured your very useful regular CPU overclocking graphs. No other site I know of benchmarks an overclocked and *underclocked* CPU; a very nice metric to have.

We wanted to include more, but it would have taken another few days to get that testing done due to Origin's five hardware-change limit. I have had to move on to the GTX 1050 testing for now.

Thanks for another comprehensive and thorough analysis, Steven.

Thanks mate!
 
We wanted to include more, but it would have taken another few days to get that testing done due to Origin's five hardware-change limit. I have had to move on to the GTX 1050 testing for now.

So the hardware change limit didn't even allow you to try different frequencies of the same CPU?
 
We wanted to include more, but it would have taken another few days to get that testing done due to Origin's five hardware-change limit. I have had to move on to the GTX 1050 testing for now.

So the hardware change limit didn't even allow you to try different frequencies of the same CPU?
If Origin is dumb enough to see a different CPU frequency as "different hardware", that could cause it, and EA is thick enough to make that mistake. After all, they were thick enough to put a five-machine limit on Origin.
 
Just curious as to why you guys used an i3-6100T, which is 600MHz slower than the non-T version? With the extensive hardware used in these tests, I am perplexed that the only dual-core variant is the one to get kneecapped.

It's almost like there is a continuous agenda to push the i3 as a garbage processor.
 
Just curious as to why you guys used an i3-6100T, which is 600MHz slower than the non-T version? With the extensive hardware used in these tests, I am perplexed that the only dual-core variant is the one to get kneecapped.

It's almost like there is a continuous agenda to push the i3 as a garbage processor.

Sorry, 500MHz slower*
 
Was there ANY reason to use DX12 with this title? I know framerates suffer, but are there better effects or anything? SOMETHING must be causing the framerates to suffer after all...
 
Was there ANY reason to use DX12 with this title? I know framerates suffer, but are there better effects or anything? SOMETHING must be causing the framerates to suffer after all...

DX12 in theory should reduce CPU overhead and allow older/slower CPUs to perform just as well as their high-end counterparts. As far as I know, DX12 doesn't offer much in the way of new graphical features, just a more optimized environment (support for multi-core processing and such). Oddly enough, Digital Foundry found the RX 480 vastly superior to the GTX 1060 and more in line with a GTX 1070 under DX12. I have my 1070 right now and have an RX 480 on the way, so I'll check for myself. I wonder if a bad update is the cause of the differences.
 
Just curious as to why you guys used an i3-6100T, which is 600MHz slower than the non-T version? With the extensive hardware used in these tests, I am perplexed that the only dual-core variant is the one to get kneecapped.

It's almost like there is a continuous agenda to push the i3 as a garbage processor.

No agenda here; we are huge advocates of the Skylake Core i3 range. The graphs are clearly labelled stating that we used the low-voltage model and its operating frequency, so I am not sure how you could draw the conclusion you have.

DX12 in theory should reduce CPU overhead and allow older/slower CPUs to perform just as well as their high-end counterparts. As far as I know, DX12 doesn't offer much in the way of new graphical features, just a more optimized environment (support for multi-core processing and such). Oddly enough, Digital Foundry found the RX 480 vastly superior to the GTX 1060 and more in line with a GTX 1070 under DX12. I have my 1070 right now and have an RX 480 on the way, so I'll check for myself. I wonder if a bad update is the cause of the differences.

Yes, I saw the Digital Foundry video a few days ago now. Oddly I couldn’t replicate their results and virtually no other tech site has at this point, with the exception of Tweak Town as mentioned in the article.

Heaps of gamers are now reporting massive frame dips when using DX12. Gameranx has updated their tweaking article to suggest gamers switch from DX12 to DX11 in order to avoid frame stuttering.
 
DX12 in theory should reduce CPU overhead and allow older/slower CPUs to perform just as well as their high-end counterparts. As far as I know, DX12 doesn't offer much in the way of new graphical features, just a more optimized environment (support for multi-core processing and such). Oddly enough, Digital Foundry found the RX 480 vastly superior to the GTX 1060 and more in line with a GTX 1070 under DX12. I have my 1070 right now and have an RX 480 on the way, so I'll check for myself. I wonder if a bad update is the cause of the differences.

DF may have been using a pre-release version of the game, pre-beta video drivers, who knows. It's clearly an engine optimization problem on DICE's side of things as it affects both AMD and Nvidia. It should get sorted out over time.
 
Yes, I saw the Digital Foundry video a few days ago now. Oddly I couldn’t replicate their results and virtually no other tech site has at this point, with the exception of Tweak Town as mentioned in the article.

Heaps of gamers are now reporting massive frame dips when using DX12. Gameranx has updated their tweaking article by suggesting gamers switch from DX12 to DX11 in order to avoid frame stuttering.
Will there be any update from TechSpot if the issue is confirmed/solved?
 
For sure! Let's hope there is a fix soon.
Thanks Steve - it's frustrating that DF and TweakTown have put out results that cannot be replicated.

I've been trying to decide between the 480 and 1060 to replace my GTX 760 over the past few months and TBH it's almost worth the extra $150 to grab a 1070 and not have to choose.
 
Thanks Steve - it's frustrating that DF and TweakTown have put out results that cannot be replicated.

I've been trying to decide between the 480 and 1060 to replace my GTX 760 over the past few months and TBH it's almost worth the extra $150 to grab a 1070 and not have to choose.

I did a video on this. Now is probably the worst time in a long time to buy a video card. I have a 1070 now and am thinking of replacing it with an RX 480. It is a downgrade, but I can sell the 1070 for $400 and bought the 480 for $200 (8GB model). Why? The 1070 will take a huge hit in January when Vega news is announced. It should drop to $300 or so, and will be replaced next year by Nvidia's Volta GPUs at the $200 mark. The 1070 on up has a lot of value to lose, and unlike years past, new GPUs will be coming out every six months or so: AMD Vega in Q1 2017, Nvidia Volta in Q3 2017, AMD Vega 20 (a die-shrink refresh like Pascal) in Q1-Q2 2018. The high-end cards will be replaced quickly and lose a lot of value fast. My 1070 is already $70 less than when I got it a couple of months ago. I'd only get one now if it were $350 or less, or wait until Vega in two months when they'll all be around $300. If you must buy now, the RX 480 has more RAM and horsepower than the 1060, and as DX12/Vulkan matures the 480 will get stronger and should only end up 10-15% weaker than the 1070. You can get 4GB models for $165 right now (the best short-term value) and 8GB for $205 (that's what I have on the way).

It's your call, but I plan to upgrade again in Jan/Feb depending on the price/performance Vega has vs the 1080 Ti, as I'm going to build a 4K/60+ gaming rig. If neither GPU can do that at a reasonable price I'll wait till Q3 for Volta. To each their own.
 
We wanted to include more, but it would have taken another few days to get that testing done due to Origin's five hardware-change limit. I have had to move on to the GTX 1050 testing for now.

So the hardware change limit didn't even allow you to try different frequencies of the same CPU?

WHY? If you have a 10% OC, add roughly 10% to the numbers. Do you need to be spoon-fed everything?
 
Many review sites do rubbish testing that's very superficial. Without frame time and latency testing, consumers won't really have an accurate result.
 
I did a video on this. Now is probably the worst time in a long time to buy a video card. I have a 1070 now and am thinking of replacing it with an RX 480. It is a downgrade, but I can sell the 1070 for $400 and bought the 480 for $200 (8GB model). Why? The 1070 will take a huge hit in January when Vega news is announced. It should drop to $300 or so, and will be replaced next year by Nvidia's Volta GPUs at the $200 mark. The 1070 on up has a lot of value to lose, and unlike years past, new GPUs will be coming out every six months or so: AMD Vega in Q1 2017, Nvidia Volta in Q3 2017, AMD Vega 20 (a die-shrink refresh like Pascal) in Q1-Q2 2018. The high-end cards will be replaced quickly and lose a lot of value fast. My 1070 is already $70 less than when I got it a couple of months ago. I'd only get one now if it were $350 or less, or wait until Vega in two months when they'll all be around $300. If you must buy now, the RX 480 has more RAM and horsepower than the 1060, and as DX12/Vulkan matures the 480 will get stronger and should only end up 10-15% weaker than the 1070. You can get 4GB models for $165 right now (the best short-term value) and 8GB for $205 (that's what I have on the way).

It's your call, but I plan to upgrade again in Jan/Feb depending on the price/performance Vega has vs the 1080 Ti, as I'm going to build a 4K/60+ gaming rig. If neither GPU can do that at a reasonable price I'll wait till Q3 for Volta. To each their own.
That's the line of thinking I've been going down: that Vega will drop the price of the 1070 IF it's competitive. At some point I want to pull the trigger, though, and put the 760 into a new HTPC, but with Zen coming in Q1 2017 I've put that on hold as well.
 
Many review sites do rubbish testing that's very superficial. Without frame time and latency testing, consumers won't really have an accurate result.

You do realize you can figure frame time in your head from looking at minimum and average frame rates. If you're not the best at math you can use a calculator as well. Simple rules of thumb: 30fps = 33.3ms frame time, 60fps = 16.7ms, 120fps = 8.3ms. Memorizing those, you can figure it out from there.
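For anyone who wants the conversion spelled out, it's just 1000 ms divided by the frame rate. A quick sketch (the function name is just for illustration):

```python
def frame_time_ms(fps: float) -> float:
    """Convert a frame rate (frames per second) to per-frame time in milliseconds."""
    return 1000.0 / fps

# The rules of thumb quoted above:
print(round(frame_time_ms(30), 1))   # 33.3
print(round(frame_time_ms(60), 1))   # 16.7
print(round(frame_time_ms(120), 1))  # 8.3
```

Note this only converts an average frame rate into an average frame time; it can't recover the frame-to-frame variance that dedicated frame-time logging captures.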
 
That's the line of thinking I've been going down: that Vega will drop the price of the 1070 IF it's competitive. At some point I want to pull the trigger, though, and put the 760 into a new HTPC, but with Zen coming in Q1 2017 I've put that on hold as well.

Waiting till CES in January would give you a fuller picture of where the market will be soon. Unfortunately we have very little information other than that Vega 10 will have 4,096 shader cores (same as the Fury X). Assuming it runs at or near 1500MHz, that gives us 12 TFLOPs of performance, which is on par with the Titan X Pascal. AMD's drivers and architecture are improving, and Vega is a new architecture, not just a big Polaris, so there's no way to know how fast it will be in the real world, but Titan-class performance isn't unlikely. Much like the R9 290X, which came out a bit later but crushed the original Titan for only $600; I'm getting the same vibe with Vega. Of course there will be smaller/cheaper variants, but there's zero information on those. Like I said, if you can wait, that would be the best option.
 
We wanted to include more, but it would have taken another few days to get that testing done due to Origin's five hardware-change limit. I have had to move on to the GTX 1050 testing for now.

So the hardware change limit didn't even allow you to try different frequencies of the same CPU?

WHY? If you have a 10% OC, add roughly 10% to the numbers. Do you need to be spoon-fed everything?

Maybe because of results like this?

https://www.techspot.com/review/1263-gears-of-war-4-benchmarks/page4.html

On all 3 CPUs where they tested the game at different frequencies, the average framerates did not scale based on CPU frequencies, & the minimum framerates only "kind of" scaled for the i5-6600k [Yes, I know they labeled it as the "i7-6600K", but that chip doesn't exist], but didn't for the i7-6700K or FX-9590.

Same thing happens in Overwatch (https://www.techspot.com/review/1180-overwatch-benchmarks/page5.html): framerates change when the frequency of the CPU changes, but their scale doesn't match the CPU frequency scale.

So there's no guarantee that a 10% overclock on your CPU will give you 10% more performance. It might give less, it might give more.
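One way to see why the scaling isn't linear: only part of each frame is CPU-bound, so a CPU overclock only speeds up that part, Amdahl's-law style. A rough sketch (the 40% CPU-bound split below is a made-up illustration value, not a measurement):

```python
def scaled_fps(base_fps: float, cpu_fraction: float, cpu_speedup: float) -> float:
    """Estimate the new frame rate when only the CPU-bound fraction
    of each frame's time is sped up (Amdahl's-law-style model)."""
    frame_time = 1.0 / base_fps
    cpu_part = frame_time * cpu_fraction          # time spent CPU-bound
    other_part = frame_time * (1.0 - cpu_fraction)  # GPU/other time, unchanged
    return 1.0 / (cpu_part / cpu_speedup + other_part)

# If only 40% of frame time is CPU-bound, a 10% CPU overclock
# yields well under a 10% frame-rate gain (~3.8% here):
print(scaled_fps(60.0, 0.4, 1.10) / 60.0)
```

The real picture is messier still (caches, memory bandwidth, and engine threading don't scale cleanly with clock speed), which is why measured results like the ones linked above can land either side of this simple estimate.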
 