Star Wars Battlefront Benchmarked, Performance Review

Julio Franco

Staff member

It's been a big year for PC gaming, with plenty of blockbuster releases including GTA V, The Witcher 3, Fallout 4 and Project CARS, but one more game has been on everyone's radar since EA unveiled its first teaser videos in 2013.

Star Wars Battlefront has been met with tremendous anticipation. Its 5-day open beta attracted a staggering 9.5 million players across all platforms, who were willing to download over 8GB of game content, making it EA’s largest beta test ever.

But the purpose of this article is not to explore or judge Battlefront's gameplay -- which has received mixed reactions -- it's the game's amazing visuals that got us here in the first place.

Read the complete article.

 
"Still a bit of a head scratcher when you consider that the Core i7-6700K is able to max out the GTX 980 Ti at just 2.5GHz, while the FX-9590 requires its full 4.8GHz turbo boost clock speed to achieve the same thing."

Not that complicated really.

First off, remember that CMT carries a significant performance penalty of about 20%. As a result, AMD's 8-core CPU can really only deliver the performance of about 6.4 cores, so you simply don't get the scaling you'd expect out of AMD's processors.

Secondly, AMD has the problem where one heavy workload thread risks overloading a single CPU core, which acts as a giant bottleneck on the entire processor. As a result, raising per-core performance (clock * IPC) can lead to almost linear performance gains, because it relieves that bottleneck.
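That bottleneck argument can be sketched with a toy model; the workload numbers below are invented purely for illustration:

```python
# Toy model of the single-heavy-thread bottleneck: with each thread
# pinned to its own core, frame time is set by the slowest thread.
# Workload values are made up for illustration.

def frame_time(thread_work, clock_ghz, ipc=1.0):
    per_core_speed = clock_ghz * ipc  # work units per second per core
    return max(w / per_core_speed for w in thread_work)

# One heavy render thread plus several light worker threads.
workload = [10.0, 2.0, 2.0, 1.5, 1.0]

base = frame_time(workload, clock_ghz=4.0)
fast = frame_time(workload, clock_ghz=4.8)  # 20% higher per-core speed

# The heavy thread dominates, so a 20% per-core speedup yields
# almost exactly 20% more frames per second.
print(round(base / fast, 3))  # -> 1.2
```

Because the light threads never set the pace, total core count barely matters here; only per-core speed does.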

Now throw Intel's HTT into the mix. Unlike AMD's CMT, it can handle those light workload threads without incurring a 20% performance hit on the main core, allowing the four physical cores to work on their heavy threads for longer without being interrupted.

Now consider Intel's 50-60% edge in IPC.

Is it any shock that Intel @ 2.5GHz > AMD @ 4.8GHz? AMD's BD architecture was always a poor design for desktops, and anyone who understood how hardware and software have to work together understood this.
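Plugging the thread's own rough figures into a quick back-of-envelope calculation lands close to the clocks in the article. The 1.55x IPC ratio and the ~1.25x HTT benefit are assumed round numbers, not measurements:

```python
# Back-of-envelope check of the argument above. The IPC ratio and the
# HTT factor are assumptions based on the rough figures quoted in this
# thread, not measured values.

intel_clock = 2.5   # GHz: the i7-6700K underclock that still maxed the GTX 980 Ti
ipc_ratio   = 1.55  # assumed Intel per-clock advantage (the "50-60%" claim)
htt_gain    = 1.25  # assumed benefit of HTT absorbing light threads

# Clock AMD would need to match Intel's effective per-core throughput.
required_amd_clock = intel_clock * ipc_ratio * htt_gain
print(round(required_amd_clock, 2))  # -> 4.84, close to the FX-9590's 4.8GHz boost

# CMT scaling: 8 cores at ~80% efficiency behave like ~6.4 full cores.
effective_cores = 8 * 0.8
print(effective_cores)  # -> 6.4
```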
 
Still, it would be great if they did some multiplayer testing. I've done some testing myself, and despite no two matches being the "same", the results were still consistent. That's the testing that would actually be helpful: the game modes we're actually going to play, not some training mission.
 
Is it any shock that Intel @ 2.5GHz > AMD @ 4.8GHz? AMD's BD architecture was always a poor design for desktops, and anyone who understood how hardware and software have to work together understood this.

Except we very rarely, if ever, see those kinds of margins when testing new games.

FYI, we understand it isn't complicated at all; we looked at the architecture in detail years ago, and what you said is common knowledge. As I said, we don't often see margins anything like that, so it was surprising.

Still, it would be great if they did some multiplayer testing. I've done some testing myself, and despite no two matches being the "same", the results were still consistent. That's the testing that would actually be helpful: the game modes we're actually going to play, not some training mission.

It can't be consistent, which is why no professional tech sites do it. Sorry to have completely wasted your time with something unhelpful. I am sure the GPU performance margins you see here are not transferable to multiplayer gaming.
 
So I need to upgrade my dinosaur GTX 660 Ti to a GTX 960 to get smooth 1080p frame rates now, eh? Is there a mysterious wealthy benefactor out there who can shower us peasants with money to afford such upgrades?
 
So I need to upgrade my dinosaur GTX 660 Ti to a GTX 960 to get smooth 1080p frame rates now, eh? Is there a mysterious wealthy benefactor out there who can shower us peasants with money to afford such upgrades?

Fun fact: if you had set aside $1.50 each week since buying the GTX 660 Ti, you could buy a GTX 960 4GB card today and still have enough money left over for a nice meal. Still, it might pay off to hold out for that mysterious wealthy benefactor :)
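For what it's worth, the arithmetic roughly checks out. The GTX 660 Ti's August 2012 launch date is real; the exact article date and the ~$230 late-2015 street price for a GTX 960 4GB are assumptions:

```python
from datetime import date

launch  = date(2012, 8, 16)   # GTX 660 Ti launch
article = date(2015, 11, 20)  # approximate article date (assumed)

weeks = (article - launch).days // 7
saved = weeks * 1.50

gtx960_4gb_price = 230  # assumed late-2015 street price in USD
leftover = saved - gtx960_4gb_price
print(weeks, saved, leftover)  # -> 170 255.0 25.0
```

About 170 weeks of saving yields $255, leaving roughly $25 for that meal under the assumed price.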
 
Fun fact: if you had set aside $1.50 each week since buying the GTX 660 Ti, you could buy a GTX 960 4GB card today and still have enough money left over for a nice meal.
That's $1.50 that could be going into your retirement or mortgage, both of which pay dividends on top of the initial investment.

Yes, gaming is a hobby, but it doesn't need to be so expensive that you have to contribute a dollar a day to it. It can be cheaper, it was cheaper in the recent past, and it should be cheaper if not for AMD's unfortunate market weakness.
 
That's $1.50 that could be going into your retirement or mortgage, both of which pay dividends on top of the initial investment. Yes, gaming is a hobby, but it doesn't need to be so expensive that you have to contribute a dollar a day to it.

It was $1.50 per week ;) Honestly, compared to other forms of entertainment, PC gaming is rather cheap. You certainly don't need to spend thousands each year to enjoy it.
 
When testing CPU performance, why not test CPU load with an equivalent AMD card as well? In addition to the CPU comparison, it would also give us an overview of how AMD's and Nvidia's drivers perform in games.
 
When testing CPU performance, why not test CPU load with an equivalent AMD card as well? In addition to the CPU comparison, it would also give us an overview of how AMD's and Nvidia's drivers perform in games.

Ideally we would like to do that. Unfortunately, providing the CPU results that we do involves about a day and a half of testing. Combine that with the two days we often spend testing all the GPUs and it starts to get a little out of hand ;)
 
So I need to upgrade my dinosaur GTX 660 Ti to a GTX 960 to get smooth 1080p frame rates now, eh?

It can be cheaper, it was cheaper in the recent past, and it should be cheaper if not for AMD's unfortunate market weakness.
There was never a time when gaming was cheaper, lol. You can easily play this game at 1080p on high with your 660 Ti at 50-60fps. Shock: it's not Ultra, right?

So there you have it: a brand-new Star Wars game running easily on high or even ultra with a 4-5 year old Intel i5 and a 4-year-old HD 7950. You could get that GPU for as little as 180 euros about two years ago. In the past we bought new platforms (CPU + mobo + RAM) far more frequently, and graphics cards were underpowered for anything but low settings after 1-2 years.
 
Ideally we would like to do that. Unfortunately, providing the CPU results that we do involves about a day and a half of testing. Combine that with the two days we often spend testing all the GPUs and it starts to get a little out of hand ;)
Yeah, I understand that it's a lot of work. But you're definitely testing AMD cards with at least one CPU, right? That would give us at least some reference. It doesn't have to be the whole list, although that would indeed be ideal.
 
Yeah, I understand that it's a lot of work. But you're definitely testing AMD cards with at least one CPU, right? That would give us at least some reference. It doesn't have to be the whole list, although that would indeed be ideal.

I hope so.
 
Seems like Hyper-Threading doesn't do this game a lot of good; the difference from the Pentium to the Haswell i3 isn't very large. But the 10fps gap in minimums from the 6100 to the 860K shows that AMD's "realer" threads are more beneficial, a completely different scenario from Fallout 4.

Very well optimized game. Say what you will about EA, but Frostbite is a killer engine.
 