Overwatch Benchmarked, Performance Review

Steve

Posts: 3,044   +3,153
Staff member

Though there are still some hotly anticipated titles due in 2016, we might already have the game of the year on our hands. Overwatch has been on our radar for a while now, but it recently blew up in a big way with an open beta that attracted 9.7 million players earlier this month, double Destiny's beta turnout and a third more than The Division's.

In typical Blizzard fashion, Overwatch is impressive looking yet runs well on a wide range of hardware. It scales down to work on low-end hardware but can also be cranked up to take advantage of high-end gear, especially at the 4K resolution.

The game engine was developed by Blizzard specifically for Overwatch and this goes a long way in explaining why the game flows so well. Between the $40 base price and the fact that class-based shooters have been relatively stagnant since the arrival of Team Fortress 2 nearly nine years ago, Overwatch is poised to be hugely popular among PC gamers. Of course, one question remains: can your hardware handle it?

Read the complete review.

 
Here is my result.
I have a G3258 at 4.0GHz and a 750 Ti at 1350/6200MHz.
Settings: 1080p, 100% render scale, details tweaked between Ultra and Epic.
FPS between 60 and 90.
 
What happened with the R7 370? If it's basically a 7850 with a different name, then why is it so slow everywhere...
 
I'd probably do myself a favor if I toned down a couple of settings with my 980 at 1440p, as I've been running it on Epic settings since day one. While it usually sits above 60 FPS at 100% render scale, I've seen it drop below 60 when the battle gets really intense.

It's nice to see the game run well even on entry-level hardware, something Blizzard has always been good about. I shouldn't be hearing any excuses from anybody I play with that their computer can't handle it.
 
Quad-core still rules the market, faster than 6-, 8- and probably 10-core CPUs. Only 5% of applications can take advantage of the extra cores. It's safe to say that most real-world usage will favor a quad-core every time, with the exception of a few things mostly reserved for professionals in certain businesses.
 
Excellent review as usual, though I am curious, @Steve, whether the 860K exhibits similar behavior, or was it just the G3258? They are both in the same price bracket. I was considering one for a friend on an extreme budget, but I don't have one lying around currently to play with.
 
Every time I see CPU benchmarks I get all happy that my 3770K is still a beast. I probably won't upgrade for at least two to three years.
 
Every time I see CPU benchmarks I get all happy that my 3770K is still a beast. I probably won't upgrade for at least two to three years.
With the focus being on mobile processing, there won't be much change in performance for us with each architectural generation. I still like my 3770K too, but I wouldn't mind DDR4 and extra native SATA3 ports :D
 
Nice review! The only thing I found strange was the use of a 1GB R7 360; I've never seen one of those myself (I thought they were all 2GB). As the R7 260X and 360 are very similar, one would expect a 2GB R7 360 to perform just as well as the 260X?
 
Quad-core still rules the market, faster than 6-, 8- and probably 10-core CPUs. Only 5% of applications can take advantage of the extra cores. It's safe to say that most real-world usage will favor a quad-core every time, with the exception of a few things mostly reserved for professionals in certain businesses.

It's faster because none of the 6-, 8- or 10-core CPUs use the Skylake architecture.
 
One to two FPS on my GTX 660 at 4K. LOL

I can't believe how much I am laughing at seeing FPS figures of 1 and 2 listed for my card.
 
Thank you very much for taking the time and effort to do the CPU benchmarks. TechSpot is one of the few review sites I know of that posts these, and you even show the effects of overclocking both Intel and AMD processors. I look forward to the CPU page of every game review you post!
 
I am thoroughly confused as to how you are getting so much more FPS on i7s than i5s. For example, I have a 2500K; in the 1080p testing a 3770K gets FIFTY more FPS than my CPU. I realize it's a generation ahead of mine, as you didn't test the 2600K, but there is no way that accounts for a gap that large. Sorry, I'm not taking this benchmark seriously; I've never EVER seen a gap that high in a game between a quad-core and an i7.
 
I am thoroughly confused as to how you are getting so much more FPS on i7s than i5s. For example, I have a 2500K; in the 1080p testing a 3770K gets FIFTY more FPS than my CPU. I realize it's a generation ahead of mine, as you didn't test the 2600K, but there is no way that accounts for a gap that large. Sorry, I'm not taking this benchmark seriously; I've never EVER seen a gap that high in a game between a quad-core and an i7.

So you have never seen a game where a Core i5 processor is notably slower than a Core i7? You might want to look around a bit; there are plenty of examples at TechSpot, but since you won't be able to take those seriously, here are some from an external source...

http://gamegpu.com/images/images/stories/Test_GPU/MMO/Tom_Clancys_The_Division_/Division_proz.jpg
http://gamegpu.com/images/stories/Test_GPU/Action/Rise_of_the_Tomb_Raider_dx12/test/tr_proz_12.jpg
http://gamegpu.com/images/stories/Test_GPU/strategy/Total_War_Warhammer/test/tww_pr.jpg
http://gamegpu.com/images/stories/Test_GPU/Action/DOOM/test/doom_proz.jpg
http://gamegpu.com/images/stories/Test_GPU/MMO/Paragon/test/paragon_proz.jpg
 

Ya, I don't buy that for a second either. Also, who is GameGPU? I've never heard of them. The only game I've ever seen benefit even a little bit from Hyper-Threading is BF4, but these numbers from this Overwatch chart are beyond absurd. 50 FPS between a 2500K and a 3770K? There is no possible way that is correct; I'd bet any amount of money they messed something up.
 
Ya, I don't buy that for a second either. Also, who is GameGPU? I've never heard of them. The only game I've ever seen benefit even a little bit from Hyper-Threading is BF4, but these numbers from this Overwatch chart are beyond absurd. 50 FPS between a 2500K and a 3770K? There is no possible way that is correct; I'd bet any amount of money they messed something up.

It is a 35% increase; if you don't believe that is possible then I won't try to change your mind. Good luck keeping your head in the sand.

If you do want to accept the truth, give this a read...
http://www.eurogamer.net/articles/d...it-finally-time-to-upgrade-your-core-i5-2500k
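For anyone following the arithmetic being argued here: the two figures quoted in the thread (a 50 FPS gap described as a 35% increase) are mutually consistent, which a quick back-of-the-envelope check confirms. The ~143 FPS baseline below is inferred from those two numbers, not taken from the review itself.

```python
# Sanity-check the thread's numbers: a 50 fps gap that is a 35% increase
# pins down the slower chip's baseline, since gap / baseline = 0.35.
gap_fps = 50        # 3770K over 2500K, as quoted in the thread
increase = 0.35     # "It is a 35% increase"

baseline = gap_fps / increase    # inferred 2500K result
faster = baseline + gap_fps      # inferred 3770K result

print(round(baseline))  # ~143 fps
print(round(faster))    # ~193 fps
```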
 
It is a 35% increase; if you don't believe that is possible then I won't try to change your mind. Good luck keeping your head in the sand.

35% is a world apart. You know you messed something up in this benchmark too, don't deny it. Take WoW, for example, one of the most CPU-bound games: you will get literally the exact same FPS with an i5 vs. an i7 given the same architecture and clock speed. I fully understand some rare games do benefit a little from Hyper-Threading, but the results you got here are way beyond what anyone could expect from it. The problem with your CPU results is that someone looking to build a rig will see this benchmark and decide to spend 150 bucks more on an i7 when they don't have to at all. Your results are skewed and I suggest you revisit them.
 
35% is a world apart. You know you messed something up in this benchmark too, don't deny it. Take WoW, for example, one of the most CPU-bound games: you will get literally the exact same FPS with an i5 vs. an i7 given the same architecture and clock speed. I fully understand some rare games do benefit a little from Hyper-Threading, but the results you got here are way beyond what anyone could expect from it. The problem with your CPU results is that someone looking to build a rig will see this benchmark and decide to spend 150 bucks more on an i7 when they don't have to at all. Your results are skewed and I suggest you revisit them.

Next you are going to tell us that the Core i3s aren't any faster than the Pentium processors; I mean, Hyper-Threading doesn't do anything :S These results are 100% accurate and I am not denying anything.
 
Next you are going to tell us that the Core i3s aren't any faster than the Pentium processors; I mean, Hyper-Threading doesn't do anything :S These results are 100% accurate and I am not denying anything.

Well, to be fair, bub, if the game isn't coded for more than two threads then no, an i3 wouldn't be faster than a Pentium at the same clock speed. I may go out and buy a 2600K on Craigslist just to prove you wrong; I bet I wouldn't see more than a 5% difference over my chip. I actually can't believe there have been no other comments like mine. This is a completely outside-the-lines result for i5/i7 performance; I've never seen a delta this large.
 
Well, to be fair, bub, if the game isn't coded for more than two threads then no, an i3 wouldn't be faster than a Pentium at the same clock speed. I may go out and buy a 2600K on Craigslist just to prove you wrong; I bet I wouldn't see more than a 5% difference over my chip. I actually can't believe there have been no other comments like mine. This is a completely outside-the-lines result for i5/i7 performance; I've never seen a delta this large.

Right, but Overwatch doesn't just use two threads; it will use eight threads if allowed. On a Core i7-6700K clocked at 4.5GHz we saw around 60% utilization in our CPU test (heavy usage across all threads). This explains why the 3770K is able to deliver a decent performance boost over the 2500K.

Good luck with your tests; make sure you have two teams full of AI bots. I won't hold my breath for that apology ;)

https://www.reddit.com/r/Overwatch/comments/459jrn/tech_supp_high_cpu_usage/

https://www.reddit.com/r/Overwatch/comments/49z8ia/cpu_usage_95100_while_playing_overwatch/

https://www.reddit.com/r/Overwatch/comments/3u288n/ow_settings_reducing_cpu_usage/

http://eu.battle.net/forums/en/overwatch/topic/17611661193

"Overwatch is a heavily threaded game with fairly complex rendering features. It will use as much of your computer’s CPU and GPU resources as you allow."

http://glemda.com/overwatch/cpugpu-utilization-and-overheating-guidelines/
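The 60% utilization figure above can be turned into rough thread-budget arithmetic: spread evenly, 60% across 8 logical threads is about 4.8 threads' worth of work, more than the 4 hardware threads a Core i5 offers, which is one plausible reading of why Hyper-Threading helps here. This is a back-of-the-envelope sketch assuming even load across threads, which real games never quite achieve:

```python
# Thread-budget arithmetic for the utilization figure quoted above.
# Assumption (simplification): load is spread roughly evenly across threads.
logical_threads = 8      # i7-6700K: 4 cores / 8 threads with Hyper-Threading
avg_utilization = 0.60   # ~60% observed across all threads in the CPU test

work = logical_threads * avg_utilization
print(work)  # 4.8 "threads" of work: more than an i5's 4 hardware threads
```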
 
Maybe I'm wrong, but before you could just buy a strong GPU, skimp on the CPU, and still be able to play the newest games. Now it seems every game needs a good CPU as well.
 
Hello, I would like to add my results since I have some diverse budget hardware. I am really shocked at your AMD 860K results; I have one, as well as an A10-5800K and an FX-8320E.

Playing Overwatch on High at 1080p with 100% render scale, the 860K averages 110 FPS (R9 380 2GB). Windows shows 95% CPU utilization, sometimes pegged at 100%, and the fan will start to run hard, so watch out for thermal throttling here. My FX-8320E (the water-cooled 95W version at 3.4GHz) sits at about 60% utilization and 125 FPS on Epic at 1080p with 100% render scale (RX 480 4GB), with a lot fewer dips than the 860K. A close friend has an i5-3570K that stuttered badly, sometimes down to 30 FPS. I was not allowed to troubleshoot it, but I'm guessing a rogue process was eating CPU time; after reinstalling Windows he is back to 110-130 FPS on High.
 
These benchmarks don't make sense. How can the GPU benchmarks at 1080p show the GTX 1080 reaching 250/269 FPS, yet in the CPU benchmarks, with a 6700K @ 4GHz as the test setup, the same GTX 1080 only gets 195/239 at 1080p?
 