StarCraft II: Wings of Liberty GPU & CPU Performance

Fair enough, but you could get A LOT more information on a 30-inch 2560 resolution screen or whatever. You would see a lot more of the map/units, but they would be tiny. On a 24-inch monitor it is tough to tell a Marine from an SCV. Playing it on my 46" Samsung 1080p LCD HDTV is awesome; you can easily tell the difference between every icon. I think scrolling around the map is so annoying; it would be great if we had monitors with the resolution and size to show the whole map without scrolling, but that is a pipe dream. For now, 1080p fits the gaming bill best.

Actually, no, that is incorrect. The field of view is exactly the same at all resolutions. If you play at 1680, 1920 or 2560 you will see exactly the same amount of terrain and units. Blizzard has done this to avoid giving people with more cash for higher-resolution screens an advantage over poorer gamers :)

So the units will only become bigger on larger screens; the ability to see more does not come into it. That said, I prefer to play on my Dell 30" as opposed to my 50" Samsung, as it looks much sharper and in all honesty is just easier to play on.
 
The CPU benchmark doesn't mention anything about how many players or AI players there are.

I'm sticking mainly to the single-player campaign (as I don't like RTS online play), but my Q6600 @ 2.4GHz and GTX 260 handle this game fine at 1920x1080 Ultra settings - a steady 50fps+.

So painting a picture of a crippling 25fps isn't accurate.
 
TFA said:
For measuring frame rates we relied on Fraps, recording five minutes of gameplay using a replay of an 8-player online battle. This replay featured a combination of human (4) and AI (4) players placing the maximum amount of stress on the system.
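For anyone wanting to sanity-check numbers like these at home, here is a minimal sketch of how average and worst-case FPS can be derived from a Fraps-style frametimes log (one cumulative millisecond timestamp per frame). The input format and the synthetic data below are assumptions for illustration, not something taken from the article's methodology.

```python
# Sketch: derive average FPS and "1% low" FPS from a list of
# cumulative per-frame timestamps in milliseconds (Fraps-style).
# The timestamps used below are synthetic, for illustration only.

def fps_stats(timestamps_ms):
    """Return (average_fps, one_percent_low_fps) for a capture."""
    # Per-frame render times in ms.
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg = len(deltas) / total_s
    # 1% low: average FPS over the slowest 1% of frames,
    # which exposes stutter that the plain average hides.
    worst = sorted(deltas, reverse=True)
    n = max(1, len(worst) // 100)
    low = 1000.0 * n / sum(worst[:n])
    return avg, low

# Synthetic example: 300 frames at a steady 20 ms each (50 FPS).
ts = [i * 20.0 for i in range(301)]
avg, low = fps_stats(ts)
print(round(avg), round(low))  # 50 50
```

In a real capture the 1% low would sit well below the average during big battles, which is exactly the kind of slowdown the 8-player replay is designed to provoke.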
 
Hey Steve, is there any chance of you giving this test a go on an E8400 massively OCed vs stock, and also a Q9xxx 12MB-cache chip massively OCed vs stock? (Obviously OC both the dual and quad to the same speeds so we can get a linear comparison where cache is the main difference; obviously the quad's two 'idle' cores will pick up the OS and whatnot, but w/e :D)

Would be quite interesting to see how much the actual cache plays in here vs the i7's HT and other architectural differences!

Would it be too much work?
 
Do you feel better now? Did you have an abusive relationship with an RTS in the past? Take your useless trolling comments somewhere else...
 
Thanks Steve for the completely depressing dose of reality (i.e., abysmal fps on ultra settings due to CPU limitation). I just finished putting together a budget gaming system, kind of an oxymoron I know, for roughly $500 (Win 7 32-bit, 2GB DDR3 1333, GTX 460 768MB, AMD Athlon II X3 440 Rana 3.0GHz). I'm getting the card in today; I was previously playing on low settings with an HD 4250 IGP.

I probably will never play the single-player campaign (I never did on WC3, and I've been playing that game online for 7 yrs); I'm 100% online playing 3v3+ for the most part, so I will likely run into similar CPU-taxing scenarios. I was quite stoked by the middle of the article, especially when you said the GTX 460 was the perfect card for the game... and then came the CPU test (I'm not even on the list... sob).

This review would have been extremely helpful a few weeks ago, considering I built my system specifically to play one game (this game). I'm sure others will find it very practical, given the same requirements (3v3+ on Battle.net).

I have a question for you, understanding I've been out of gaming for a while: is there any chance, in your opinion, that this situation is going to improve?

As a side note, I've read on other forums that turning off/lowering CPU-dependent graphics features like "reflections" (the game even states in-game that it is CPU dependent) can make a notable difference in FPS. If someone could test this out quantitatively, that would be great.

Great review/article Steve
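For whoever takes up that reflections test, a quick sketch of how to make it quantitative: capture one frametime run with the setting on and one with it off, then compare average FPS. The numbers below are illustrative placeholders, not measured results.

```python
# Sketch: compare two capture runs (e.g. reflections on vs. off)
# by average FPS. Frame times are individual per-frame times in ms;
# the values below are made up to illustrate the calculation.

def average_fps(frame_times_ms):
    """Average FPS from a list of individual frame times in ms."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percent_gain(fps_before, fps_after):
    """Relative FPS change, as a percentage of the 'before' run."""
    return 100.0 * (fps_after - fps_before) / fps_before

on = [40.0] * 100    # illustrative: 25 FPS with reflections on
off = [32.0] * 100   # illustrative: 31.25 FPS with reflections off
gain = percent_gain(average_fps(on), average_fps(off))
print(round(gain))  # 25
```

Running the same replay for both captures is the important part; otherwise the workload differs and the comparison means little.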
 
I'm the same. Running a Q8200 @ 2333MHz with a GeForce GT 240 on XP. Played on medium settings for about an hour, then non-stop crashing... lowest settings on the card and in-game to no effect. Latest drivers, shut down all background programs, and still it wouldn't stop.

Can't figure out if it's the combo of the Q8200 and the GT 240 not being enough, or something else...
 
Well, problem solved. Recommended settings with the GT 240 (all latest drivers) were medium, but I had to try Low and all card options on minimum, as well as a limited start-up, which still resulted in constant crashes. Installed an ATI 5850 GDDR5 1GB with Catalyst 10.7 drivers last night and voilà: everything on recommended ultra settings and no crashing, even with background programs running... all I wanted was to be able to play the game, and now I can on Ultra settings... no doubt CPU performance will come into greater play with more complicated multiplayer, but right now I'm playing the sh%t out of the single player with no problems... I think you can perhaps get away with other minimum requirements, but the video card must still be seen as the great impediment vis-à-vis crashing.
 
Great article Steve, thanks!

A question:

I'm about to buy an iMac 27" with a Core i5. I have not decided yet, but it will be either the quad-core 2.8GHz with 8MB L3 cache or the dual-core 3.6GHz with 4MB L3 cache. The graphics card will be a Radeon 5750.

How do you reckon StarCraft II will run on this computer? Which CPU would you choose out of the two above?

I would greatly appreciate a quick answer.
Thanks again.
 
And Guest - be sure to Boot Camp your iMac with Win 7 and play SC2 on that side - you'll get much better performance.
 
It has been said the game is designed to be played at 16:10. You are actually at a slight disadvantage playing 16:9, as part of the original screen is cut off. You don't, however, get any extra real estate going over 16:10 (i.e., a dual or triple screen setup).
 
Has anyone played this game with a Pentium D dual-core 3.2GHz CPU or worse on medium settings or better? My CPU is well above the minimum specs and I can only play the game on low settings...
 
Is this a biased review? I'm playing on a Q6600 coupled with an 8800 GTS and 3GB RAM, and I'm not having a single fps problem at ultra settings at 1680x1050. I even play 4vs4 or income wars without any noticeable fps drop.
And I don't think my PC is any better than those taken into account in this benchmark...

IMHO many hardware components are either overrated or placed on the wrong motherboards...
 
Yes, it's a biased review, of course... there is simply no way that old hardware will handle 4v4 games using the ultra settings; don't even try to feed us that. Of course, what you call a "single fps problem" likely differs from what we believe is playable...
 
If you think that a Q6600 is too old to play along with an 8800 then there's something wrong. I said I'm having very smooth gameplay with StarCraft 2 at 1680x1050 at ULTRA settings. Honestly, I didn't benchmark, but I'm not having problems with tons of units on screen in 4vs4, nor with Crysis Warhead at full details or any other game.
Also, you will never notice any difference over a certain fps.
 

I've been running my Q6600 @ 3GHz with an HD 5850 @ 850/1200. Lots of units at 1920x1200 ultra settings isn't a problem, until I go Protoss and put out 8-10 Carriers spitting out Interceptors with a Mothership trying to apply the cloaking visual effect to them. At that point I get a message on screen advising me to reduce settings to improve performance o_0, and mind you that's with the machine running 3 hard-level AI opponents too...

I also found a 35 Void Ray army caused a bit of fps loss when they were all building up beam damage.

Definitely situations can arise that will show up the bottleneck of my old CPU :)

PS: Forcing AA and max quality details in Catalyst Control Center.
 
Yes, with the Q6600 and much faster graphics cards our 8-player benchmark kills this processor, and we often see slowdown. Of course, this only happens once each player gets about halfway to the unit cap, but it happens every time, which is why we recommended a more modern processor... made sense to me.
 
Hehe - no problems with Crysis Warhead at full details on a Q6600 with 8800GT. ;)
 
Honestly, I was going to let him go on that claim that his "8800 GTS with Q6600" plays all games using the highest settings. haha

Why have we been wasting all of our time researching new hardware, when Nvidia created the only graphics card today's games need almost 4 years ago?

On a side note, we tested Crysis Warhead using a Q6600 processor overclocked to 3.0GHz, and if you find an average of 21fps acceptable then yes, the 8800 GT can play this game using the highest in-game quality settings with AA turned off. Also worth noting is that the 8800 GTS averaged just 23fps - hardly perfectly playable, but everyone has their own standards ;)

https://www.techspot.com/article/118-crysis-warhead-performance/page3.html

Anyway, let's get back on topic; pretty sure this thread is about StarCraft II, and we do have one for Crysis Warhead if that is what you wish to discuss.
 
I've been playing SC2 on a Centrino Duo system with 128MB of integrated graphics memory. lulz

Next week, however, I'll have the machine described in my profile. :D
 
You really ought to do a follow-up to the benchmark. I don't find it very useful.

Why haven't you tried different CPU architectures rated at the same clock speed, to get a real sense of what is doing what?

For example, why is the only Athlon II processor rated at the lowest clock speed of all the CPUs?

Of course it's going to show up last.

It would be far more interesting to know how an Athlon II X2 260 @ 3.2GHz performed vs. a Phenom II X2 555 BE @ 3.2GHz.

That would really have clued us in on how much L3 cache matters.

I don't understand how you can get any useful information from your benchmark with the CPUs all rated at different clock speeds and not even rated against price.
 
Hey Steve,

Although I disagree with the posters who feel your results are wrong based on their own campaign or 1v1 experiences, I am curious as to how much better the fps will be when playing a 1v1. Is there any chance you could provide us with those figures?
I am building a new computer and looking to use Fraps on my 1v1 replays, and would like to have a "base" fps figure so I can estimate what my final fps will be once recording with Fraps.

Thanks!
 