StarCraft II: Wings of Liberty GPU & CPU Performance

July 29, 2010, 5:00 AM
It's been more than a decade since the original StarCraft was released, but unlike the vast majority of titles from that era, the game never really faded away. Throughout the years it has remained the benchmark of what a well-conceived real-time strategy game should be.

For that reason StarCraft II needs no introduction. This sequel has been in the making for what seems like forever, although it was only officially announced in 2007. One way or another, the delays do not seem to have discouraged anyone from wanting to play the game; if anything, they have made us more desperate to get a taste of it.


In usual TechSpot fashion, now that the game is out in stores we are taking an in-depth look at how your gaming system will handle StarCraft II: Wings of Liberty. We have tested a huge range of graphics cards using three different resolutions and an equal number of visual quality presets. In addition, we have evaluated CPU scaling, testing AMD and Intel CPUs of current and previous generations.

Read the complete article.




User Comments: 96

Puiu said:

I just watched the whole StarCraft story on YouTube just to get back up to speed. It's hard to remember the story after so many years.

TitoBXNY said:

SC2 is just awesome, the replay value is so high. The story is so epic that I remember it like any great movie I've seen in the past. I keep redoing the same early missions because I feel the second time will be faster and better. A must-own for any PC gamer.

Guest said:

I think you meant to say Ultra High Quality and not Ultra High Performance.

Otherwise great stuff, it'll work on my relatively old rig :)

princeton said:

Guest said:

I think you meant to say Ultra High Quality and not Ultra High Performance.

Otherwise great stuff, it'll work on my relatively old rig

Ultra performance. That's how it performs on ultra.

On another note, I'm glad the game is so Nvidia-biased that my GTX 260 will even outperform the HD 5830. I do wonder why you guys benchmark on 16:10 screens when the standard nowadays is 16:9.

Staff
Steve said:

princeton said:

On another note, I'm glad the game is so Nvidia-biased that my GTX 260 will even outperform the HD 5830. I do wonder why you guys benchmark on 16:10 screens when the standard nowadays is 16:9.

I only have 30" Dell screens (2560x1600) and I prefer 16:10.

Guest said:

Likewise, I have a Dell U2410 which is also 16:10. I'm not surprised, by the way, that xfire/SLI support in SC2 is non-existent or even results in a decrease in performance. The same thing is happening in the World of Warcraft: Cataclysm beta. You have to edit a config file to get WoW to use a quad-core processor to its fullest as well. It's pretty disappointing that after 7+ years of development, SC2 is lagging behind on quad core and dual video card support.

The single player campaign is an absolute blast though.
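For reference, the quad-core tweak I mean is a one-line addition to the WTF\Config.wtf file (this assumes the Cataclysm beta uses the same processAffinityMask setting as the live client; the value is a bitmask, so 15 = binary 1111 ticks four cores):

SET processAffinityMask "15"

Without it the client doesn't spread its work across all four cores on its own.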

Guest said:

I don't have much faith in those CPU results. My i5-750 is at 3.6GHz and paired with a 480, and I haven't yet seen it go below 50fps, single player or multiplayer. It's usually sat at 80-130.

Skout said:

Since it's obviously quite processor-intensive, it would have been nice to see some stats on minimum-spec (2.2GHz) CPUs... but I guess you guys didn't want to torture yourselves that badly.

Staff
Steve said:

Guest said:

I don't have much faith in those CPU results. My i5-750 is at 3.6GHz and paired with a 480, and I haven't yet seen it go below 50fps, single player or multiplayer. It's usually sat at 80-130.

Have you tested an 8-player game with 4 AI and 4 human players? If not, then you need to try that before placing your faith anywhere.

Adhmuz, TechSpot Paladin, said:

Actually, on the test side of things, if you really want to push your PC to the edge with this game, try playing a fastest-possible money map online with 7 other people. In 4v4 with my system I get drops below 30 FPS at times. Other than that, if you're not playing all-out insane games like that or other crazy UMS maps, this game is never really going to stress your system.

TorturedChaos, TechSpot Chancellor, said:

I picked up a copy the day it came out, didn't even have to special order it or anything. Just wandered into the game store after work :P.

I have to say though it took forever to install - almost 2 hours - and then it installed 3 patches. After finally getting it running I jumped right into the single player campaign and made it through the first 3 missions on Ultra graphics settings (which are very pretty btw) before it crashed with an nv4_disp.dll error (not a BSOD; everything just went screwed-up Windows 95 graphics and an error popped up saying that nv4_disp.dll has crashed and you have to reboot). Since then I have tried playing it several different ways, and the only way I can make it run is capping the fps at 60 (which isn't a big deal) and running everything on low graphics settings - that part makes me sad.

So far I haven't seen a true fix for it, and apparently a lot of other people with 9800 GT cards (or any 9800 or 8800 Nvidia cards) are having similar issues.

Worst part is people were complaining about the exact same issue in the beta - but everyone just assumed it was the beta and that Blizzard would fix the issue before the official release, and Blizzard still hasn't fixed it...

So I'm a bit disgusted right now with Blizzard.

Guest said:

Would a Q6600 work better with an older GPU? And in the CPU test you should also use an ATI card to see if one GPU is better at taking load off the CPU.

TomSEA, TechSpot Chancellor, said:

Love the range of graphics cards you guys used. Wish I could see every game covered like that. The Q6600 FPS is bizarre though. That's still considered a fine gaming proc by anyone's standard, and to see it get slammed like that is, frankly, a jaw-dropper.

Guest said:

Any game that can only take advantage of 2 cores will suffer heavily on a Q6600 because it can effectively use only half the cache, and the FSB is relatively slow compared even to Core 2 Duos of the same generation. It is too bad TechSpot didn't put up some numbers for the E8xxx series CPUs, as I am sure they would have been better than the Q6600.

Guest said:

EXACTLY... too many times people post benchmarks that favor their taste lol... they'll post results from scenes where there's little to nothing going on in the game...

PanicX, TechSpot Ambassador, said:

@TorturedChaos

I also had this problem and would usually crash after about an hour of playing, or after alt-tabbing some. I upgraded my drivers to Forceware 258.96 two weeks ago and haven't crashed since.

I'm using an 8800 GTS at 1680x1050 with all settings high except textures at medium.

TorturedChaos, TechSpot Chancellor, said:

@PanicX

Thanks, but after the first crash I made sure the video drivers were up to date (latest is 258.96 I believe) and all of XP's updates were done. Still get the crashes, sadly.

Guest said:

Why does the i7 920 perform better than a Phenom II X4 at similar clock speeds? Doesn't the Phenom have 512KB/core of L2 cache while the i7 has 256KB/core? Is it because of the 8MB L3 on the i7 vs 6MB L3 on the Phenom II? Or is it because of triple-channel memory?

Guest said:

^ With CPUs it isn't all about the numbers but about the architecture the CPUs have.

Guest said:

Ultra vs High look identical in picture quality.

ebolamonkey3 said:

I've gotta say, I'm pretty disappointed in SC2 in terms of game design. I don't mean multiplayer and balance or any of that, but I think Blizzard has gone overboard in trying to make the game as accessible as possible.

It's still DX9, which is ancient; it does not scale to multiple monitors (to keep competition fair, fine, but make it an option in custom games or tournaments); it has no Crossfire/SLI support and doesn't take advantage of quad-core CPUs (I can't even think of a good reason for this when the game runs fine on dual cores and single GPUs); and there is no LAN (anti-piracy, sure, but seriously, having to connect to Bnet, which may not be possible all the time, just to play with my friends sitting next to me is very annoying to say the least).

Blizzard has tried to make SC2 as fair for multiplayer as possible and as forgiving on hardware as it can be, but I think they have taken it too far in their zeal. While I agree that the most important part of a strategy game is gameplay and not special effects, the game engine and graphics are already outdated before the game's release, and Blizzard's decision not to support current and future hardware (quad core, multi-GPU) is backward to say the least, and hardly the way to establish SC2 as a worthy successor to the original legend.

TitoBXNY said:

I have a Q6600 running at 3.4 on air and a Radeon 4870. I am running the game at max Ultra settings without issue.

Guest said:

In the past, I've ONLY played StarCraft on a LAN... it's the only way I have ever played SC...

Removing LAN support, especially in a country like Australia where internet simply isn't available everywhere at a reasonable latency and speed, is a total game killer. Blizzard, what have you done?!

Guest said:

16:10 all the way. 16:9 screens only seem to go up to 1920x1080, and the extra pixels in, say, a 24" at 1920x1200 make a huge difference in the quality of the image. Plus, once you're used to 16:10, viewing web pages or even just positioning windows on a 16:9 screen feels really cramped vertically.
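To put numbers on it: 1920x1200 is 2,304,000 pixels against 2,073,600 at 1920x1080, so roughly 11% more screen area, all of it extra vertical space.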

Guest said:

The ONLY con I could find for this game was LAN not being supported. Like, really, Blizzard? This is the most expensive game ever created, and you couldn't think of adding LAN? I bet they did that so people have to use Battle.net as the only way to play with other players. That's pretty low.

Hopefully they will fix this in an upcoming patch or something.

Staff
Per Hansson, TS Server Guru, said:

Nice article Steven!

My only gripe with the game's requirements is the recommended memory amount.

2GB of RAM as the recommended value for Windows 7 is bull (1.5GB is listed as the minimum, 1.0GB if using XP).

My brother has an ATI 5850 and 2GB of RAM, and the game runs much slower due to swapping; RAM usage was maxed out by the game... (he runs Win7)

The game's own memory footprint was circa 1600MB after playing just the small intro levels, leaving 400MB for the OS... (This was at 1920x1200, all on Ultra/Max, but this is also what the "recommended" numbers should be based on...)

Guest said:

The GPU performance section of this article is pointless considering that StarCraft 2 has a known bug where higher-performing cards (especially those in SLI setups) get poorer performance than normal. Blizzard is yet to release a patch for this.

Staff
Steve said:

The GPU performance section of this article is pointless considering that StarCraft 2 has a known bug where higher-performing cards (especially those in SLI setups) get poorer performance than normal. Blizzard is yet to release a patch for this.

And that makes it pointless, does it? Right... what the article already shows is that the high-end cards work fine; making them even faster would be the very definition of pointless. What would be nice is for the slower cards to become faster. That said, what the article really showed, if you read it, is that the CPU is very important in this game, especially if you are playing on the big 8-player maps.

Having said that, I have not heard about this performance bug, other than the fact that multi-GPU technology doesn't work. Also, has Blizzard said there is a problem and that they are going to patch it? Because if not, I don't like your chances given the game went through a very long beta phase.

Nice article Steven!

My only gripe with the game's requirements is the recommended memory amount.

2GB of RAM as the recommended value for Windows 7 is bull (1.5GB is listed as the minimum, 1.0GB if using XP).

My brother has an ATI 5850 and 2GB of RAM, and the game runs much slower due to swapping; RAM usage was maxed out by the game... (he runs Win7)

The game's own memory footprint was circa 1600MB after playing just the small intro levels, leaving 400MB for the OS... (This was at 1920x1200, all on Ultra/Max, but this is also what the "recommended" numbers should be based on...)

Thanks for the feedback. As for the memory issue, I would not recommend using Windows 7 with less than 4GB of RAM, let alone playing games such as StarCraft II with less. On another note, there is no performance gain to be had by going over 4GB of RAM with this game. I tried 6GB and 8GB and it made no difference. But you are right, with 2GB of RAM it runs like a pig.

Guest said:

It's not all about the size of the cache. It's about the actual execution units on the CPU and the amount of work they can do per cycle. More cache is only good if the processing units on the CPU can use it fast enough.

Guest said:

That was supposed to be a response to the guy wondering why AMD CPUs with more cache still perform worse.

Guest said:

I am wondering if anyone can help me out. I'm a huge fan of SC and wanted to pick up the new game, but I don't think it will run on my system. Anyone have any input? My specs are below:

Dell Studio 15 laptop

Intel Core 2 Duo T6600 2.2GHz, 800MHz FSB, 2MB L2 cache

4GB Shared Dual Channel DDR2 at 800MHz

Intel Graphics Media Accelerator 4500MHD

320GB SATA Hard Drive (5400RPM)

Windows 7 64bit

Shitty integrated graphics sounds like a big problem.

I'm wondering if I would be able to play on super low settings and still get somewhat playable fps.

Any input out there?

Thanks!

Burty117, TechSpot Chancellor, said:

Guest said:

I am wondering if anyone can help me out. I'm a huge fan of SC and wanted to pick up the new game, but I don't think it will run on my system. Anyone have any input? My specs are below:

Dell Studio 15 laptop

Intel Core 2 Duo T6600 2.2GHz, 800MHz FSB, 2MB L2 cache

4GB Shared Dual Channel DDR2 at 800MHz

Intel Graphics Media Accelerator 4500MHD

320GB SATA Hard Drive (5400RPM)

Windows 7 64bit

Shitty integrated graphics sounds like a big problem.

I'm wondering if I would be able to play on super low settings and still get somewhat playable fps.

Any input out there?

Thanks!

You may need to turn the res down a notch, but it will be playable.

At least I can get Battlefield 2 to run at 1280x800 on medium settings with the 4500MHD, so yeah, it should be OK on the lowest settings.

Guest said:

My Acer laptop with a 2.0GHz Core 2 and Intel 4500 video will play on low settings at about 20 fps - not that bad if you really need to play it.

Jiraiya said:

Blizzard software ate my GPU

Starcraft II causes overheating problems

Blizzard has admitted that its latest Starcraft II game has a bug which can cause a GPU to go into meltdown.

The game's menu screens aren't framerate-limited, which means that if the computer has nothing else to do, the graphics hardware gets bored and renders them as fast as it can. This sounds harmless but actually causes overheating problems for the card.

The good news is that it is a doodle to fix. If you go into the "Documents\StarCraft II Beta\variables.txt" (or "Documents\StarCraft II\variables.txt") file and add in the lines frameratecapglue=30 and frameratecap=60, everything goes much better.

It is strange that while Blizzard have admitted that there is a problem, they have not really notified people about it. Still, it is nice to have a fix, and if your GPU has not melted down to China you can probably give it a go.

Source : fudzilla
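For anyone wanting to try it, variables.txt is a plain key=value text file, so the fix is simply these two lines appended to the end (values per the article above: menus capped at 30 fps, gameplay at 60):

frameratecapglue=30
frameratecap=60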


Guest said:

I am running a Q9550 overclocked to 3.6GHz with a GTX 285 and am getting the same or better FPS than your i5s/i7s/i9s. At 1920x1080 I am almost always at 60fps. Maybe the Q9550 is a better socket 775 chip than the old Q6600; I know it has more cache.

PanicX, TechSpot Ambassador, said:

Blizzard software ate my GPU

Starcraft II causes overheating problems

Blizzard has admitted that its latest Starcraft II game has a bug which can cause a GPU to go into meltdown.

This looks to me like a blatant lie or gross exaggeration. If your PC has adequate cooling, nothing short of changing the wattage draw of the video card could cause it to melt down. I searched on Google about this and only found references to this blog [link] where the guy speculates that the cause of the PC crashes is CPU overheating.

Sure, the menu frames shouldn't tax your GPU's rendering capabilities. But any decent 3D game should, and if the card doesn't melt down during actual gameplay, why would excessive menu drawing be any different?

Guest said:

Jiraiya said:

Blizzard software ate my GPU

Starcraft II causes overheating problems

Blizzard has admitted that its latest Starcraft II game has a bug which can cause a GPU to go into meltdown.

UH, NO. My setup can run FurMark in a hot summer room without a meltdown. I don't think some lame DX9 game menu running at 1700 fps is going to get near FurMark temperatures. Obviously someone had a bad GPU or inadequate cooling.

Guest said:

Yes, of course the Q9550 is better than the Q6600... the clue is in the price: 1x for the Q6600, 2x for the Q9550.

Have to check out the beta/demo of this... on a C2D 6750 with a 4670 and 2GB... I'm thinking low to medium settings.

I didn't really get on with the original game, however.

Just a thought: maybe the lowish hardware requirements are so that consoles will taste some of that sweet, sweet StarCraft pie. Then a really simplified version to be played over Wi-Fi on an iPhone/Android.

Guest said:

Nice GPU and CPU review Steve, it leaves some question marks though.

Would have been nice to see a stock and OCed E8500 and a Q9450/Q9550 at stock and OCed to see how the cache plays out in terms of performance.

Guest said:

I don't get how RTS games are fun, or even a game. You sit there clicking minerals, gas, wait, build stuff, wait, get soldiers, wait, click units that attack and then stop for no reason so you have to re-box them and click again. Send all your soldiers to watch them attack the enemy, repeat. It is more like a JOB of management, a chore, not a fun game. I have never enjoyed an RTS, but I am a big fan of FPS and even action-type RPGs like Gothic and Zelda. StarCraft is not Game of the Year, maybe battle simulator of the year. Bring on Gears of War 3 and Rage, those are going to be fun games.

I played a friend's StarCraft II for a while, and other than the cutscenes it is the same 10-year-old micromanagement chore, not a GAME. I suppose to some people, if you took an RPG and got rid of the part you actually have fun with, like controlling your character and fighting the monsters, so that all the game had left was the tedious part of picking armor, spells and weapons and assigning hit points, that would be FUN? I doubt it. Have fun guys in your click-a-million-things, scroll around a map, click a million things, then watch 10-year-old top-down icons beat each other up. At least Diablo was FUN because you controlled one character, yourself, and did not have to waste 99.99% of your playing time issuing clicks and orders! I guess it takes a special kind of masochist to like RTS, because you certainly don't play them for fun.

EXCellR8, The Conservative, said:

I have a Q6600 running at 3.4 on air and a Radeon 4870. I am running the game at max Ultra settings without issue.

Same here... got a Q6600 @ 3.2GHz coupled with a 512MB 4870 and I get 40+ fps. Something tells me it's not the processor itself but perhaps the motherboard it was used with. I've been running the game 1080p maxed (v-sync enabled) without a single hiccup and no video issues at all.

Guest said:

Yes but are you playing a 4v4 map with 4 humans and 4 AI?

Guest said:

I am just playing the campaign with a Q9550 at 3.6GHz, memory at 1066MHz, and a GTX 285 at 1920x1080 with 4x antialiasing, pegged at 60 fps all the time (50-80) without vsync. Obviously my bus and memory are supplying a lot more bandwidth than a normal socket 775 quad, but I get the same framerates in this game, as well as Crysis and every other game, as an i5/i7/i9. I just don't see more than a 2 fps benefit from an i-series processor over mine, or even much of a difference between a GTX 460-480 and a GTX 285 in THIS game.

Guest said:

^

Well, you don't see a difference in fps from a 5850 up to a GTX 480 in this game @ 2560x1600 because the game is still CPU limited for some reason.

It also only supports 2 cores atm, so the caches, clocks and inherent architecture of the CPUs will have almost all the effect on performance once you roll with a card equal to or faster than a 5850 @ 280 dollars.

The most interesting point is how much of an effect the quite large L2 cache on the Intel Q9xxx series has on performance.

Having people say they run the game maxed at 60 fps even when the review says otherwise is really pointless, since the circumstances are not the same.

I'm running this game on:

Q9450 @ 3.8GHz

4870x2 (AI disabled, 0x AA, 0x AF from CCC) - OCed to 800/1000 atm using the 10.8 betas

950MHz mem @ 4-4-4-12

1920x1200 @ DX9

Everything set to ultra

And I get 30-55 fps, mostly in the 45s, during the campaign.

Not impressive at all if you ask me; I run World in Conflict maxed with 4x AA, 16x AF, DX10 at 60 fps all the time, and it only dips to the low 50s during nukes.

It would be awesome if Steve could include the Intel Q9xxx series in his CPU benchmark to see how far behind the i7s they are and whether an upgrade would be worth it.

Guest said:

I am just playing the campaign with a Q9550 at 3.6GHz, memory at 1066MHz, and a GTX 285 at 1920x1080 with 4x antialiasing, pegged at 60 fps all the time (50-80) without vsync. Obviously my bus and memory are supplying a lot more bandwidth than a normal socket 775 quad, but I get the same framerates in this game, as well as Crysis and every other game, as an i5/i7/i9. I just don't see more than a 2 fps benefit from an i-series processor over mine, or even much of a difference between a GTX 460-480 and a GTX 285 in THIS game.

UPDATE: I just played the gem mission where you defend till you die, with about 200 units on the screen at once, and my mighty machine above did hit 18 FPS. Still playable, but with a TON of units the FPS does eventually drop from the more constant 45-70 FPS, though very rarely.

Staff
Steve said:

UPDATE: I just played the gem mission where you defend till you die, with about 200 units on the screen at once, and my mighty machine above did hit 18 FPS. Still playable, but with a TON of units the FPS does eventually drop from the more constant 45-70 FPS, though very rarely.

Start playing with friends online in 3v3 and 4v4 matches and you will find that it's not very rare at all. We were not interested in campaign performance as it is not a very demanding aspect of the game, and most will buy StarCraft II predominantly to play multiplayer online.

Basically everyone who has come into this thread claiming excellent performance using ultra settings on old hardware failed to read how we tested the game. If we wanted to show a GeForce 9600 GT doing well, we would have tested a 1v1 match or the single player campaign. Unfortunately those results would do the reader no good once they decide to really get into the game and play some fun 3v3 or 4v4 matches. Even 2v2 matches are considerably more demanding than what most of you guys are testing with.

Guest said:

16:10's the closest to the golden ratio.. Your 16:9 blows.
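(The math checks out: 16/10 = 1.60, close to the golden ratio of about 1.618, while 16/9 is roughly 1.78.)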

Staff
Steve said:

Sorry, you blow. I got rid of my 1920x1200 monitor because I also used it to watch Blu-ray movies and play Xbox 360 and PlayStation 3, and I got SICK OF THE STUPID BLACK BORDERS ADDED TO FIT THE STANDARD 1920x1080! Who wants black bands on the top and bottom of the screen all the time? *****, they made a standard for a reason. I don't miss those 120 blacked-out lines at ALL!

Although this is completely off topic and you have conducted yourself much as a 12-year-old would, I think the first guest has a point. In my opinion 16:10 is much better for use as a computer screen; I much prefer it for viewing web sites, reading e-mails and writing documents. However, for watching Blu-ray movies 16:9 is arguably better, and that is why I watch movies on a TV and use a computer screen for my computer.

Having said that, I am aware many people prefer 16:10 for watching movies as the subtitles are placed in the black bar at the bottom and not over the picture. Really, the black bars do not bother me as you are not missing out on any of the picture. Hell, on my 1080p TV most movies still have the black bars anyway because they are shot in the 2.39:1 cinema format.

On a more related topic, if you play StarCraft II on a 16:9 screen you are missing out on some of the image and could be placed at a slight disadvantage compared to those using 16:10 screens.

Guest said:

Fair enough, but you could get A LOT more information on a 30-inch 2560 resolution screen or whatever. You would see a lot more of the map/units, but they would be tiny. On a 24-inch monitor it is tough to tell a marine from an SCV. Playing it on my 46" Samsung 1080p LCD HDTV is awesome; you can easily tell the difference between every unit. I think scrolling around the map is so annoying; it would be great if we had monitors with the resolution and size to show the whole map without scrolling, but that is a pipe dream. For now 1080p fits the gaming bill best.

Staff
Steve said:

Fair enough, but you could get A LOT more information on a 30-inch 2560 resolution screen or whatever. You would see a lot more of the map/units, but they would be tiny. On a 24-inch monitor it is tough to tell a marine from an SCV. Playing it on my 46" Samsung 1080p LCD HDTV is awesome; you can easily tell the difference between every unit. I think scrolling around the map is so annoying; it would be great if we had monitors with the resolution and size to show the whole map without scrolling, but that is a pipe dream. For now 1080p fits the gaming bill best.

Actually, no, that is incorrect. The field of view is exactly the same at all resolutions. If you play at 1680, 1920 or 2560 you will see exactly the same amount of terrain and units. Blizzard has done this to avoid giving people with cash for higher resolution screens an advantage over poorer gamers.

So the units only become bigger on larger screens; the ability to see more does not come into it. That said, I prefer to play on my Dell 30" as opposed to my 50" Samsung, as it looks much sharper and in all honesty is just easier to play on.
