StarCraft II: Wings of Liberty GPU & CPU Performance

The ONLY con I could find for this game was the lack of LAN support. Like, really, Blizzard? This is the most expensive game ever created, and you could not think of adding LAN? I bet they did that so people would have to use Battle.net as the only way to play with other players. That's pretty low.

Hopefully they will fix this in an upcoming patch or something.
 
Nice article Steven!

My only gripe with the game's requirements is the recommended memory amounts.

2GB of RAM as the recommended value for Windows 7 is bull (with 1.5GB the minimum, and 1.0GB if using XP). My brother has an ATI 5850 and 2GB of RAM (running Win7), and the game runs much slower due to swapping; RAM usage was maxed out by the game. The game's own memory footprint was ca. 1600MB after playing just the small intro levels, leaving 400MB for the OS. (This was at 1920x1200 with everything on Ultra/Max, but that is also what the "recommended" numbers should be based on...)
 
The GPU performance section of this article is pointless considering that StarCraft II has a known bug where higher-performing cards (especially SLI setups) are getting poorer performance than normal. Blizzard has yet to release a patch for this.
 
And that makes it pointless, does it? Right... so what it already shows is that the high-end cards work fine; making them faster would be the very definition of pointless. What would be nice is for the slower cards to become faster. That said, what the article really showed, if you read it, is that the CPU is very important in this game, especially if you are playing on the big 8-player maps.

Having said that, I have not heard about this performance bug, other than the fact that multi-GPU technology doesn't work. Also, has Blizzard said there is a problem and that they are going to patch it? Because if not, I don't like your chances, given the game went through a very long beta phase.

Guest said:
Nice article Steven!

My only gripe with the game's requirements is the recommended memory amounts.

2GB of RAM as the recommended value for Windows 7 is bull (with 1.5GB the minimum, and 1.0GB if using XP). My brother has an ATI 5850 and 2GB of RAM (running Win7), and the game runs much slower due to swapping; RAM usage was maxed out by the game. The game's own memory footprint was ca. 1600MB after playing just the small intro levels, leaving 400MB for the OS. (This was at 1920x1200 with everything on Ultra/Max, but that is also what the "recommended" numbers should be based on...)

Thanks for the feedback. As for the memory issue, I would not recommend using Windows 7 with less than 4GB of RAM, let alone playing games such as StarCraft II with less. On another note, there is no performance gain to be had by going over 4GB of RAM with this game; I tried 6GB and 8GB and it made no difference. But you are right, with 2GB of RAM it runs like a pig.
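If you want to check the footprint on your own machine, here is a minimal Python sketch using the third-party psutil package (the "SC2" process-name match is an assumption on my part; confirm the real name in Task Manager):

import psutil

# Overall system memory: how close the game pushes you to swapping.
vm = psutil.virtual_memory()
print(f"RAM in use: {vm.used / 2**20:.0f} MB of {vm.total / 2**20:.0f} MB")

# Resident memory of the game's own process.
for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if "SC2" in name:
        rss = proc.info["memory_info"].rss
        print(f"{name}: {rss / 2**20:.0f} MB resident")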
 
It's not all about the size of the cache. It's about the actual execution units on the CPU and the amount of work they can do per cycle. More cache is only good if the execution units on the CPU can use it fast enough.
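For what it's worth, you can see the cache half of that trade-off even from Python: the same million random reads get slower once the working set no longer fits in cache. This is only a rough sketch; interpreter overhead blunts the effect, so treat the numbers as directional:

import array, random, time

def time_random_reads(n_elements, n_reads=1_000_000):
    data = array.array("q", range(n_elements))  # 8 bytes per element
    idx = [random.randrange(n_elements) for _ in range(n_reads)]
    start = time.perf_counter()
    total = 0
    for i in idx:
        total += data[i]
    return time.perf_counter() - start

# Working sets from well inside L2 cache to far outside any cache.
for n in (2**12, 2**17, 2**21, 2**24):
    print(f"{n * 8 / 2**20:8.2f} MB working set: {time_random_reads(n):.3f} s")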
 
That was supposed to be a response to the guy wondering why AMD CPUs with more cache still perform worse.
 
I am wondering if anyone can help me out. I'm a huge fan of SC and wanted to pick up the new game, but I don't think it will run on my system. Anyone have any input? My specs are below:

Dell Studio 15 laptop
Intel Core 2 Duo T6600: 2.2GHz, 800MHz FSB, 2MB L2 cache
4GB shared dual-channel DDR2 at 800MHz
Intel Graphics Media Accelerator 4500MHD
320GB SATA hard drive (5400RPM)
Windows 7 64-bit

The shitty integrated graphics sounds like a big problem.

I'm wondering if I would be able to play on super-low settings and still get somewhat playable FPS.

Any input out there?
Thanks!
 
Guest said:
I am wondering if anyone can help me out. I'm a huge fan of SC and wanted to pick up the new game, but I don't think it will run on my system. Anyone have any input? My specs are below:

Dell Studio 15 laptop
Intel Core 2 Duo T6600: 2.2GHz, 800MHz FSB, 2MB L2 cache
4GB shared dual-channel DDR2 at 800MHz
Intel Graphics Media Accelerator 4500MHD
320GB SATA hard drive (5400RPM)
Windows 7 64-bit

The shitty integrated graphics sounds like a big problem.

I'm wondering if I would be able to play on super-low settings and still get somewhat playable FPS.

Any input out there?
Thanks!

You may need to turn the resolution down a notch, but it will be playable.

At least I can get Battlefield 2 to run at 1280x800 on medium settings with the 4500MHD, so yeah, it should be OK on the lowest settings.
 
My Acer laptop with a 2.0GHz Core 2 and Intel 4500 graphics will play on low settings at about 20fps. Not that bad if you really need to play it.
 
Blizzard software ate my GPU

StarCraft II causes overheating problems
Blizzard has admitted that its latest StarCraft II game has a bug which can cause a GPU to go into meltdown.

The game's menu screens aren't framerate-limited, which means that if the computer has nothing else to do, the graphics hardware gets bored and renders the menu screens as fast as it can. This sounds harmless but actually causes overheating problems for the card.
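The mechanics are easy to picture: an uncapped loop redraws a trivial menu scene thousands of times a second, while a capped loop sleeps off the rest of each frame's budget so the hardware can idle. A little illustrative Python, showing just the concept rather than anything from Blizzard's code:

import time

def run_menu(render_frame, cap_fps=None):
    frame_budget = 1.0 / cap_fps if cap_fps else 0.0
    while True:
        start = time.perf_counter()
        render_frame()  # a trivial menu scene finishes almost instantly
        # Uncapped (cap_fps=None): the loop spins flat out and the GPU cooks.
        # Capped: sleep away the leftover frame budget, e.g. cap_fps=30
        # mimics the frameratecapglue=30 fix described below.
        leftover = frame_budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)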

The good news is that it is a doddle to fix. Open the "Documents\StarCraft II Beta\variables.txt" (or "Documents\StarCraft II\variables.txt") file and add the following two lines, and everything behaves much better:

frameratecapglue=30
frameratecap=60
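If you would rather script the edit than do it by hand, here is a minimal Python sketch; the path used is the second one above (the retail folder), so adjust it for the beta, and back the file up first:

from pathlib import Path

cfg = Path.home() / "Documents" / "StarCraft II" / "variables.txt"
lines = cfg.read_text().splitlines() if cfg.exists() else []
for setting in ("frameratecapglue=30", "frameratecap=60"):
    key = setting.split("=")[0] + "="
    if not any(line.startswith(key) for line in lines):
        lines.append(setting)  # only add a cap that isn't already set
cfg.write_text("\n".join(lines) + "\n")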

It is strange that while Blizzard has admitted there is a problem, it has not really notified people about it. Still, it is nice to have a fix, and if your GPU has not melted down to China you can probably give it a go.

Source: Fudzilla

 
I am running a Q9550 overclocked to 3.6GHz with a GTX 285, and I am getting the same or better FPS than your i5s and i7s. At 1920x1080 I am almost always at 60fps. Maybe the Q9550 is a better Socket 775 chip than the old Q6600; I know it has more cache.
 
Quote:
Blizzard software ate my GPU

StarCraft II causes overheating problems
Blizzard has admitted that its latest StarCraft II game has a bug which can cause a GPU to go into meltdown.

This looks to me like a blatant lie or a gross exaggeration. If your PC has adequate cooling, nothing short of changing the wattage draw of the video card could cause it to melt down. I searched Google about this and only found references to this blog, http://gameinformer.com/b/news/arch...t-ii-i-m-installing-it-so-hard-right-now.aspx , where the guy speculates that the cause of the PC crashes is CPU overheating.

Sure, the menu frames shouldn't tax your GPU's rendering capabilities. But any decent 3D game should, and if the card doesn't melt down during actual gameplay, why would excessive menu drawing be any different?
 
Quote:
Originally Posted by Jiraiya View Post
Blizzard software ate my GPU

StarCraft II causes overheating problems
Blizzard has admitted that its latest StarCraft II game has a bug which can cause a GPU to go into meltdown.

UH, NO. My setup can run FurMark in a hot summer room without a meltdown. I don't think some lame DX9 game menu running at 1700fps is going to get near FurMark temperatures. Obviously someone had a bad GPU or inadequate cooling.
 
Yes, of course the Q9550 is better than the Q6600; the clue is in the price: 1x for the Q6600, 2x for the Q9550.
I'll have to check out the beta/demo of this on my C2D E6750 with a 4670 and 2GB... I'm thinking low to medium settings.
I didn't really get on with the original game, however.
Just a thought: maybe the lowish hardware requirements are there so consoles can taste some of that sweet, sweet StarCraft pie, and then a really simplified version can be played over Wi-Fi on an iPhone/Android.
 
Nice GPU and CPU review Steve, though it leaves some question marks.

It would have been nice to see a stock and overclocked E8500, and a Q9450/Q9550 at stock and overclocked, to see how the cache plays out in terms of performance.
 
I don't get how RTS games are fun, or even count as a game. You sit there clicking minerals and gas, wait, build stuff, wait, get soldiers, wait, click units that attack and then stop for no reason so you have to re-box them and click again. Send all your soldiers off, watch them attack the enemy, repeat. It is more like a management JOB, a chore, not a fun game. I have never enjoyed an RTS, but I am a big fan of FPS games and even action-type RPGs like Gothic and Zelda. StarCraft is not Game of the Year; maybe battle simulator of the year. Bring on Gears of War 3 and Rage, those are going to be fun games.

I played a friend's StarCraft II for a while, and other than the cutscenes it is the same ten-year-old micromanagement chore, not a GAME. I suppose if you took an RPG and got rid of the part you actually have fun with, controlling your character and fighting the monsters, so that all that remained was the tedious part of picking armor, spells and weapons and assigning hit points, that would be FUN? I doubt it. Have fun, guys, in your click-a-million-things, scroll around a map, click a million more things, then watch ten-year-old top-down icons beat each other up. At least Diablo was FUN, because you controlled one character, yourself, and did not have to waste 99.99% of your playing time issuing clicks and orders! I guess it takes a special kind of masochist to like RTS games, because you certainly don't play them for fun.
 
I have a Q6600 running at 3.4GHz on air and a Radeon 4870. I am running the game at maxed-out Ultra settings without issue.

Same here... I've got a Q6600 @ 3.2GHz coupled with a 512MB 4870 and I get 40+ fps. Something tells me it's not the processor itself but perhaps the motherboard it was paired with. I've been running the game at 1080p maxed (v-sync enabled) without a single hiccup and no video issues at all.
 
I am just playing the campaign with a Q9550 at 3.6GHz, memory at 1066MHz, and a GTX 285 at 1920x1080 with 4x anti-aliasing, and I'm pegged at 60fps nearly all the time (50-80) without vsync. Obviously my bus and memory are supplying a lot more bandwidth than a normal Socket 775 quad, but I get the same framerates in this game, as well as in Crysis and every other game, as an i5 or i7. I just don't see more than a 2fps benefit from an i-series processor over mine, or even much of a difference between a GTX 460-480 and a GTX 285 in THIS game.
 
^

Well, you don't see a difference in fps from a 5850 up to a GTX 480 in this game at 2560x1600 because the game is still CPU-limited for some reason.

It also only supports two cores at the moment, so the caches, clocks and inherent architecture of the CPUs will have almost all the effect on performance once you run a card equal to or faster than a ~$280 5850.
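You can even test the two-core point yourself by pinning the game to two cores and watching whether the framerate moves. A sketch with the third-party psutil Python package; the "SC2" name match is a guess, and you may need to run it elevated:

import psutil

for proc in psutil.process_iter(["name"]):
    if "SC2" in (proc.info["name"] or ""):
        print("affinity before:", proc.cpu_affinity())
        proc.cpu_affinity([0, 1])  # restrict the game to the first two cores
        print("affinity after: ", proc.cpu_affinity())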


The most interesting point is how much of an effect the quite large L2 cache on the Intel Q9xxx series has on performance.

Having people say they run the game maxed at 60fps, even when the review says otherwise, really proves nothing, since the circumstances are not the same.

I'm running this game on:
Q9450 @ 3.8GHz
4870X2 (AI disabled, 0x AA, 0x AF from CCC), currently overclocked to 800/1000, using the 10.8 beta drivers
950MHz memory @ 4-4-4-12

1920x1200 @ DX9
Everything set to Ultra

And I get around 30-55fps, mostly in the 45s, during the campaign.

Not impressive at all if you ask me. I run World in Conflict maxed with 4x AA, 16x AF and DX10 at 60fps all the time, and it only dips down to the low 50s during nukes.

It would be awesome if Steve could include the Intel Q9xxx series in his CPU benchmark to see how far behind the i7s they are, and whether it would be worth an upgrade.
 
UPDATE: I just played the gem mission where you defend until you die, with about 200 units on screen at once, and my mighty machine above did hit 18fps. Still playable, but with a TON of units the FPS does eventually drop from the more constant 45-70fps, though very rarely.
 
Start playing with friends online in 3v3 and 4v4 matches and you will find that it's not very rare at all. We were not interested in campaign performance, as it is not a very demanding aspect of the game, and most people will buy StarCraft II predominantly to play multiplayer online.

Basically, everyone who has come into this thread claiming excellent performance using Ultra settings on old hardware failed to read how we tested the game. If we wanted to show a GeForce 9600 GT doing well, we would have tested a 1v1 match or the single-player campaign. Unfortunately, those results would have done the reader no good once they decided to really get into the game and play some fun 3v3 or 4v4 matches. Even 2v2 matches are considerably more demanding than what most of you guys are testing with.
 
Sorry, you blow. I got rid of my 1920x1200 monitor because I also used it to watch Blu-ray movies and to play Xbox 360 and PlayStation 3, and I got SICK OF THE STUPID BLACK BORDERS NEEDED TO MAKE IT THE STANDARD 1920x1080! Who wants black bands on the top and bottom of the screen all the time, *****? They made a standard for a reason. I don't miss those 120 vertical blacked-out lines at ALL!

Although this is completely off-topic and you have conducted yourself much as a 12-year-old would, I think the first guest has a point. In my opinion 16:10 is much better for use as a computer screen; I much prefer it for viewing websites, reading e-mail and writing documents. However, for watching Blu-ray movies 16:9 is arguably better, and that is why I watch movies on a TV and use a computer screen for my computer.

Having said that, I am aware many people prefer 16:10 for watching movies, as the subtitles are placed in the black bar at the bottom and not over the picture. Really, the black bars do not bother me, as you are not missing out on any of the picture. Hell, on my 1080p TV most movies still have black bars anyway because they are shot in the 2.39:1 cinema format.
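The arithmetic behind those bars is simple enough; for example, a 2.39:1 film letterboxed onto a 16:9 panel:

# 2.39:1 picture letterboxed onto a 1920x1080 panel.
width, height = 1920, 1080
film_height = round(width / 2.39)   # ~803 lines of actual picture
bar = (height - film_height) // 2   # ~138 black lines top and bottom
print(film_height, bar)             # prints: 803 138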

On a more related topic, if you play StarCraft II on a 16:9 screen you are missing out on some of the image and could be placed at a slight disadvantage to those using 16:10 screens.
 
Fair enough, but you could get A LOT more information on a 30-inch 2560-resolution screen or whatever. You would see a lot more of the map and units, but they would be tiny. On a 24-inch monitor it is tough to tell a Marine from an SCV. Playing it on my 46" Samsung 1080p LCD HDTV is awesome; you can easily tell every unit apart. Scrolling around the map is so annoying; it would be great if we had monitors with the resolution and size to show the whole map without scrolling, but that is a pipe dream. For now, 1080p fits the gaming bill best.
 