Crysis 1.1 patch performance with Multi-GPU testing

Julio Franco

Posts: 9,092   +2,043
Staff member
Let's assume for a second that everyone reading this article owns a copy of Crysis or has at least played the game. In that case, you already know how good the game can be and how impressive its visuals are with the proper hardware. Then again, since its release last November, only a minority of players have been able to play the game from beginning to end in all its visual glory.

Just last week Crytek released the long-awaited first patch for Crysis, which weighed in at 139MB and carried a number of rather large promises. Crytek claimed the patch would improve SLI/Crossfire performance as well as overall rendering performance for all graphics cards in DX9 and DX10, but never stated by how much, so they dodged a bullet there.

https://www.techspot.com/article/83-crysis-patch-performance-multigpu/

Please leave your feedback here. Thanks!
 
SLI on X38?

Your article states you used an X38 motherboard with a QX9650 CPU for testing. How did you use this setup to test the "GeForce 8800 GT (512MB) SLI" configuration? Last I heard, SLI wasn't possible on X38. Am I missing something?
 
That was an obvious mistake from our editorial team (myself included); the correction is now up. We used the ASUS Striker II Formula (NVIDIA nForce 780i SLI) for the SLI tests.

Thanks for the tip!
 
It's an interesting article, but I'm wondering why you left the GTX in SLI, or even the 2900 XT in Crossfire, out of the test. I assumed from the title that this was to be an article on how the 1.1 patch affected SLI/Crossfire performance. I have 3-way SLI, and while I didn't expect that to be covered in any benchmark, I was expecting to see at least GTX SLI numbers. Is 2x GTX so rare? I saw a pretty good increase in frames from the patch. More importantly, the game doesn't simply crash out at Very High settings anymore. I can run Very High with no AA at 1920x1200 at about 20fps on average with the patch. Without it, Very High wouldn't last more than 10 minutes before I would BSOD. Meh... I'm just blabbering now. I was just wondering why you left the higher-end cards in SLI/Crossfire configs out of the test.
 
I would say it may have been because of how much it would cost. I believe they didn't want to show off the kind of performance you get when you have enough money. ;)

Seriously, I do believe it's because it's pretty rare. Not many people could afford that, I would think.
 
It wasn't a matter of price points alone, but of how the newer mainstream cards scale up in the game using SLI/Crossfire. As you already know, these are the cards that offer the best value for consumers today (as single cards), and they are meant to stick around for a while.

The budget cards are not really a feasible option for high settings in Crysis, so those were left out. The higher-end cards like the GTX/Ultra have been around for a long time now; they are not value options and are about to be replaced, so we wouldn't recommend anyone upgrade to dual GTXs today, even though they are still the fastest option out there.
 
Hi-Res for real?

Hi:

I don't own Crysis and haven't seen it in action (I just got a single 8800GT/512MB card, but I need a motherboard BIOS update for the new E8400 before I can use them).

But I'm wondering aloud: do very many people really run 1440 x --- and 1920 x --- resolutions? My 19" PC monitor's native resolution is 1280x1024; that's where I run it, and it looks swell for the games I own. I guess the hi-res users have much bigger monitors than me, or HDTVs... but for my $$, why bother with hi-res if you sacrifice so much frame rate? Courteous replies only, please.
 
Hey there... I don't think I'm alone here as the owner of a larger monitor; however, the reason I use one is more of a workstation and productivity need than gaming.

If you live comfortably with your 19" monitor, that's just fine. I owned something like four 19" monitors between CRTs and eventually LCDs, but now I won't go back :)

Now, statistically speaking, a large portion of our readers (~55%) use your resolution or 1024x768, while most of the rest use higher resolutions, with considerable percentages also seen in the custom widescreen resolutions often found on laptop screens. So, yes, there are people out there using higher resolutions, although with Crysis it's a no-go for most.
 
Also, quite often when people are spending enough money to own 8800 GTs, 2-8 GB of RAM, and Core 2 Duos or Core 2 Quads, they are probably willing to spend $200-900 on a decent monitor. People playing Crysis will probably have similar setups, and even at $200 you can easily get a 20+ inch monitor. Once you play on an LCD for a while, you'll never want to move off that monitor's native resolution - so for 24"-27" you're talking 1920x1200.
 
Actually, Crysis runs fine at 1920x1200, High settings, 2xAA, DX9, at about 30-40fps. The problems arise in the DX10 version of the game with any AA at any resolution - at least for me, that's how it is. I own a HannsG 28" WS and 1920x1200 is the native res. I had quite a few 19" and 22" monitors, but getting anything more than a single GTX with one of those is a waste of money. The GTX will run everything at the smaller monitor's max res with everything maxed, no problems. I, like the guy before me, cannot go back to 19" or 20" monitors. That may be OK for some, but you haven't lived until you've played games on a large WS LCD. I have 3-way SLI and my god is it pretty.
 
Hi guys-
Thanks for the replies. Yes, I understand (and envy) how nice a 24"+ WS monitor would be - I'll probably get one later this year or next - and hence understand why the bigger screen would warrant higher-res output from the video card.

For several days after I got a 46" Sharp Aquos 1080p (1920x1080) 120Hz HDTV last summer, I played some GTR2 and Test Drive Unlimited (TDU supports HD) on it at high res (7950GT DVI -> HDMI), and wow, it was sweet. But the PC was in the way in the family room where the HDTV is, so it has been back in the computer room ever since.

Edit 6Feb08: corrected Aquos 1080i to 1080p
 
Depends on what you call fine, I guess. An average frame rate of 30fps is what I call poor performance; I cannot play a first-person shooter at an average of just 30-40fps. I run a 30" Dell LCD and there is no chance of playing Crysis at 2560x1600, even with triple Ultras ;)
 
Wow, that's what most of my games run at. I see no problem with 30. I would (of course) prefer more, but I don't see THAT much of a difference...
 
It's all a matter of what you become accustomed to - we had this discussion about what running fine actually means about 6 or 7 years ago here. I prefer my games to run close to or above 60fps all the time. I'm not a fan of the 30s and won't play in the 20s at all. It's the reason I don't play certain games - I don't have enough machine for them to be enjoyable to me.
 
LNCPapa said:
It's all a matter of what you become accustomed to - we had this discussion about what running fine actually means about 6 or 7 years ago here. I prefer my games to run close to or above 60fps all the time. I'm not a fan of the 30s and won't play in the 20s at all. It's the reason I don't play certain games - I don't have enough machine for them to be enjoyable to me.

I totally agree with what you've said there, LNCPapa. As for most games running at 2560x1600, mopar man, I have not really found this to be the case. Using a GeForce 8800 Ultra, I have found that games like Call of Duty 4, for example, cannot be played using the highest quality settings. Dropping the resolution to 1920x1200 does improve the situation quite a lot. You just have to look at the gaming performance articles we have done in the past to see that most graphics cards really struggle in the latest games at 1920x1200 at what we believe to be acceptable performance.
 