Catalyst 13.2 beta brings 15% boost in Crysis 3, latency improvements

Matthew DeCarlo

Staff

On the heels of Nvidia's performance-oriented GeForce update, AMD has released its own beta driver bringing a handful of frame rate and latency enhancements. Among the speed improvements is a 15% gain when using high MSAA settings in Crysis 3, which is running a public multiplayer beta through February 12, allowing players to explore two maps ("Museum" and "Airport") as well as two game modes ("Crash Site" and "Hunter").

With Crysis 3 expected to launch in North America on February 19, AMD has promised that additional performance improvements will be delivered in future updates to the Catalyst 13.2 beta. For now, other title-specific performance improvements include up to a 50% boost when playing the recently released DmC Devil May Cry with a single graphics card and up to 10% better frame rates when running Crysis 2 with a CrossFire setup.

The latest Catalyst build also partially addresses a frame latency issue found by The Tech Report in a GTX 660 Ti versus HD 7950 review last December. Although AMD's GPU delivered decent average frame rates, it produced regular spikes in frame rendering latency, yielding choppier gameplay than the averages would suggest. The site published a follow-up last week showing the improvements made in the beta drivers when playing Skyrim, Borderlands 2 and Guild Wars 2. Grab your copy of the new Catalyst or GeForce release below:

Catalyst 13.2 beta 3 (release notes)
Desktop/mobile: Windows Vista/7/8 32/64-bit | Linux (Red Hat/SUSE/OpenSUSE/Ubuntu) 32/64-bit

GeForce 313.95 beta (release notes)
Desktop: Windows XP 32-bit | Windows XP 64-bit | Windows Vista/7/8 32-bit | Windows Vista/7/8 64-bit
Mobile: Windows Vista/7/8 32-bit | Windows Vista/7/8 64-bit
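
As a quick aside on the frame latency issue mentioned above: average frame rates can look healthy even when individual frames take far too long, which is why The Tech Report examines frame times rather than FPS alone. Below is a minimal illustrative sketch in Python, using made-up frame-time numbers (not actual benchmark data or Tech Report's exact methodology), showing how a percentile view exposes stutter that the average hides:

# Hypothetical frame-time log in milliseconds (illustration only, not measured data)
frame_times_ms = [16.7] * 95 + [60.0] * 5   # mostly smooth frames plus a few long spikes

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print("Average FPS: %.1f" % (1000.0 / avg_ms))           # ~53 fps, looks respectable

# 99th-percentile frame time: 99% of frames finish faster than this
p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
print("99th percentile frame time: %.1f ms" % p99_ms)    # 60 ms, i.e. noticeable stutter

An FPS-only chart would bury those 60 ms frames in the average, and that is the sort of hitching the latency work in this beta is meant to reduce.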


 
My experience has been that the Nvidia driver is worse. I switched from Nvidia to AMD when building an HTPC, as the Nvidia driver at the time was horrendous for anything HTPC-related. Things like lossless output, color space representation, and proper S3 resume were all nonexistent or broken on the Nvidia side. AMD wasn't perfect, but at least for an HTPC setup it had some kind of workable lossless support and correct color space output.
 
"The battle of GPUs is over."

Not if NV expects to launch the $899 GK110 Titan card in the next two months, delivering near-GTX 690 levels of performance. Also, the GPU battle will heat up again once we shift to 20nm with Maxwell/Volcanic Islands and more next-generation DX11 titles launch. Crysis 3 should work better on NV cards since NV has been working closely with Crytek on this title for a long time now. That's not to say the game will be playable with everything maxed out on a GTX 680 either. Preliminary benchmarks show that it's very punishing on existing single-GPU hardware, regardless of brand:

http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis 3 beta/crysis_3-beta vhq 1920.jpg
http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis 3 beta/crysis_3-beta vhq 2560.jpg
 
"The battle of GPUs is over."

Not if NV expects to launch the $899 GK110 Titan card in the next 2 months delivering near GTX690 level of performance. Also, the GPU battle will heat up again once we shift to 20nm with Maxwell/Volcanic Islands and more next generation DX11 titles launch. Crysis 3 should work better on NV cards since NV has been working closely with Crytek on this title for a long time now. That's not to say the game will be playable with everything maxed out on a GTX680 either. Preliminary benchmarks show that it's very punishing on existing single GPU hardware, regardless of the brand:

http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis 3 beta/crysis_3-beta vhq 1920.jpg
http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis 3 beta/crysis_3-beta vhq 2560.jpg

Another blatant lie from a typical nVidia fanboy. This comes from the developers' log on the Crysis Facebook page.

http://www.crysis.com/us/crysis-3/beta

We're seeing 1.3-1.6x in Crysis 3 in our labs on the 7970 GHz Edition @ 1080p and 1600p. I don't know about the other cards. This is a beta driver for a beta game, and performance is most certainly not final. Additional optimizations are in the works, and they may or may not arrive within the timespan of the beta.

Ultimately, the build that Crytek releases as the final game is our most important optimization target. We're using the telemetry we're receiving from the beta to inform the decisions we will make for that.

Everyone must also keep in mind that Crytek decided to join AMD Gaming Evolved relatively late in the Crysis 3 development process. We've had developers on-site with Crytek around the clock since that point, and we're working together to squeeze every last drop out of Radeon architectures on this title. For example, GCN is one of the reasons Crytek made the switch. It's important to us and them that everything is hunky dory but, compared to other titles at this point in their development cycle, we've all had less time to do that work. So we're doing a little catchup this week and next, applying that spit polish, and making damn sure everything is smooth for launch.

Patience with double betas!
 
I didn't see any "lies" in his statement - at worst he omitted the fact that Crytek *finally* decided to pay some attention to AMD owners. Your own re-post clearly shows that Nvidia was involved with the game well before AMD. Not that "dual betas" will make much difference... we all know that Nvidia gives kickbacks to devs so they'll optimize their code for Nvidia architecture. There's a reason that damn near every game shows that stupid Nvidia movie clip at startup. I have *never* seen a "plays best on AMD" movie from any program that AMD didn't write itself.
 
Well, I had AMD before, for about six months, a year and a half or almost two years ago. I can't quite remember which card it was, which says a lot in and of itself. With my 660 Ti, I'm enjoying my ride on the NV bandwagon. Drivers are a lot more frequent.

That's not to say quality is bad on the AMD front. I just think faster, more game-specific drivers are better than slower, broader ones, especially at the rate new games are released.
 