GeForce 306.02 WHQL drivers offer GTX 660 Ti support, new profiles

Matthew DeCarlo

Folks with the new GeForce GTX 660 Ti can now run the card with WHQL-certified software courtesy of a fresh driver released by Nvidia yesterday. GeForce 306.02 covers operating systems spanning from Windows XP through Windows 8 and carries various new and updated profiles as well as bug fixes.

The release notes don't mention any performance improvements if you're upgrading from another R304 family driver, such as July's 304.79 beta, but depending on your hardware and software configurations, you can expect a handful of improvements if you're coming from the 301.24 beta. For example:

GeForce GTX 680:

  • Up to 18% in Batman: Arkham City
  • Up to 15% in Dragon Age II
  • Up to 10% in S.T.A.L.K.E.R.: Call of Pripyat
  • Up to 60% in Total War: Shogun 2 (fixes performance issue with latest game patch)

GeForce GTX 560:

  • Up to 14% in Batman: Arkham City
  • Up to 5% in Battlefield 3 with SLI
  • Up to 4% in Dragon Age II
  • Up to 8% in The Witcher 2: Assassins of Kings with SLI
  • Up to 7% in Lost Planet 2

The latest release introduces or improves over a dozen SLI profiles for titles including Alan Wake's American Nightmare, Borderlands 2, Darksiders 2, F1 2012, Nexuiz, Orcs Must Die! 2, Sleeping Dogs, Spec Ops: The Line, Torchlight II and Ghost Recon: Future Soldier. Likewise, there are updated 3D Vision profiles for a slew of games such as Borderlands 2, Dishonored, Mass Effect 3, Max Payne 3, The Secret World and The Walking Dead. Other profile additions include antialiasing support for Diablo III, L.A. Noire and Rayman Origins as well as ambient occlusion support for Star Wars: The Old Republic.

Download GeForce 306.02 WHQL (release notes)
Desktop: Windows XP 32-bit | Windows XP 64-bit | Windows Vista/7/8 32-bit | Windows Vista/7/8 64-bit

Version 306.02 squashes various bugs, including four on Windows 8: one caused GeForce driver installations to fail when reinstalling the drivers, another resulted in bluescreens on Optimus-equipped notebooks when attempting to uninstall GeForce drivers, a third caused residual images to appear when playing Crysis 2 and Dirt Showdown in 3D with a GTX 500 series GPU, and the last resulted in bluescreens when enabling or disabling SLI or Surround modes on systems with Intel X58 and X79 chipsets. Several issues have been fixed for Vista and 7 too, so check the release notes if that's of interest.

A heads up for GeForce 6 and 7 series owners: this might be your last update. Nvidia says those product lines will be moved to legacy support after the GeForce R304 drivers, meaning they won't be covered in the next major driver family, R310. It's also worth noting that with Borderlands 2 launching on September 18, Nvidia is working with Gearbox on various promotions, including a free copy of the upcoming shooter if you purchase a new GTX 660 Ti or better. Last week, Nvidia released a video showing the graphical benefits GeForce owners will experience via PhysX when playing Borderlands 2.


 
So wait, did Borderlands just **** over everyone without a PhysX-capable system? What I mean is, did they just create an engine that won't do any kind of physics without Nvidia hardware? What I'm trying to figure out here is whether there are no physics at all if I run it on AMD hardware.
 
You have to understand that Borderlands 2 uses the Unreal Engine, which supports PhysX.

Built into the PhysX API is a level-of-detail setting for physics.

At the CPU level it's minimal (and maybe even purposely dumbed down); at the GPU level they go all out.

If Borderlands 2 didn't support GPU-level PhysX, it would look the same on AMD or Nvidia.

Nvidia is usually very involved with game developers to make sure the engine is fully optimized for Nvidia hardware. That way they have an advantage in benchmarks when reviews of new graphics cards are written (smart, in my opinion).
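
To make that concrete, here's a minimal, hypothetical C++ sketch (the helper names are made up, not actual PhysX SDK calls) of how an engine might gate its extra effects on whether GPU-accelerated PhysX is present:

    // Sketch only: illustrates the "CPU gets a minimal effect set, GPU gets the
    // full set" idea described above. gpuPhysXAvailable() is a hypothetical
    // stand-in for whatever hardware query a real engine would perform.
    #include <cstdio>

    enum class PhysicsDetail { Low, High };

    // Hypothetical: a real engine would query the PhysX runtime/driver here.
    bool gpuPhysXAvailable() { return false; }

    PhysicsDetail pickPhysicsDetail() {
        // CPU fallback keeps the baseline physics; GPU PhysX unlocks the extras
        // (cloth, volumetric smoke, large debris counts).
        return gpuPhysXAvailable() ? PhysicsDetail::High : PhysicsDetail::Low;
    }

    int main() {
        if (pickPhysicsDetail() == PhysicsDetail::High) {
            std::puts("GPU PhysX: enable cloth, smoke and high particle counts");
        } else {
            std::puts("CPU path: baseline physics only, same look on AMD or Nvidia");
        }
        return 0;
    }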
 
Spot on. UE3 has PhysX integration.
Game Dev 101: pay for the development, reap the reward. It's not much different from AMD working with Codemasters to tailor games like DiRT Showdown specifically for AMD's GCN architecture. Both represent eye candy above the standard game image quality level (physics and global illumination, respectively).

I might add that not all physics engines are created equal. PhysX does cloth and volumetric smoke particularly well. It also does n-body simulation and path prediction in a gravitational field better than most (if not all), although I don't think the latter is applicable to Borderlands 2.

PhysX isn't an isolated case even among physics engines. Intel's Havok engine is being optimized for the AVX extension; older CPUs need not apply.
 

In a way, Nvidia investing more money in optimization is a perk for consumers, despite how some people feel about their business practices or whatnot.

It's not much different than how people feel about Intel.
 