Nvidia automates switchable mobile graphics with Optimus

February 9, 2010, 10:36 AM
After teasing the technology back in January, Nvidia has finally thrown back the curtain on Optimus and is promising notebook users the full performance benefits of a discrete GPU with the battery life of an integrated graphics solution. In a nutshell, it allows notebooks to dynamically switch between graphics systems without any user interaction.


Notebook manufacturers have been combining discrete and integrated graphics for a while now, but thus far choosing which one should handle graphics at any given time has remained a manual process. Sometimes this involved a software switch and a lot of screen flickering as the discrete graphics chip turned on or off; sometimes it meant an actual physical switch on the laptop and a system restart. Either way, it always required some form of user interaction and a conscious choice between notebook performance and battery life.

This process is simply too cumbersome and confusing for mainstream users, some of whom may not even know they have switchable graphics and will leave the discrete GPU permanently off or on. Optimus, on the other hand, is automatic. It determines the best processor for the workload and routes it accordingly, with the decision being entirely transparent to users as they fire up a game or start writing an email.


This seems like such an obvious thing, one might wonder why we weren't already doing it that way. But under the hood it's much more complicated. Nvidia claims it was an issue with both the integrated and discrete graphics sharing a multiplexer (or mux) connection to the monitor, which made it impossible to switch on the fly. Optimus takes a different approach by instead treating the GPU like a co-processor and routing its output through the IGP so there's only a single point of connection to the display.
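Conceptually, the decision described above can be pictured as a simple dispatcher that watches what kind of API calls an application makes and picks a processor, while the display stays permanently wired to the IGP. The sketch below is purely illustrative (the function and trigger list are hypothetical, not Nvidia's actual driver logic); the trigger APIs come from the detection behavior discussed in the comments.

```python
# Hypothetical sketch of Optimus-style automatic routing (NOT Nvidia's
# actual driver code). The display is always driven by the IGP; when the
# discrete GPU does the work, its frames are copied into the IGP's
# framebuffer, so no mux or display switch is ever needed.

# APIs assumed to signal a GPU-intensive workload (illustrative set)
GPU_TRIGGER_APIS = {"DirectX", "DXVA", "CUDA"}

def route_workload(api_calls):
    """Pick the processor for a workload based on the APIs it uses.

    Returns 'discrete' if any call suggests heavy graphics or compute,
    otherwise 'integrated' to save power.
    """
    if GPU_TRIGGER_APIS & set(api_calls):
        # Render on the discrete GPU, then copy the result to the IGP
        return "discrete"
    # Light work (email, browsing) stays on the integrated graphics
    return "integrated"

print(route_workload(["GDI"]))               # writing an email
print(route_workload(["DirectX", "CUDA"]))   # launching a 3D game
```

The key point the sketch captures is that the choice is made per workload by the driver, not per session by the user, which is why no flicker or restart is involved.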

The technology is designed to run on Nvidia's next-gen Ion and GeForce M products as well as the upcoming GeForce 200M and 300M GPUs. What's more, it will work in conjunction with Intel's Arrandale Core i3, Core i5 and Core i7 processors, as well as with its Penryn Core 2 Duo and Pine Trail Atom N4xx chips.

Nvidia says that several manufacturers are putting Optimus into their products. The first such laptops will hit at the end of this month courtesy of Asus, and will include the UL50Vf, N61Jv, N71Jv, N82Jv, and U30Jc.




User Comments: 8

compdata, TechSpot Paladin, said:

Definitely will be nice to see, although I'm afraid that with IE, Windows, Flash, and many other programs relying on GPU acceleration these days, this will only really come into effect when you are not doing anything on your computer :-p Guess time will tell though.

Vrmithrax, TechSpot Paladin, said:

@compdata - you know, I was thinking the same thing. Just wondering how they calculate the break-off point between integrated and discrete GPUs. I mean, integrated will do many flash type games just fine, so the discrete would be overkill. But how does this system know that the draw is going to be heavy and make the switch? Or does it switch dynamically in-game? And, if so, how many games are going to buckle under this flip-flopping of GPU properties?

Hmmm, now I'm all curious and stuff! heh

dividebyzero, trainee n00b, said:

I believe the Optimus routing switch (?) looks for and detects CUDA, DXVA and DirectX calls (i.e. the graphics-intensive operations), something like a semi-automated version of the nVidia control panel's 3D management settings.

Vrmithrax, TechSpot Paladin, said:

dividebyzero said:

I believe the Optimus routing switch (?) looks for and detects CUDA, DXVA and DirectX calls (i.e. the graphics-intensive operations), something like a semi-automated version of the nVidia control panel's 3D management settings.

Ah, that would make sense. Digging further, it almost appears that the Optimus system "fibs" by presenting itself with the credentials of the full-on discrete GPU, then it internally decides which of the processors will do the work. So, to the outside applications, nothing ever appears to change. If I'm understanding that correctly, that's a pretty sneaky (and ingenious) way to handle things!

dividebyzero, trainee n00b, said:

Seems strange that two companies preparing to go to war in court have managed to put together a technology that seems to have such a good future.

Just as well AMD has Llano in the pipeline, but that's one hell of a lead time for Intel/nV.

Kibaruk, TechSpot Paladin, said:

I'm just not convinced here. It's like the optimum way to allocate memory pagefiles that they teach you in school... if you do it on paper it IS the OPTIMUM way to go, but in reality it can't be applied because you never know at first what will come 5 minutes later.

So you say no screen flickering?

dividebyzero, trainee n00b, said:

Kibaruk said:

I'm just not convinced here. It's like the optimum way to allocate memory pagefiles that they teach you in school... if you do it on paper it IS the OPTIMUM way to go, but in reality it can't be applied because you never know at first what will come 5 minutes later.

So you say no screen flickering?

Would Anandtech lie?

[link]

alexandrionel said:

So, if I understood correctly, Tegra and Tegra2 are for smartphones and tablet PCs, and the nVidia ION and Optimus are for laptops and notebooks? Can't a notebook integrate Tegra2? They showed Tegra1 running HD movies while consuming less power than a notebook.
