Optimus lets you yank a GPU out of a live system (video)

Status
Not open for further replies.

Matthew DeCarlo

Staff

Nvidia may be feeling some heat in the graphics segment, and despite being battered by AMD on many fronts, the company has revealed at least one interesting technology this quarter: Optimus. Nvidia has posted a video on the nTersect Blog, showing Optimus' ability to immediately power on and off a discrete GPU as needed -- and when it's off, it's really off.

The PCIe bus, the frame buffer memory, the GPU -- it's all shut down rather than switched to a low-power state. Nvidia says this translates into major power savings, extending battery life by as much as double. What better way to demonstrate the technology than by yanking a GPU out of a live system, then placing it back in as if nothing happened?

The ability to switch between integrated and discrete graphics chips has been around for a while, but it has required user input: users have had to make the swap manually, and the change isn't always instant. Optimus, on the other hand, is completely transparent, allowing systems to switch between graphics solutions dynamically without any user involvement.


 
Yup, while AMD is building better cards at the moment, Nvidia is actually being innovative (between this and the Ion 2 allowing for more friendly mobile gaming)!
 
I can just imagine CCC going crazy and crashing and BSODing if I did this to my computer.
 
@slh28

What did you expect? It wasn't designed to do that, so ATI hardware will behave the same as any other Nvidia hardware that wasn't designed with this feature.
 
I wish it were that easy, alexandrionel... Laptop video cards are too varied; even two consecutive models from the same mobile graphics card manufacturer often have different connections and configurations. A few laptop makers tried to make gaming units with modular, upgradable GPU configurations, and gave up frustrated.

However, this Optimus tech is perfect for laptop configurations with integrated and discrete GPUs both installed, particularly since it totally shuts down the discrete unit when not being used. I hadn't realized it was that thorough in the handling, very nice possibilities for extending battery life even further.
 
This is cool... makes me wonder why we can't do this with more components. My money still goes to ATI though. Nvidia lost my business because every Nvidia card I've owned ended up dying within a year of purchase. Maybe I have bad luck... I've owned 4 ATI cards and all still work. I've owned 3 Nvidia and all are dead.
 
Definitely looks great in terms of extending battery life when the GPU isn't in use. However, I will be interested to see how much of a difference this makes in practice, as everything is becoming GPU-accelerated these days (your OS, web browser, Flash, etc.). How often aren't you using your GPU?
 
Love the idea of extending battery life. That's always been my issue with gaming laptops: they suck juice down so fast you pretty much have to keep them plugged in, and at that point you're one step away from a desktop. With this it might be possible to get decent battery life out of a gaming laptop, at least when you don't need the graphics cranked up.
 
Really amazing. This is going to lower the energy consumption of laptops, and help with global warming.
 
That is way cool. I was a bit skeptical of Optimus. It just sounded like a gimmick, but this is actually most impressive.

Kovach said:
Really amazing. This is going to lower the energy consumption of laptops, and help with global warming.
Global warming? Seriously? Oh, man, that was a good one!
 
NVIDIA has obviously been hoping that Optimus technology will be the answer to critics and investors who ask how the company plans to survive in a world of CPU and GPU integration. For as long as Intel's GPU technology lags behind and NVIDIA continues to innovate, it will have a path to product viability and profitability that AMD will likely be missing out on.
 
This is great... It would be even better if this could also happen with other hardware parts (e.g. RAM, hard drive, etc.)
 
Where should I stick the power cord... Decisions, decisions...

This is great... It would be even better if this could also happen with other hardware parts (e.g. RAM, hard drive, etc.)
Oh yeah, we should be able to install the PSU into a case, then just build a system with it powered on. At least that's how I understand they already do it in special ed.
 
@aaamir2u

How would AMD be missing out? They are a CPU and a GPU company; by your statement, the future is theirs to lose. Intel does not have competent GPU knowledge, and Nvidia is not a CPU company.
 
Ok, that's pretty cool. But so what? It seems like a cool gimmick to me. I doubt any of us will ever use that feature. Honestly, I'd rather see Nvidia work on putting out some graphics cards that 1) work reliably (my last two were flaky) and 2) can keep up with ATI cards.
 
remedial electronics.....

Oh yeah, we should be able to install the PSU into a case, then just build a system with it powered on. At least that's how I understand they already do it in special ed.

:haha::haha:, oh I wish you could see my cartoon bubble right now.
 
Would be helpful with multi-GPU setups mostly, laptops too for battery power!
I'd prefer a physical switch though to control it.
 
I presume that it also has an onboard chipset graphics processor for general (non-3D) use; otherwise it really would be stuffed. On most other systems, the single graphics card (or two combined in Crossfire/SLI) is the primary unit, static and constantly in use anyway, so this wouldn't make sense for a desktop in the same way, unless you have a power-hungry GPU alongside an unused onboard chipset GPU that would do the job just as well.

What they have done (as far as I can see) is put a loop in the drivers and software that automatically turns the discrete GPU on when an application using 3D rendering initialises and turns it off when that application closes, switching over to the chipset graphics on the same physical output to give that "seamless" effect between the two.

Still a good idea for getting the best performance out of a limited amount of (battery) power on a mobile device.
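The switching loop speculated about above could be sketched roughly like this. This is a toy model under my own assumptions, not Nvidia's actual driver code; every class and method name here is invented for illustration:

```python
# Hypothetical sketch of Optimus-style automatic GPU switching: route
# rendering to the discrete GPU only while a 3D application is running,
# and fully power it off otherwise. Purely illustrative names.

class DiscreteGpu:
    """Models a discrete GPU that is fully powered off when idle."""
    def __init__(self):
        self.powered = False

    def power_on(self):
        # PCIe link, frame buffer, and core all come up
        self.powered = True

    def power_off(self):
        # Everything shut down, not just a low-power state
        self.powered = False


class OptimusRouter:
    """Decides per frame whether to render on integrated or discrete."""
    def __init__(self):
        self.gpu = DiscreteGpu()

    def render(self, app_uses_3d: bool) -> str:
        if app_uses_3d and not self.gpu.powered:
            self.gpu.power_on()
        elif not app_uses_3d and self.gpu.powered:
            self.gpu.power_off()
        return "discrete" if self.gpu.powered else "integrated"


router = OptimusRouter()
print(router.render(False))  # desktop work: rendered on integrated
print(router.render(True))   # 3D app launches: discrete powers on
print(router.render(False))  # 3D app closes: discrete fully shuts down
```

Since both paths drive the same physical output in the real system, the user never sees which chip rendered the frame; only the power draw changes.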
 