TechSpot

Optimus lets you yank a GPU out of a live system (video)

By Matthew
Mar 3, 2010
  1. Nvidia may be feeling some heat in the graphics segment, and despite being battered by AMD on many fronts, the company has revealed at least one interesting technology this quarter: Optimus. Nvidia has posted a video on the nTersect Blog, showing Optimus' ability to immediately power on and off a discrete GPU as needed -- and when it's off, it's really off.
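
    A toy way to picture the blurb's key claim that "when it's off, it's really off" - a made-up sketch, not Nvidia's driver, and the wattage figures are illustrative placeholders rather than measurements:

        class DiscreteGpu:
            POWER_OFF_W = 0.0   # "really off": the chip is powered down entirely
            POWER_IDLE_W = 6.0  # made-up figure for a merely idle GPU

            def __init__(self):
                self.powered = False

            def draw_watts(self):
                # The point of Optimus: an unused GPU costs 0 W, not idle watts.
                return self.POWER_IDLE_W if self.powered else self.POWER_OFF_W

        gpu = DiscreteGpu()
        print(gpu.draw_watts())  # 0.0 - no 3D work, GPU fully off
        gpu.powered = True       # a 3D app starts; the driver powers it up
        print(gpu.draw_watts())  # 6.0
        gpu.powered = False      # the app exits; the GPU is cut off again
        print(gpu.draw_watts())  # 0.0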

    Read the whole story
     
  2. Tekkaraiden

    Tekkaraiden TS Evangelist Posts: 991   +90

    That is very impressive.
     
  3. Timonius

    Timonius TS Evangelist Posts: 640   +56

    Yup, while AMD is building better cards at the moment, Nvidia is actually being innovative (between this and the Ion 2 allowing for more friendly mobile gaming)!
     
  4. slh28

    slh28 TechSpot Paladin Posts: 1,706   +170

    I can just imagine CCC going crazy and crashing and BSODing if I did this to my computer.
     
  5. @slh28

    What did you expect... it wasn't designed to do that, so ATI stuff will behave the same as any other Nvidia stuff that wasn't designed with this feature.
     
  6. alexandrionel

    alexandrionel TS Enthusiast Posts: 93

    That is nice. If you think about it, this kind of makes it possible to upgrade the video card in your laptop.
     
  7. Vrmithrax

    Vrmithrax TechSpot Paladin Posts: 1,290   +239

    I wish it were that easy, alexandrionel... Laptop video cards are too varied; even two consecutive models from the same mobile graphics manufacturer often have different connections and configurations. A few laptop makers tried to build gaming units with modular, upgradable GPU configurations, and gave up in frustration.

    However, this Optimus tech is perfect for laptop configurations with both integrated and discrete GPUs installed, particularly since it totally shuts down the discrete unit when it's not being used. I hadn't realized it was that thorough in its handling; very nice possibilities for extending battery life even further.
     
  8. natefalk

    natefalk TS Rookie Posts: 78

    This is cool... makes me wonder why we can't do this with more components. My money still goes to ATI, though. Nvidia lost my business because every Nvidia card I've owned ended up dying within a year of purchase. Maybe I have bad luck... I've owned four ATI cards and all still work. I've owned three Nvidia cards and all are dead.
     
  9. compdata

    compdata TechSpot Paladin Posts: 526

    Definitely looks great in terms of extending battery life when the GPU isn't in use. However, I will be interested to see how much of a difference this makes in practice, as everything is becoming GPU-enhanced these days (your OS, web browser, Flash, etc.). How often are you not using your GPU at all?
     
  10. TorturedChaos

    TorturedChaos TechSpot Chancellor Posts: 831   +18

    Love the idea of extending battery life. That's always been my issue with gaming laptops: they suck juice down so fast you pretty much have to keep them plugged in, and at that point you're one step away from a desktop. With this it might be possible to get decent battery life out of a gaming laptop, at least when you don't need the graphics cranked up.
     
  11. Kovach

    Kovach TS Rookie Posts: 44

    Really amazing. This is going to lower energy consumption on laptops, and help fight global warming.
     
  12. Wagan8r

    Wagan8r TS Evangelist Posts: 595   +52

    That is way cool. I was a bit skeptical of Optimus. It just sounded like a gimmick, but this is actually most impressive.

    Global warming? Seriously? Oh, man, that was a good one!
     
  13. aaamir2u

    aaamir2u TS Rookie

    NVIDIA has obviously been hoping that Optimus technology will be the answer to the critics and investors who ask how the company plans to survive in a world of CPU and GPU integration. For as long as Intel's GPU technology lags behind and NVIDIA continues to innovate, it will have a path to product viability and profitability that AMD will likely miss out on.
     
  14. kohtachi

    kohtachi TS Rookie

    wow, cool stuff. So this means more battery life and less heat?
     
  15. captaincranky

    captaincranky TechSpot Addict Posts: 11,687   +1,877

    And I would want to hot swap a video card why....?
     
  16. rajmond

    rajmond TS Rookie Posts: 45

    This is great... It would be even better if this could also happen with other hardware parts (e.g. RAM, hard drives, etc.).
     
  17. captaincranky

    captaincranky TechSpot Addict Posts: 11,687   +1,877

    Where Should I Stick the Power Cord.... Decisions..., Decisions....

    Oh yeah, we should be able to install the PSU into a case, then just build a system with it powered on. At least that's how I understand they already do it in special ed.
     
  18. Chazz

    Chazz TS Evangelist Posts: 671   +73

    @aaamir2u

    How would AMD be missing out? They are both a CPU and a GPU company; by your own statement, the future is theirs to lose. Intel does not have competent GPU knowledge, and Nvidia is not a CPU company.
     
  19. jjbeard926

    jjbeard926 TS Rookie Posts: 69

    OK, that's pretty cool. But so what? It seems like a cool gimmick to me. I doubt any of us will ever use that feature. Honestly, I'd rather see Nvidia work on putting out some graphics cards that 1) work reliably (my last two were flaky) and 2) can keep up with ATI cards.
     
  20. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,219   +157

    remedial electronics.....

    :haha::haha:, oh I wish you could see my cartoon bubble right now.
     
  21. UT66

    UT66 TS Rookie Posts: 143

    Cool, now if only they could send me a functioning 8600 GT to replace my dead one.
     
  22. Yoda8232

    Yoda8232 TS Rookie Posts: 145

    Would be helpful mostly with multi-GPU setups, and with laptops too for battery life!
    I'd prefer a physical switch to control it, though.
     
  23. j4m32

    j4m32 TS Rookie Posts: 49

    I presume that it also has an onboard chipset graphics processor for general (non-3D) use; otherwise it really would be stuffed. On most other systems the one graphics card (or two combined in Crossfire/SLI) is the primary unit, static and constantly in use anyway, so this wouldn't make sense for a desktop in the same way - unless you have a power-hungry GPU plus an onboard chipset GPU that sits unused but would do the job just as well.

    What they have done (as far as I can see) is put a loop in the drivers and software that automatically turns the card on when an application using 3D rendering initialises and off again when it closes, switching over to the chipset graphics on the same physical output to give that "seamless" effect between the two.

    Still a good idea for trying to get the best performance out of a limited amount of power (battery wise) for a mobile device.
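
    Something like this, I'd imagine - pure speculation on my part, rough Python pseudocode, and every name in it is invented rather than taken from Nvidia's actual driver:

        import time

        def running_3d_apps():
            # Stub - a real driver would ask the OS which processes have
            # created a 3D rendering context.
            return []

        def optimus_loop(discrete_gpu, chipset_gpu, display):
            while True:
                busy = bool(running_3d_apps())
                if busy and not discrete_gpu.powered:
                    discrete_gpu.power_on()
                    # The dGPU does the rendering, but its frames are routed
                    # through the chipset's display output, so the same
                    # physical connector stays live - the "seamless" part.
                    display.source = discrete_gpu
                elif not busy and discrete_gpu.powered:
                    display.source = chipset_gpu  # hand back to chipset graphics
                    discrete_gpu.power_off()      # fully off, not merely idle
                time.sleep(0.5)                   # re-check a couple of times a second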
     
  24. Zenphic

    Zenphic TS Rookie Posts: 43

    AMD should have done something similar with their hybrid crossfire (IGP + discrete)
     
  25. jakeshjo1953

    jakeshjo1953 TS Member Posts: 26

    True to your name, aren't you?
     
Topic Status:
Not open for further replies.
