Nvidia overhauls mobile graphics line with GeForce 800M GPUs featuring Battery Boost

By Shawn Knight · 23 replies
Mar 12, 2014
  1. Nvidia on Wednesday launched its new GeForce 800M line of notebook GPUs, a complete overhaul of its mobile graphics line that promises to deliver more processing power while consuming less power. Key to the latter bit of that promise is...

    cliffordcooley and misor like this.
  2. veLa

    veLa TS Evangelist Posts: 782   +235

    Primarily rebranded cards, nothing to see here folks.
  3. Primarily rebranded Maxwells?

  4. Jad Chaar

    Jad Chaar Elite Techno Geek Posts: 6,515   +974

    The performance improvements are quite significant for a "rebrand".
  5. Nvidia has very few laptop GPUs that actually keep working fine after a while. Most of them overheat very fast, even if the laptop is brand new, and once that happens the card is never the same. I would get ATI any time; although I'm an Nvidia fan on desktop, their reliability in laptops is that bad. The majority of laptops that get GPU problems, overheating and chips coming unstuck from the motherboard, are Nvidia. You can reball them, but they won't last; they're compromised.
  6. misor

    misor TS Evangelist Posts: 1,285   +243

    If this is done through a driver, can it be inferred that it's also present in the previous Nvidia 700M series?
  7. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Nope. The Battery Boost feature is in part hardware based - revised silicon, whether Kepler or Maxwell.
    It also seems to be linked to both the laptop's firmware/BIOS and the GeForce Experience software. I think the common understanding is that all three are required to optimize both GPU and CPU workload.
    cliffordcooley and misor like this.
  8. Skidmarksdeluxe

    Skidmarksdeluxe TS Evangelist Posts: 8,647   +3,274

    Probably only worth taking note of when it's in its 2nd or 3rd iteration, much like GPU Boost.
  9. misor

    misor TS Evangelist Posts: 1,285   +243

    thanks for the info. :)
    I thought Nvidia had just found a way to (further) bury AMD's lackluster (so far) Mantle technology.
  10. amstech

    amstech IT Overlord Posts: 1,936   +1,101

    Have any examples?
    We've tossed many laptops with dedicated Radeon graphics because they burnt up, I've never tossed one with Nvidia graphics. I'm not just saying that to say it or stick up for gang green either.

    I have a GT 550M (40nm/1GB VRAM/128-bit/144 CUDA cores) in my L702X (over 2 years old now, but that i5-2430M still rips) and I have the core overclocked 100MHz, so it puts up GT 555M-type performance. I've never had it get anywhere near overheating, and even though my laptop is only 1600x900, it plays very well at this rez: games like Borderlands, Left 4 Dead 2, Warcraft and Dragon Age: Origins I can max out or nearly max out @ 60 FPS (have screens).
    Last edited: Mar 13, 2014
    cliffordcooley likes this.
  11. Matt12345170

    Matt12345170 TS Booster Posts: 106   +31

    If anything, my AMD 7970M runs hot. You may not believe it, but it hasn't damaged anything yet, even though it likes to run at about 100-103°C (the highest I've seen Speccy or SpeedFan pick it up at). I've had it two years so far.
  12. Bannhammer

    Bannhammer TS Rookie Posts: 21

    NVIDIA has been really disappointing recently; Surround is broken as hell despite being around for years. They're losing touch with consumers. I think it's time to jump ship.
    GhostRyder likes this.
  13. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    If you really want to compare either side's GPUs this go-round in the mobile division, they are almost all rebranded cards, even on the high end. The 880M is just a 780M with a core clock bump, and the M290X is just an 8970M with a core clock bump.

    Either way, laptops are not getting much love, which is starting to show, because the M290X is essentially just a reference HD 7870 (R9 270) and the 880M is a GTX 770 (680) with a low core clock.

    Well, guess I'll be swapping out my 675M for either an M290X or the 880M, depending on the pricing for the individual MXM cards. I doubt this Battery Boost option is going to make enough of a difference to matter on the high end; I would rather it just default to the Intel graphics when not gaming and run at max frequency when gaming while plugged in. Batteries on mobile gaming laptops don't last anyway, so while I'm all for improved power consumption, this is not going to help much.
    Last edited: Mar 13, 2014
    misor likes this.
  14. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,738   +3,706

    Argh, requiring GeForce Experience to work? I hope they plan on migrating the feature over to the drivers.

    Looks to me like they're trying to make it mandatory for Nvidia owners to install GeForce Experience. Don't get me wrong, I currently have it installed for driver notifications as well as game optimizations. Battery Boost, though, is to me a feature that should be included within the driver package, for those who don't want GeForce Experience installed.
    misor likes this.
  15. Burty117

    Burty117 TechSpot Chancellor Posts: 3,148   +915

    Nvidia, please stop teasing us with lower-end Maxwells and get on with a new beast :)

    I'm joking, take your time. I don't think I'll need to upgrade for a while anyway; I'm waiting until Nvidia releases a DX12-compatible card before I buy.

    This was kind of expected: rebrands, but with newer silicon to support battery saving and ShadowPlay. Nothing new or unexpected, but nothing to be alarmed about either. Once Maxwell is in full swing, that's when things will get interesting again.
  16. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    It's the higher end of the model lineups that are rebrands. The low/medium end of both AMD and Nvidia are new silicon. The HD 85x0M/86x0M/87x0M are all based on the new-ish Oland GPU, and the 830M/840M (GM108) and 850M/860M 640-core (GM107) are Maxwell based.
    The M290X has exactly the same clocks as the 8970M. It's the 8970M that is clock bumped, since it is a 7970M with a 50MHz boost state added.
    That has been the case for some considerable time. OEMs have always looked to established silicon when allying the GPU with higher power demand/heat output. OEMs value stability and fewer warranty issues to a higher degree in prebuilts, since the unit replacement/repair cost is appreciably higher than for an individual component.
    You won't be the only one.
    It's usually the Nvidia way to try to package the software with the hardware, so I wouldn't hold my breath waiting for Nvidia to make it less intrusive/smaller footprint. There probably wouldn't be much impediment to including Battery Boost in the driver plus the requisite BIOS/firmware addition... except that Nvidia wants to package more software, which comes down to a visibility/marketing point rather than any real consumer need (IMO).
    Burty117, misor and cliffordcooley like this.
  17. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Hence why I said "almost all"; the low end gets a refresh, but in reality those parts are not worth much when you look at how far the onboard HD 4xxx and Richland/Trinity/Kaveri APUs have come, minus the GTX 860M and 850M.
    I misread Notebookcheck; I thought it said 900 with a 50MHz boost, not 900 as the boost max.
    That doesn't change the fact that a refresh is really needed. The notebook sector is falling further and further behind on the high end (while the mid range jumps up). Bumping the core clocks constantly on the higher GPUs is at this point not enough, considering the GPUs have not changed in a year yet the prices remain the same. It's kind of annoying to keep spending the same money for minor clock bumps instead of actual performance boosts or increases in CUDA core/stream processor counts.

    The only saving grace is that the 100+MHz core clock bump on the 880M brings it closer to its desktop equivalent, meaning it's got some massive power, while the 8970M/M290X has the same performance as before, plus a few Mantle titles and its price. It makes it hard to shop for a notebook when not much changes year after year. That's part of the reason my GTX 675M stayed so relevant and is still a decent performer.
    misor likes this.
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    That is always the case when the top desktop parts become basically outliers to the rest of the model lineup. The current AMD and Nvidia top-tier GPUs are there by dint of their size and requisite power budget/heat production.
    For AMD to maintain parity between desktop and mobile, they would have to try to shoehorn a Hawaii GPU into a 100W TDP* thermal budget. Feasible? Likewise, you won't see Nvidia offering a mobile GK110 for the same reason.
    Both Hawaii and GK110 offer a huge bump in performance over the next-tier GPUs (at the cost of power/heat). Remove both from the equation and the desktop-to-mobile comparison looks much less exaggerated.
    You won't see any real change in that situation unless Nvidia releases a GM106 on 28nm (looking unlikely), or until both camps move to 16nm FinFET (20nm BEOL + 16nm FEOL), since it looks increasingly unlikely that either AMD or Nvidia will use the (low-power-optimized) 20nm process, thanks to the lack of power/die space savings offered by 20nm and the unexpectedly quick progression of TSMC's 16nm FF node.

    * 100 Watts being the de-facto standard upper limit for mobile GPUs from an OEM standpoint.
    misor likes this.
  19. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Well, the 7970M/8970M/M290X is based on the Pitcairn GPU, which also leaves out the Tahiti GPU, and I was working around that where I could. I would not believe that fitting a Hawaii into 100 watts is easy without significant clock sacrifices; however, cooling has improved throughout the years, along with lower-voltage CPUs, which can leave more headroom on the GPU side. With the GTX 880M and the M290X I just see the mobile division lagging behind on the high end, while the mid and low ends jump up pretty close to their desktop levels.

    It almost makes the high end less appealing when the mid range offers such good performance, especially considering the difference in cost.
  20. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Downclocking a Hawaii would negate any benefit it could have relative to upclocking a Pitcairn (or similar) GPU. Hawaii's silicon real estate isn't conducive to practical mobile operation, any more than Cayman and Tahiti before it.
    A fully enabled Hawaii die at 440MHz core is sucking down almost 80 watts, and it certainly isn't a better option than a Pitcairn running twice as fast.
    Power budget for the mobile system as a whole isn't really the issue either. It is about localized heat output and dissipation, and the fact that the MXM 3.0b specification has an upper limit of 100 watts.
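    The downclocking trade-off described above can be sketched with a toy CMOS power model (all numbers purely illustrative assumptions, not measured specs for Hawaii, Pitcairn, or any real GPU): dynamic power scales roughly with clock times voltage squared, while static leakage power barely falls with clock, so a large die keeps a high power floor even when run slowly.

```python
# Toy CMOS GPU power model. All figures are illustrative assumptions,
# not measured specs for any real GPU.
def estimate_power(f_mhz, v, f0=1000.0, v0=1.2, p_dyn0=180.0, p_static0=70.0):
    """Dynamic power scales ~ f * V^2; static (leakage) power ~ V."""
    dynamic = p_dyn0 * (f_mhz / f0) * (v / v0) ** 2
    static = p_static0 * (v / v0)
    return dynamic + static

# A hypothetical big die at full clock vs. the same die heavily downclocked:
print(estimate_power(1000, 1.2))  # full speed: 250.0 W
print(estimate_power(440, 0.9))   # downclocked to 440MHz: still ~97 W
```

    In this sketch, cutting the clock by more than half only gets the hypothetical 250W part down to roughly 97W because the leakage floor stays, which mirrors the point above: a heavily downclocked big die is still no match for a smaller die clocked twice as fast inside the same thermal budget.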
  21. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Well, I never said it was possible on Hawaii; I was inferring Tahiti or Cayman on AMD's side. MXM 3.0b with a 100-watt limit does hold things back, which also implies a need for an MXM 4.0 with higher power limits. However, 2-3 generations of just bumping core clocks is getting old no matter what.

    Localized heat output on high-end systems is handled very easily; that's why we can have systems with dual 8970Ms and GTX 780Ms. In reality the problem only comes down to the companies controlling their power output, which I had hoped Nvidia was going to address with a Maxwell 880M that would completely stomp out the 780M and 8970M/M290X, since Maxwell is very power efficient, as already seen in the 750 Ti SKU.

    Whatever, it does not matter at this point; what's done is done. It's just going to come down to the overall performance and price, for me at least, and how hard adjusting my MSI notebook to the different cards will be.
  22. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Ah! My bad, when you said:
    ...I thought you were inferring that Hawaii could be a 100-watt card if clockspeed was sacrificed. Since Hawaii pulls ~80W in some 2D applications, I thought it unlikely.
    Most people would probably see the challenge as improving efficiency within the existing specification rather than raising the thermal power limits, since increased power has the not-uncommon effect of cratering battery life... and of course there's the no small matter that input power equals heat production. AFAIK there aren't any plans for an MXM 4.0 specification, which is hardly surprising since the revised 3.1 spec is fairly new.
    Not a hell of a lot you can do when the only silicon foundry capable of outputting high-performance/high-power GPUs that isn't Intel is stuck on the same process node for those 2-3 generations. At least Maxwell is making an attempt at maximizing the potential of an in-place foundry process.
    Which actually presents two localized heat areas, not one large one. The MXM specification also covers heatsinks as well as TDP (and bus width, IIRC).
    And the highest-performing Maxwell part is already used for the 860M (750 Ti). As I noted earlier, a GM106 on 28nm seems extremely unlikely since 1. its probable successor on 16nm FF isn't that far off, meaning a short product life; 2. the improved efficiency demonstrated with the GM107 (and likely GM108) won't translate as dramatically to a larger GPU; and 3. from a sales standpoint, high-end mobile GPUs are both a niche market and one that Nvidia can fight successfully with the present lineup.
    Correct. Both Nvidia and AMD are shackled by process node, OEM wish lists and product cycles, and the need to recoup design investment.
    Last edited: Mar 14, 2014
  23. GhostRyder

    GhostRyder This guy again... Posts: 2,198   +593

    Well, yeah, but at the same time a "gaming" notebook never has great battery life to begin with. I like better battery life on mine when I'm doing normal tasks, which is where things like ZeroCore or Nvidia's underclocked state come in to save power. But for gaming, most laptops stay plugged in as is (at least among the groups/people I see with them).
    Quite. I was more hoping that they would release something not yet in a desktop SKU for mobile, another taste of Maxwell if you will, on the high end. It was more of a cross-your-fingers moment.
    Well yes, but all I was saying is that it's more than possible to dissipate that much heat in a laptop in a local area. Even if we're talking about one single point, it's more than probable that we could dissipate, say, 125-150 watts if the MXM port allowed it.
    I know; again, it was more of a cross-your-fingers moment, hoping for possibly another taste of Maxwell, since we have already seen a 60-watt 750 Ti that gives good performance.

    Plus, if you recall, Nvidia did something similar to what I was hoping for with the release of the GTX 680M.
    Last edited: Mar 14, 2014
  24. Tech Analyzer

    Tech Analyzer TS Rookie

    I'm going to buy the new Razer Blade... can't believe it has an 870M graphics card, a multi-touch 4K screen, and a 4th-gen Intel i7 in this ultimate thin design, thinner than the Retina MacBook Pro!!!
