Nvidia overhauls mobile graphics line with GeForce 800M GPUs featuring Battery Boost

Shawn Knight

Posts: 15,289   +192
Staff member

Nvidia on Wednesday launched its new GeForce 800M line of notebook GPUs, a complete overhaul of its mobile graphics lineup that promises to deliver more performance while consuming less power. Key to the latter bit of that promise is Battery Boost, a driver-level governor of sorts capable of doubling battery life while gaming.

Nvidia's new lineup consists of GPUs based on a variety of architectures. The GTX 880M and 870M are built on Kepler, the same architecture used in the beastly GeForce GTX Titan Black for desktops. The 860M will be available in either Kepler or Maxwell form depending on OEM needs, while the 850M is Maxwell-only.

Two lower-end GPUs, the 840M and 830M, also use Maxwell while the 820M is based on the older 28nm Fermi architecture.

Battery Boost is activated when a user fires up a game while the system is running on battery power. It allows players to lock in a target frame rate – let's say 30FPS as an example – and, by taking control of the GPU, CPU and memory systems, ensures the frame rate never exceeds that target. This prevents components from using more power than needed and, in turn, extends battery life. It sounds pretty clever, but we'll reserve final judgment until we can get a system in and test it firsthand.
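To illustrate the general idea (a conceptual sketch only, not Nvidia's driver-level implementation), a frame rate governor boils down to a render loop that sleeps away whatever is left of each frame's time budget instead of racing ahead. The hypothetical `render_frame` function below simply stands in for a game's per-frame work:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame


def render_frame():
    """Stand-in for the game's per-frame work (simulate + draw)."""
    time.sleep(0.005)  # pretend the frame took 5 ms to render


def run_capped(num_frames=300):
    """Render frames, but never faster than TARGET_FPS.

    Time left over in each frame's budget is spent sleeping, which is
    roughly how a frame cap lets the GPU and CPU idle (and save power)
    instead of churning out frames faster than the target.
    """
    for _ in range(num_frames):
        start = time.monotonic()
        render_frame()
        elapsed = time.monotonic() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle out the rest of the budget


if __name__ == "__main__":
    run_capped()
```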

Another feature coming to notebooks with the launch is ShadowPlay. This allows gamers to capture in-game footage to share with others or broadcast their gaming exploits to Twitch with virtually no performance impact.

Elsewhere, GameStream support will allow gamers to stream gameplay from their notebook to a GameStream-compatible device like the Nvidia Shield.

The 800M series can be found in a growing number of new notebooks from various manufacturers, like the refreshed Razer Blade Pro.


 
Nvidia has very few GPUs in laptops that actually work fine after a while. Most of them overheat very fast, even if the laptop is brand new, and when this happens it will never be the same. I would get ATI anytime; although I am an Nvidia fan on PC, the reliability on laptops is so bad. The majority of laptops that get GPU problems, like overheating and the chip coming unstuck from the motherboard, are Nvidia. You can reball them but they won't last; they are compromised.
 
Key to the latter bit of that promise is Battery Boost, a driver-level governor of sorts capable of doubling battery life while gaming.
If this is done through a driver, then can it be inferred that it's also present in the previous Nvidia 700M series?
 
If this is done through a driver, then can it be inferred that it's also present in the previous Nvidia 700M series?
Nope. The Battery Boost feature is in part hardware based - revised silicon, whether it is Kepler or Maxwell based.
It also seems to be linked to both the firmware/BIOS of the laptop and the GeForce Experience software. I think the common understanding is that all three are required to optimize both GPU and CPU workload.
 
Nope. The Battery Boost feature is in part hardware based - revised silicon, whether it is Kepler or Maxwell based.
It also seems to be linked to both the firmware/BIOS of the laptop and the GeForce Experience software. I think the common understanding is that all three are required to optimize both GPU and CPU workload.
Thanks for the info. :)
I thought Nvidia had just found a way to (further) bury AMD's lackluster (so far) Mantle technology.
 
Nvidia has very few GPUs in laptops that actually work fine after a while. Most of them overheat very fast, even if the laptop is brand new, and when this happens it will never be the same. I would get ATI anytime; although I am an Nvidia fan on PC, the reliability on laptops is so bad. The majority of laptops that get GPU problems, like overheating and the chip coming unstuck from the motherboard, are Nvidia. You can reball them but they won't last; they are compromised.

Have any examples?
We've tossed many laptops with dedicated Radeon graphics because they burnt up; I've never tossed one with Nvidia graphics. I'm not just saying that to say it or to stick up for gang green, either.

I have a GT 550M (40nm / 1GB VRAM / 128-bit / 144 CUDA cores) in my L702X (over 2 years old now, but that i5-2430M still rips) and I have the core overclocked 100MHz, so it puts up GT 555M-type performance. I've never had it get anywhere near overheating, and even though my laptop is only 1600 x 900 it plays very well at this res; games like Borderlands, Left 4 Dead 2, Warcraft and Dragon Age: Origins I can max out or nearly max out @ 60 FPS (have screenshots).
 
Last edited:
If anything, my AMD 7970M runs hot. Now, you may not believe that it hasn't damaged anything yet, but it likes to run at about 100-103°C (the highest I've seen Speccy or SpeedFan pick it up at). Had it two years so far.
 
Nvidia has been really disappointing recently; Surround is broken as hell despite it being around for years. They're losing touch with consumers. I think it's time to jump ship.
 
If you really want to compare either side's GPUs this go-round in the mobile division, they are almost all rebranded cards, even on the high end. The 880M is just a 780M with a core clock bump and the M290X is just an 8970M with a core clock bump.

Either way, laptops are not getting much love, which is starting to show: the M290X is essentially just a reference HD 7870 (R7 270) and the 880M is a GTX 770 (680) with a lower core clock.

Well, guess I'll be swapping out my 675M for either an M290X or the 880M, depending on the pricing for the individual MXM cards. I doubt this Battery Boost option is going to make enough of a difference to matter on the high end, though, because I would rather it just default to the Intel graphics when not gaming and run at max frequency when gaming while plugged in. Batteries on gaming laptops don't last anyway, so while I'm all for improved power consumption, this is not going to help much.
 
Last edited:
Nope. The Battery Boost feature is in part hardware based - revised silicon, whether it is Kepler or Maxwell based.
It also seems to be linked to both the firmware/BIOS of the laptop and the GeForce Experience software. I think the common understanding is that all three are required to optimize both GPU and CPU workload.
Arg, requiring GeForce Experience to work? I hope they plan on migrating the feature over to drivers.

Looks to me they are trying to make it mandatory for nVidia owners to install GeForce Experience. Don't get me wrong, I currently have it installed for driver notifications, as well as game optimizations. Battery boost though to me is a feature that should be included within the driver package, for those who don't want GeForce Experience installed.
 
Nvidia, please stop teasing us with lower end Maxwells and get on with a new beast please :)

I'm joking, take your time, I don't think I'll need to upgrade for a while anyway, I'm waiting until Nvidia release a DX12 compatible card before I buy a new card.

This was kind of expected: rebrands, but with newer silicon to support battery saving and ShadowPlay. Nothing new or unexpected, but nothing to be alarmed about either; once Maxwell is in full swing, that's when things will get interesting again.
 
If you really want to compare either side's GPUs this go-round in the mobile division, they are almost all rebranded cards, even on the high end.
It's the higher end of the model lineups that are rebrands. The low/medium end of both AMD and Nvidia are new silicon. The HD 85x0M/86x0M/87x0M are all based on the new-ish Oland GPU, and the 830M/840M (GM108) and 850M/860M 640-core (GM107) are Maxwell-based.
The 880M is just a 780M with a core clock bump and the M290X is just an 8970M with a core clock bump.
The M290X has exactly the same clocks as the 8970M. It's the 8970M that is clock bumped, since it is a 7970M with a 50MHz boost state added.
Either way, laptops are not getting much love, which is starting to show
That has been the case for some considerable time. OEMs have always looked to established silicon when allying the GPU with higher power demand/heat output. The OEMs value stability and lessened warranty issues to a higher degree with prebuilts since the unit replacement/repair cost is appreciably higher than an individual component.
Arg, requiring GeForce Experience to work? I hope they plan on migrating the feature over to drivers.
You won't be the only one.
Looks to me they are trying to make it mandatory for nVidia owners to install GeForce Experience. Don't get me wrong, I currently have it installed for driver notifications, as well as game optimizations. Battery boost though to me is a feature that should be included within the driver package, for those who don't want GeForce Experience installed.
It's usually the Nvidia way to try to package the software with the hardware, so I wouldn't hold my breath waiting for Nvidia to make it less intrusive/smaller footprint. There probably wouldn't be much impediment to including Battery Boost in the driver plus the requisite BIOS/firmware addition... excepting Nvidia's wanting to package more software, which comes down to a visibility/marketing point rather than any real consumer need (IMO).
 
It's the higher end of the model lineups that are rebrands. The low/medium end of both AMD and Nvidia are new silicon. The HD 85x0M/86x0M/87x0M are all based on the new-ish Oland GPU, and the 830M/840M (GM108) and 850M/860M 640-core (GM107) are Maxwell-based.
Hence why I said "almost all"; the low end gets a refresh, but in reality those parts are not worth much when you look at how far the onboard HD 4xxx graphics and the Richland/Trinity/Kaveri APUs have come, with the exception of the GTX 860M and 850M.
I misread the NotebookCheck listing; I thought it said 900MHz with a 50MHz boost, not 900MHz as the boost maximum.
That has been the case for some considerable time. OEMs have always looked to established silicon when allying the GPU with higher power demand/heat output. The OEMs value stability and lessened warranty issues to a higher degree with prebuilts since the unit replacement/repair cost is appreciably higher than an individual component.
Doesn't change the fact that a refresh is really needed. The notebook sector is starting to fall further and further behind in the high-end division (while the mid-range jumps up). The fact is that constantly bumping the core clocks on the higher GPUs is at this point not enough, considering the GPUs have not changed in a year yet the prices remain the same. It's kind of annoying to keep having to spend the same money for minor clock bumps instead of actual performance boosts or increases in CUDA core/stream processor counts.

The only saving grace is that the 100MHz+ core clock bump on the 880M brings it closer to its desktop equivalent, meaning it's got some massive power, while the 8970M/M290X has the same performance but adds a few Mantle titles and its price advantage. It makes it hard to shop for a notebook when not much changes year after year. That's part of the reason my GTX 675M stayed so relevant and is still a decent performer.
 
Doesn't change the fact that a refresh is really needed. The notebook sector is starting to fall further and further behind in the high-end division (while the mid-range jumps up).
That is always the case when the top desktop parts become basically outliers to the rest of the model line up. The current AMD and Nvidia top tier GPUs are there by dint of their size and requisite power budget/heat production.
For AMD to maintain parity between desktop and mobile they would have to try to shoehorn a Hawaii GPU into a 100W TDP* thermal budget. Feasible? Likewise you won't see Nvidia offering a mobile GK110 for the same reason.
Both Hawaii and GK110 offer a huge bump in performance over the next tier GPUs (at the cost of power/heat). Remove both from the equation and the desktop-to-mobile comparison looks much less exaggerated.
You won't see any real change in that situation unless Nvidia releases a GM106 on 28nm (looking unlikely), or until both camps move to 16nm FinFET (20nm BEOL + 16nm FEOL), since it looks increasingly unlikely that either AMD or Nvidia will use the (low-power optimized) 20nm process, thanks to the lack of power/die space savings offered by 20nm and the unexpectedly quick progression of TSMC's 16nm FF node.

* 100 Watts being the de-facto standard upper limit for mobile GPUs from an OEM standpoint.
 
That is always the case when the top desktop parts become basically outliers to the rest of the model line up. The current AMD and Nvidia top tier GPUs are there by dint of their size and requisite power budget/heat production.
For AMD to maintain parity between desktop and mobile they would have to try to shoehorn a Hawaii GPU into a 100W TDP* thermal budget. Feasible? Likewise you won't see Nvidia offering a mobile GK110 for the same reason.
Both Hawaii and GK110 offer a huge bump in performance over the next tier GPUs (at the cost of power/heat). Remove both from the equation and the desktop-to-mobile comparison looks much less exaggerated.
You won't see any real change in that situation unless Nvidia releases a GM106 on 28nm (looking unlikely), or until both camps move to 16nm FinFET (20nm BEOL + 16nm FEOL), since it looks increasingly unlikely that either AMD or Nvidia will use the (low-power optimized) 20nm process, thanks to the lack of power/die space savings offered by 20nm and the unexpectedly quick progression of TSMC's 16nm FF node.

* 100 Watts being the de-facto standard upper limit for mobile GPUs from an OEM standpoint.
Well, the 7970M/8970M/M290X is based on the Pitcairn GPU, which also leaves out the Tahiti GPU, and I could see working around that. I wouldn't believe that making a Hawaii fit in a 100 watt budget is easy without significant clock sacrifices; however, cooling has improved throughout the years, along with lower-voltage CPUs, which can leave more headroom on the GPU side. With the GTX 880M and the M290X I just see the high end of the mobile division lagging behind while the mid and low end jump up pretty close to their desktop levels.

It almost makes the high end less appealing when the mid-range is offering such good performance, especially considering the difference in cost.
 
I wouldn't believe that making a Hawaii fit in a 100 watt budget is easy without significant clock sacrifices; however, cooling has improved throughout the years, along with lower-voltage CPUs, which can leave more headroom on the GPU side.
Downclocking a Hawaii would negate any benefit it could have in relation to upclocking a Pitcairn (or similar) GPU. Hawaii's silicon real estate isn't conducive to practical mobile operation, any more than Cayman and Tahiti before it.
A fully enabled Hawaii die at 440MHz core is sucking down almost 80 watts, and it certainly isn't a better option than a Pitcairn running twice as fast.
Power budget for the mobile system as a whole isn't really the issue either. It is about localized heat output and dissipation, and the fact that the MXM 3.0B specification has an upper limit of 100 watts.
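For a rough sense of why that trade goes against the big die, here's a back-of-the-envelope sketch (illustrative only, using approximate public shader counts, clocks and die sizes rather than measured figures): a 440MHz Hawaii and a Pitcairn-class part at roughly twice the clock land in the same ballpark on raw throughput, so the extra silicon buys almost nothing while still dragging its leakage into the power budget.

```python
# Back-of-the-envelope: a throttled big die versus a smaller die at speed.
# All figures are approximate public specs, used purely for illustration.

def gflops(shaders, clock_mhz):
    """Crude throughput proxy: shaders x clock x 2 FLOPs (FMA) per cycle."""
    return shaders * clock_mhz * 2 / 1000.0

chips = {
    # name: (shader count, core clock in MHz, approx. die area in mm^2)
    "Hawaii @ 440 MHz":   (2816, 440, 438),
    "Pitcairn @ 880 MHz": (1280, 880, 212),
}

for name, (shaders, clock, area) in chips.items():
    perf = gflops(shaders, clock)
    print(f"{name:<20} ~{perf:5.0f} GFLOPS  ({perf / area:4.1f} per mm^2)")

# Both land around 2.2-2.5 TFLOPS, but the Hawaii die is roughly twice the
# silicon: it brings its full leakage/static power into the ~100 W MXM budget
# while delivering no more work, which is the point made above.
```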
 
Downclocking a Hawaii would negate any benefit it could have in relation to upclocking a Pitcairn (or similar) GPU. Hawaii's silicon real estate isn't conducive to practical mobile operation, any more than Cayman and Tahiti before it.
A fully enabled Hawaii die at 440MHz core is sucking down almost 80 watts, and it certainly isn't a better option than a Pitcairn running twice as fast.
Power budget for the mobile system as a whole isn't really the issue either. It is about localized heat output and dissipation, and the fact that the MXM 3.0B specification has an upper limit of 100 watts.
Well, I never said it was possible on Hawaii; I was inferring Tahiti or Cayman on AMD's side. MXM 3.0b with a 100 watt limit does hold things back, which also implies a need for MXM 4.0 and higher power limits. However, 2-3 generations of just bumping core clocks is getting old no matter what.

Dealing with localized heat output on high-end systems is very much possible; that's why we can have systems with dual 8970Ms and GTX 780Ms. In reality the problem only comes down to the companies controlling their power output, which I had hoped Nvidia was going to address with a Maxwell 880M that would completely stomp out the 780M and 8970M/M290X, since Maxwell is very power efficient, as already seen with the 750 Ti SKU.

Whatever, it doesn't matter at this point; what's done is done. It's just going to come down to the overall performance and price for me, at least, and how hard adapting my MSI notebook to the different cards will be.
 
Well, I never said it was possible on Hawaii; I was inferring Tahiti or Cayman on AMD's side.
Ah! My bad, when you said:
I wouldn't believe that making a Hawaii fit in a 100 watt budget is easy without significant clock sacrifices
...I thought you were inferring that Hawaii could be a 100 watt card if clockspeed was sacrificed. Since Hawaii pulls ~80W in some 2D applications, I thought it unlikely.
MXM 3.0b with a 100 watt limit does hold things back, which also implies a need for MXM 4.0 and higher power limits.
Most people would probably see the challenge as improving efficiency within the existing specification rather than raising the thermal power limits, since increased power has the not uncommon effect of cratering battery life... and of course the no small matter that input power equals heat production. AFAIK there aren't any plans for an MXM 4.0 specification - hardly surprising since the revised 3.1 spec is fairly new.
However, 2-3 generations of just bumping core clocks is getting old no matter what.
Not a hell of a lot you can do when the only silicon foundry capable of outputting high-performance/high-power GPUs that isn't Intel is stuck on the same process node for those 2-3 generations. At least Maxwell is making an attempt at maximizing the potential of an in-place foundry process.
Dealing with localized heat output on high-end systems is very much possible; that's why we can have systems with dual 8970Ms and GTX 780Ms.
Which actually presents two localized heat areas, not one large one. The MXM specification also covers heatsinks as well as TDP (and bus width IIRC)
In reality the problem only comes down to the companies controlling their power output, which I had hoped Nvidia was going to address with a Maxwell 880M that would completely stomp out the 780M and 8970M/M290X, since Maxwell is very power efficient, as already seen with the 750 Ti SKU.
And the highest performing Maxwell part is already used for the 860M (750 Ti). As I noted earlier, a GM106 on 28nm seems extremely unlikely since 1) its probable successor on 16nm FF isn't that far off, meaning a short product life, 2) the improved efficiency demonstrated with the GM107 (and likely GM108) won't translate as dramatically to a larger GPU, and 3) from a sales standpoint, high-end mobile GPUs are both a niche market and one that Nvidia can fight successfully with the present lineup.
Whatever, it doesn't matter at this point; what's done is done.
Correct. Both Nvidia and AMD are shackled by process node, OEM wish lists and product cycles, and the need to recoup design investment.
 
Last edited:
Most people would probably see the challenge as improving efficiency within the existing specification rather than raising the thermal power limits, since increased power has the not uncommon effect of cratering battery life... and of course the no small matter that input power equals heat production. AFAIK there aren't any plans for an MXM 4.0 specification - hardly surprising since the revised 3.1 spec is fairly new.
Well, yeah, but at the same time a "gaming" notebook never has great battery life to begin with. I like better battery life on mine when I'm doing normal tasks, which is where things like ZeroCore or Nvidia's low-power idle state save power. But for gaming, most laptops stay plugged in as is (at least among the groups/people I see with them).
Not a hell of a lot you can do when the only silicon foundry capable of outputting high-performance/high-power GPUs that isn't Intel is stuck on the same process node for those 2-3 generations. At least Maxwell is making an attempt at maximizing the potential of an in-place foundry process.
Quite. I was more hoping that they would release something not yet in a desktop SKU for mobile, as another taste of Maxwell on the high end, if you will. It was more of a cross-your-fingers moment.
Which actually presents two localized heat areas, not one large one. The MXM specification also covers heatsinks as well as TDP (and bus width IIRC)
Well yes, but all I was saying is that it's more than possible to dissipate that much heat in a localized area of a laptop. Even if we're talking about one single point, it's more than probable that we could dissipate, say, 125-150 watts if the MXM port allowed it.
And the highest performing Maxwell part is already used for the 860M (750 Ti). As I noted earlier, a GM106 on 28nm seems extremely unlikely since 1) its probable successor on 16nm FF isn't that far off, meaning a short product life, 2) the improved efficiency demonstrated with the GM107 (and likely GM108) won't translate as dramatically to a larger GPU, and 3) from a sales standpoint, high-end mobile GPUs are both a niche market and one that Nvidia can fight successfully with the present lineup.
I know; again, it was more of a cross-your-fingers moment, hoping for possibly another taste of Maxwell, since we have already seen a 60 watt 750 Ti that gives good performance.

Plus, if you recall, Nvidia did something similar to what I was hoping for with the release of the GTX 680M.
 
Last edited:
I'm going to buy the new Razer Blade... can't believe it has an 870M graphics card, a multi-touch 4K screen, and a 4th-gen Intel i7 in this ultra-thin design, thinner than the Retina MacBook Pro!!!
 