GeForce Laptop Roundup: Comparison of Gaming Laptops' Graphics Performance

There are only two laptops I've really considered buying to get a Core i7 + 2060:
the HP Omen at around $1,100 and the Alienware m15 R3 at $1,700.

Most of my favorite games don't have ray tracing, and I've yet to be fully sold on the gimmick. I don't see myself spending top dollar to get ray tracing in my next laptop.
 
Just bought myself an Acer Triton 500 with an i7-10875H + 2070 Super Max-Q, 32 GB of 2933 MHz RAM, a 1 TB SSD, and a 300 Hz 1080p screen. Cost me an arm and a leg ;)

Here is the Time Spy score, pretty close to a desktop 9700K + 2060 Super, not bad at all. Using ThrottleStop and Afterburner to undervolt the CPU/GPU kept the temps and fan speeds in check.

I was waiting for a 4900HS laptop with a good GPU but couldn't wait any longer. Since I do none of the workstation stuff, an Intel chip is still a good choice here. Gotta say, though, those single-core boost clocks on 10th gen are completely useless (5.1 GHz in the case of the 10875H); at stock config the CPU can just randomly hit 85C due to the aggressive clocks/voltages. Playing any game, the CPU throttles down to 3.7 GHz all-core anyway.
 
The industry would love you to believe that when it comes to gaming laptops, you can just pick the parts you want and you'll reliably get the performance to match.

Personally, I'm not a believer. I think there are often significant differences in cooling design, fan settings, throttling firmware, power delivery, reliability over time, and even individual variability in the quality of each chip. The net usable performance you actually receive can easily be less than the quoted peak performance potential (and even less if you have a limit to how much fan noise you'll tolerate). I've also seen cases where the units sent to reviewers had different components (i.e., component lottery) than what end users got, which could be either normal variations in parts availability or maybe something worse.

The laptop reviews I'd really want would focus on a specific model with documented components, where I could be assured of ordering an identical unit, and would spend at least 50% of their effort investigating the reliability and support of that unit. Of course, no one does that, and for brand-new models the data doesn't exist anyway.
 
Tim, please forgive this, but could you test a desktop-replacement laptop and add it to the result tables? I have an MSI GT76 Titan 9SG that would work, or maybe the Asus ROG Mothership. Anything with a truly full RTX 2080 (205 W) GPU.

Not an Area-51m, though. Those are 180 W, overheat quickly, and throttle badly.
 
The last high-end laptop I bought was $2,000 only 5 years ago; it had the top laptop graphics card of the moment (980M), 16 GB RAM, a 256 GB SSD, a 1 TB 7200 RPM HDD, and the best last-gen CPU.
I still use it today for gaming and have no complaints, except that I can't run games on very high settings anymore.

Nowadays when I look at high-end laptops I just see overpriced frying pans. They made GPUs more efficient, but the fans are noisy again and can't even compensate for the latest-gen CPUs that run so hot because of the frequencies... I remember the Pentium CPUs chasing 5 GHz; that's when they started adding more CPU cores and reducing frequency, and temperatures got lower.

What are they waiting for? Burning batteries?
 

Yup, sadly Intel has been at a standstill for 3 generations of mobile CPU now; the 8750H, 9750H, and 10750H are the same freaking CPU. The only things they did were add 2 more cores and raise the single-core boost by increasing the voltage during single-core load. The result? CPU temps randomly spike to 90C and the fans spin up like crazy while doing normal tasks.

With my 10875H I had to undervolt by -95 mV and reduce the boost clocks by 800 MHz just to bring the fan speeds into an acceptable range. Before tweaking, the CPU voltage was 1.37 V; after tweaking, 1.08 V. That improved temps by 30C and cut the fan noise in half while only losing 10% performance.
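Quick back-of-the-envelope on why that tweak cuts heat so much, using the first-order CMOS rule that dynamic power scales roughly with f·V² (a rough model, not anything chip-specific):

```python
# Rough dynamic-power estimate: P ~ f * V^2 (first-order CMOS scaling)
f1, v1 = 5.1, 1.37  # stock single-core boost (GHz) and observed voltage (V)
f2, v2 = 4.3, 1.08  # after the -800 MHz clock cut and undervolt

ratio = (f2 / f1) * (v2 / v1) ** 2
print(f"estimated power vs stock: {ratio:.0%}")  # about 52%
```

Roughly half the dynamic power for a 16% clock cut, which is consistent with the big temperature drop.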

 
It is a real mess now, because when you buy a gaming notebook, you don't know which model of GPU you are actually buying. They are advertised as "2070 Super Max-Q" or "2060" without any indication of TDP.
 
I beg your pardon, but you did a very poor job here, in my opinion.
It is a good thing to undervolt the CPU, but by cutting turbo boost frequencies so much you are hampering performance too much.
It could be a good idea to lower the all-core turbo ratio by 100-200 MHz, but you are keeping it even below base clock. That's insane.
There is no reason to buy an expensive high-performance notebook and then do this.
You want to avoid thermal throttling, for sure, but lowering temperatures by 30 degrees means you are not taking advantage of your powerful CPU.
I know the noise can be an issue, but that notebook isn't intended to be very quiet by design.

Maybe you should buy a 15 W TDP notebook with an i7, which is definitely cooler and quieter.
 
Base clock for the 10875H is 2.2 GHz. You're also missing some information here: a lower TDP doesn't mean a lower CPU temperature; it's a combination of voltage and frequency as well. If you load up only 1-2 cores at 1.4 V, which the 10875H will do if you leave it alone, the CPU reaches 85C almost instantly even with a low TDP cap. That causes the fan, which is tied to CPU temp, to spin up to 5,000 RPM at 52 dBA just doing normal tasks.

Now, for everyday tasks there is no need for the CPU to run at 5 GHz; it's just the way it is. For gaming I actually gain performance thanks to Nvidia's Dynamic Boost feature, which allows the GPU to draw more power when the CPU is not fully utilized. The 2070 Super Max-Q in my laptop is in fact drawing 105 W during gaming, which makes it faster than a 2080 Super Max-Q at 80 W.

Here is the review for my laptop, but with a different config: mine is a 10875H + 2070 Super while this one is a 10750H + 2080 Super.
Here is my Time Spy score with the lowered CPU clocks (stock score is 9,200). With a CPU score of 8,600, it's already faster than a desktop 9700K, so there is really no point going for higher clocks here besides damaging my eardrums :D

It is a real mess now, because when you buy a gaming notebook, you don't know which model of GPU you are actually buying. They are advertised as "2070 Super Max-Q" or "2060" without any indication of TDP.

Yeah, it's really confusing. My 2070 Super Max-Q is 80 W at stock; activating the "Extreme" profile in Acer's software increases the TDP to 90 W, and Nvidia Dynamic Boost (which is only available on selected models) allows an additional 15 W for the GPU. With 25 W more power the performance can easily improve by 10%...
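To make the tiers concrete, here's how the power budget stacks up on my unit (my Triton 500's numbers; TGPs vary a lot between laptop models):

```python
# GPU power budget tiers on my Triton 500's 2070 Super Max-Q
# (these are my unit's numbers, not a general spec)
stock_tgp = 80       # watts at default settings
extreme_bonus = 10   # Acer "Extreme" profile raises TGP to 90 W
dynamic_boost = 15   # NVIDIA Dynamic Boost headroom on supported models

total = stock_tgp + extreme_bonus + dynamic_boost
print(f"max GPU power: {total} W")  # 105 W, matching what I see while gaming
```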
 
You are right about the base clock (I got confused with the desktop version), but my point stands: you are reducing the turbo boost too much, in my opinion.
Don't get me wrong, you can do whatever you want with your computer, and performance is still great, but what is the point of buying such a powerful and pricey CPU as the i7-10875H if you are not exploiting its full potential?
For sure, while gaming the GPU is more important, but there are games where the CPU is more relevant than in Time Spy.
 
Well, this review will answer any of your doubts.


So the 10875H configured to 62 W ties with the 4900HS at 35 W, and the 10875H at 94 W TDP is only 5% faster while using 30 W more (and running 27C hotter); the clock difference is almost 500 MHz, too.

Every game nowadays will try to spread the load across every core on the CPU, so even when gaming, the 10875H uses the all-core boost clock, not the single-core boost. But as you can see above, 500 MHz of extra core clock only translates to 5% more performance at a 50% power penalty. That is a huge price to pay.
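Putting that in perf-per-watt terms (indexing the 62 W config's score to 100 and using the review's ~5% gap; illustrative numbers, not benchmark data):

```python
# Perf-per-watt of the two 10875H power configs discussed above
# (performance indexed to 100 at 62 W; the 94 W config is ~5% faster)
configs = {"62 W": (62, 100), "94 W": (94, 105)}

for name, (watts, perf) in configs.items():
    print(f"{name}: {perf / watts:.2f} points/W")
```

The 62 W config comes out roughly 44% more efficient, which is why I don't chase the last 500 MHz.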


The benefit of the 10875H over the 10750H is the extra 2 cores; that means I can run those 8 cores at a lower, more efficient voltage while still getting higher overall performance and lower temperatures than the 10750H.

And yeah, I wanted the 4900HS option, but it's only available in the Zephyrus G14, which has a terrible cooling solution and monitor.
 
I bought a (Ryzen 3500, I think) MateBook laptop about six months ago cos it was a fairly cheap but well-reviewed replacement for an ailing MacBook Air. Four months ago I had to start doing online teaching, though, and I found the fans created a distracting buzz for the students. There weren't many settings I could adjust, so I just lived with it for a while until I learned I could go into the power settings and adjust the CPU performance. I found that setting it to 95% capped the clock speeds at half their potential, but this kept the CPU temperature under 55 degrees, which stopped the fans from ramping up. The computer still works perfectly for my productivity purposes even at only 50% potential!
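The halving makes sense if the cap simply disables boost: anything under 100% "Maximum processor state" in Windows keeps the CPU at or below base clock. A sketch assuming the chip is the Ryzen 5 3500U (2.1 GHz base, 3.7 GHz boost per AMD's spec sheet; that's an assumption about which model this laptop has):

```python
# Why capping "Maximum processor state" below 100% roughly halves peak clocks:
# the cap disables boost, pinning the CPU near its base clock.
# Assuming a Ryzen 5 3500U: 2.1 GHz base, 3.7 GHz boost.
base_ghz, boost_ghz = 2.1, 3.7

fraction = base_ghz / boost_ghz
print(f"capped peak is about {fraction:.0%} of full boost")  # ~57%
```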
 
Yup, the sporadic nature of CPU temps is a problem when you have loud fans that can't be manually tuned. Even on a desktop PC with a cheap CPU cooler, it can be quite a distraction when the fans spin up and slow down erratically.

My set-and-forget solution for desktops is plugging the CPU cooler fans into the SYS_FAN headers and tying fan speed to VRM temp. That allows for a gentler fan curve.
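The gentler curve is really just a flatter temp-to-duty mapping driven by a slow-moving sensor. A toy sketch of the idea (breakpoints made up for illustration, not from any real BIOS):

```python
# Toy fan curve: linear interpolation between (temp C, duty %) points.
# Driving it from slow-moving VRM temp instead of spiky CPU temp means
# the input itself ramps gently, so the fan does too.
CURVE = [(40, 20), (60, 35), (75, 60), (85, 100)]

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))  # 27.5 (% duty), halfway between the 40C and 60C points
```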

For laptops I just use ThrottleStop to create 2 different clock profiles: one for normal usage, which never spins up the fans, and one for maximum performance during gaming. So far I haven't used the maximum-performance profile, though; the cost (fan noise) far outweighs the benefits.
 
I'm really confused about VRM temps, perhaps you could help. With my B450 Carbon Pro and a 2700 OC'd to its limit, and now a 3800X at 4.4 GHz all-core, my VRM temperature never goes above 65 degrees even if my CPU is in the 80s during a stress test. So I'm not quite understanding why good VRMs are considered so important. Is my board perhaps not set up properly? On Hardware Unboxed they regularly report 80- or 90-degree VRMs in stress tests, but my CPU always hits its thermal limit quite quickly if I've pushed it too much. One thing that occurs to me in writing this is that perhaps they do extended stress tests, whereas I only ever do something like Cinebench.
 
VRM temp is not really important if all you do is gaming or web surfing. VRM temp depends on how much power your CPU pulls, so if you leave your 3800X at stock, it will never pull more than 105 W, and that's not really a problem for any motherboard except the cheapest ones (like A320 boards).

Using a higher-TDP CPU like the 3900X or 3950X will increase VRM temps, and so does overclocking your CPU. You can try increasing the PBO power limit to 200 W and see how much hotter your VRM gets.

Prolonged stress testing will also increase VRM temp until it reaches an equilibrium point. HUB tests VRM temps by running Blender for an hour, which is a realistic load for anyone doing video editing. The thing is, they use water cooling on their 3900X, so the CPU can pull as much power as it wants (200 W+) without thermal throttling, which your CPU does.

You can stress test your VRM with tools like OCCT, Prime95 v26.5, or AIDA64's stability test. Just monitor your CPU power usage along with VRM temp.
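One way to see why sustained CPU power cooks VRMs: the VRM's own heat is the conversion loss, so it scales directly with CPU draw. Quick estimate assuming a generic ~90% conversion efficiency (a ballpark assumption; real boards vary):

```python
# Approximate VRM heat = CPU power * (1/efficiency - 1)
# The 90% efficiency figure is an assumption, not a measured value.
def vrm_heat_watts(cpu_watts, efficiency=0.90):
    return cpu_watts * (1 / efficiency - 1)

print(f"stock-ish 105 W draw: {vrm_heat_watts(105):.1f} W of VRM heat")
print(f"200 W PBO/OC draw:    {vrm_heat_watts(200):.1f} W of VRM heat")
```

Double the CPU draw, double the heat the VRM has to shed, which is why a stock 3800X barely warms the VRM while a long 200 W Blender run can push it toward those 80-90 degree readings.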
 
Cheers, that's just what I needed to know. I just game on my PC and now leave the 3800X at stock (with high PBO settings) cos I can't see any gaming performance difference between an all-core overclock and stock. Thanks for your advice.
 