Desktop GeForce vs. Laptop GeForce: Gaming Performance Compared

Here's a question: if you could design your own laptop from scratch, how would you make it? Would the CPU be socketed or soldered, for example?
 
A gaming laptop? I wouldn't worry about weight and dimensions, but I would use a socketed CPU and a big cooling system.
A gaming laptop, in my opinion, is something you can move, not something you are going to use while moving.
 
Most of those are 1-inch-thick laptops and above.
 
I believe there is, but it doesn't go to 200 W (that's the "desktop" version adapted for the Area 51). It's different from the Max-Q, though...

There are true mobile versions that go to 200 W. I have an Aorus 17X that, while pretty heavy for a laptop at 8.25 lbs, still uses a mobile CPU (10875H), not a socketed desktop part, and a 2080 Super mobile with a 200 W TDP. It is not quite a true desktop replacement (price-, performance-, and size-wise), but it will sustain average wattages near the max for extended periods and has topped out at 205 W in benchmarking.

For the 2080 Super Mobile (non Max-Q), Nvidia says 150 W+ because they allow 160 W, 170 W, 180 W, 190 W, and 200 W in the Max-P versions. Notebookcheck has a list of clock speeds based on the wattage configuration for these; each higher-wattage version gets a higher max boost clock. I haven't done any overclocking thus far, and mine is listed at 1740 MHz max, but it will sustain 1877 MHz in Time Spy and over 1900 MHz in Fire Strike.

I get that it's not possible to test everything, especially at the top end, but there are enough 150 W units out there in machines that are not desktop replacements by any stretch, yet no one seems even willing to consider comparing any Max-P to a Max-Q in the 2080 Super. Among 150 W cards alone, Asus uses it in the Strix Scar 17, HP uses it in the Omen, MSI in the GE75 Raider (only 5.74 lbs), Alienware in the m17 R3 (nowhere near as heavy as an Area51m R2), etc. All of these have been reviewed, but it's almost like an unwritten rule from Nvidia not to compare them to a Max-Q model. Even most articles explaining the differences between Max-P and Max-Q candy-coat it, downplaying the performance differences and focusing on the efficiency of Max-Q. This is one of the few that is more straightforward. With the gap being bigger on the 2080 Super than on any of the other cards, since it's a minimum of 60 W difference if not more, it's a shame there aren't any comparisons. I'd like to see a comparison between an otherwise similarly equipped Strix Scar 17 and Zephyrus S17 as a good example of Max-P versus Max-Q.
 
Having a higher-wattage GPU in a laptop is just useless, especially the 2080 Super Max-P version. It's not the GPU's fault, though: Intel's 10th-gen mobile CPUs are just lacking, offering no efficiency gain over 9th gen, with a max DDR4 frequency of just 2933 MHz. Having an ultra-fast GPU would just give you bad frametimes.

Here is an example:

As you can see, the 150 W 2080 Super variant is a little faster on average FPS, but it loses to the 80-90 W variant on 1% low FPS (and that is with a 10980HK vs. a 10875H CPU).

Yup, with Intel continuing to drop the ball on their CPU designs and AMD limiting their Renoir CPUs to just PCIe 3.0 x8 for the dGPU, chasing higher GPU performance in a laptop is just a wild-goose chase.
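For readers unfamiliar with the metric, "1% low FPS" is derived from per-frame times rather than from the average frame rate. Here is a minimal sketch of how the two numbers are typically computed, using a made-up frametime log purely for illustration:

```python
# Sketch: deriving average FPS and "1% low" FPS from per-frame times.
# The frametimes below (in milliseconds) are made up for illustration.
frametimes_ms = [8.3, 8.5, 8.1, 9.0, 8.4, 25.0, 8.6, 8.2, 8.7, 30.5]

# Average FPS: total frames divided by total elapsed time.
avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# 1% low: average FPS over the slowest 1% of frames (at least one frame).
# A fast GPU starved by the CPU can keep a high average while this number
# drops, which is what shows up as stutter.
fps_per_frame = sorted(1000.0 / t for t in frametimes_ms)
worst_count = max(1, len(frametimes_ms) // 100)
one_percent_low = sum(fps_per_frame[:worst_count]) / worst_count

print(f"Average FPS: {avg_fps:.1f}")          # dominated by the many fast frames
print(f"1% low FPS:  {one_percent_low:.1f}")  # exposes the occasional long frame
```

That is why a 150 W card can post a slightly higher average than an 80-90 W card and still feel worse if the CPU side produces occasional long frames.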
 
PCIe 3.0 x8 is good enough for a 2070 or 2080. You wouldn't really lose more than 5%.
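For context on the bandwidth involved (the ~5% figure above is the poster's estimate), PCIe 3.0 carries roughly 985 MB/s per lane after encoding overhead, so the sketch below simply compares x8 and x16 link bandwidth:

```python
# Quick comparison of PCIe 3.0 x8 vs. x16 link bandwidth.
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding,
# i.e. roughly 0.985 GB/s of usable bandwidth per lane per direction.
GBPS_PER_LANE_PCIE3 = 8 * (128 / 130) / 8  # ~0.985 GB/s

for lanes in (8, 16):
    print(f"PCIe 3.0 x{lanes}: ~{lanes * GBPS_PER_LANE_PCIE3:.1f} GB/s per direction")

# x8 still offers ~7.9 GB/s, which is why the practical gaming penalty
# versus x16 is small for GPUs in this class.
```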
 
I don't agree that it's useless to have a higher-wattage GPU; calling it a wild-goose chase is an oversimplification, IMO. Sure, some games are going to be CPU-limited, but there are so many that are not (and if you are using an external monitor at 2K or 4K instead of 1080p, you're much more likely to be GPU-limited).

There are also two things to note regarding Jarrod's review. One is his comment that the GE75 did not have the option to disable the iGPU, and that was why he thought it wasn't performing as well as expected. And while it's not mentioned in the comparisons later on, when you see better 1% lows in the GS66, you can disable the iGPU there, which is likely why the 1% lows are higher. I also think cooler thermals would help in CPU-limited games. There's a Tech Shock video of a GE75 (same 150 W 2080 Super but with the lower 10750H) that was routinely at 95C, topping out at 99C, in RDR2 and also reached nasty temps in Battlefield 5, so I would expect further throttling that doesn't take advantage of the capabilities of a 10980HK configuration.

I tend to think your points mostly apply to thin-and-light laptops, and while the GE75 isn't exactly thin-and-light per se, I do believe it's the lightest of any 150 W or higher 2080 Super machine. The Alienware m17 R3 is ~1 lb heavier, and it throttles a lot even undervolted and will hit and stay at 100C in the physics test in Fire Strike. For larger laptops with proper cooling, I think the increased wattage is worthwhile. I didn't care all that much about weight (my Aorus weighs in at 8.26 lbs) and my temps stay reasonable. My Time Spy scores exceed what I've seen on an m17 R3 with a 10980HK i9 that's undervolted and paired with an overclocked 150 W 2080 Super. I haven't done any OC on mine, and I'm also getting equivalent Fire Strike scores to the Alienware with the lesser 10875H processor and the higher 200 W GPU configuration. Luckily, my unit can also disable the iGPU, like the GS66 could, so I think I would get better 1% lows. Phew... way more long-winded than I planned on originally.
 
Yes, my point is that Intel's 10th-gen mobile CPUs are not good enough for a 2070 Super GPU and above. I easily gain 5% more performance on my 10875H + 2070 Super at 1080p just by increasing the CPU TDP from 45 W to 65 W, but it's basically uncoolable (the CPU reaches 95C even with max fan, whereas at a 45 W TDP it stays below 80C with 70% fan). Intel's 10th-gen CPUs have a max T-junction of 100C but start throttling at 90C, so all you get is bad frametimes when you unlock the CPU TDP to take advantage of your powerful GPU.

It also doesn't help that all laptops feature shared heat pipes between the CPU and GPU, so a higher-wattage GPU means a hotter CPU.

So yeah, this is all Intel's fault. Had there been a laptop with a 4900HS + 2080 Super Max-Q, you would see better results than those listed in this article.
 
I've got an XTU profile with PL1 at 65 W and PL2 at 107 W, and I don't get temperatures that high; I've not reached 90C thus far (with a 120 mV undervolt, FWIW). I may just have more room to play with in regard to cooling because this is a relatively large unit with pretty hefty fans and large heat pipes. Nonetheless, I would love to see a Ryzen paired with a high-end Nvidia card; I think we can both agree on that!
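For reference, what XTU's sliders adjust here are the CPU's PL1/PL2 power limits. On a Linux machine the same limits can usually be written through the RAPL powercap interface; a minimal sketch follows (needs root, and the exact paths and whether the firmware lets the values stick vary by laptop):

```python
# Minimal sketch: writing Intel PL1/PL2 power limits via the Linux RAPL
# powercap interface (roughly what XTU's power-limit sliders adjust).
# Needs root; the package path and whether firmware/EC later overrides
# the values differ per machine, so treat this as illustrative only.
RAPL_PKG = "/sys/class/powercap/intel-rapl:0"  # CPU package domain

def set_power_limit(constraint: int, watts: float) -> None:
    """constraint 0 = long-term limit (PL1), constraint 1 = short-term (PL2)."""
    path = f"{RAPL_PKG}/constraint_{constraint}_power_limit_uw"
    with open(path, "w") as f:
        f.write(str(int(watts * 1_000_000)))  # interface takes microwatts

set_power_limit(0, 65)   # PL1 = 65 W, as in the profile described above
set_power_limit(1, 107)  # PL2 = 107 W
```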
 
There is one thing that can be done to reduce your GPU temps, and I can help you with that. Also, Ryzen is a bit hotter due to the node, but the performance is the same at probably less wattage, so overall there's less heat.
 
GPU temps are really good on my 2070 Super Max-Q 90 W, maxing at around 70C during gaming.
I also have an undervolting profile (1480 MHz at 700 mV) that further reduces temps.

Overall, I'm only slightly disappointed with the 10875H CPU, as it's the limiting factor at 1080p gaming, kind of wasting the 300 Hz panel on my Triton 500 laptop :D
 
I could help you if you want. Do you have somewhere we could talk, like Discord or something? It's easier that way.
 
I wonder if it's the boost that hurts with mobile CPUs. The 1- and 2-core boost is quite impressive, but when gaming the all-core boost is only 4.1 GHz, I think.
 
Yup, it's basically impossible to run the rated all-core boost clocks on the 10750H and 10875H (and even the 10980HK) at the default 45 W TDP. They will be either power- or thermally-constrained.

Couple that with the max supported DDR4-2933, and the Intel 10th-gen platform is just not fast enough for a high-end GPU.
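A rough back-of-envelope shows why the all-core boost can't hold inside 45 W; both the uncore overhead and the watts-per-core figure below are assumed ballpark values, not measurements:

```python
# Back-of-envelope: why an 8-core mobile CPU can't sustain its rated
# all-core boost inside a 45 W package limit. The uncore overhead and
# the per-core draw at boost are assumed ballpark figures only.
package_limit_w = 45
cores = 8
assumed_uncore_w = 10         # assumption: fabric, memory controller, idle iGPU
assumed_w_per_core_boost = 9  # assumption: per-core draw near the rated all-core boost

available_per_core = (package_limit_w - assumed_uncore_w) / cores
needed_package_w = assumed_uncore_w + cores * assumed_w_per_core_boost

print(f"Per-core budget at 45 W:         {available_per_core:.1f} W")
print(f"Package power to hold the boost: {needed_package_w} W")
# Roughly 4-5 W per core is available versus ~9 W wanted, so the chip
# either clocks down (power-limited) or, with the limit raised, runs
# into the 90C throttle point (thermally limited).
```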
 
Maybe AMD should spend some of their Zen cash on their GPU department, because it looks extremely bad at this point. Nvidia pretty much steamrolls them, and DLSS 2.0 did not help AMD, haha.

Oh well. I guess AMD is competitive in the low- to mid-range market, and that's enough for them.

Sadly, I think "Big Navi" is going to be a 300+ watt monster with RTX 3070 performance. Please don't water-cool it, AMD... keep those cheap AIOs out of this. The Fury X was a joke because of that AIO, with 33 dB at idle/2D...
Why are you making this comment on an article that compares Nvidia desktop to Nvidia mobile? Stick to the topic.
 