Desktop GeForce vs. Laptop GeForce: Gaming Performance Compared

There's something wrong there: in the game tests the 2070 Mobile is below the 2070 Max-Q (which doesn't make sense), but in the summary the 2070 Desktop wins by a larger margin against the Max-Q than against the 2070 Mobile.

Also, the performance will vary a lot depending on the laptop itself, especially with Max-Q versions.
 
The biggest advantage of the laptops is that most of them are running at 1080p. Older games average high FPS at 1080p even when using the older GTX-model GPUs.

NEW games demand a 2060, even at 1080p.

I would personally never buy a laptop I intend to use for gaming with anything less than a Core i7 and a 2060 (at least). I'd also expect to be plugging it in rather than gaming on battery, to avoid CPU or GPU clock step-downs.

 
Maybe AMD should use some of their Zen cash on their GPU department, because it looks extremely bad at this point. Nvidia pretty much steamrolls them, and DLSS 2.0 did not help AMD, haha.

Oh well. I guess AMD is competitive in the low-to-mid-range market and this is enough for them.

Sadly, I think "Big Navi" is going to be a 300+ watt monster with RTX 3070 performance. Please don't watercool it, AMD... keep those cheap AIOs out of this; the Fury X was a joke because of that AIO at 33 dB in idle/2D...
 
Looking at the first few games, it was extremely disappointing how even the better mobile chips can struggle at 1080p. It's hard to cram that much performance into a small package while keeping heat and power down in a laptop.
 
I personally am not a fan of hardcore gaming laptops. Sure, I understand the appeal: I was a university student too many years ago. But you're simply paying way too much money for a reduced experience.

It is my opinion that you're actually better off buying two separate devices: either one proper gaming PC and one lightweight laptop, or two midrange PCs to put in both your home and dorm. Buying these two devices will probably cost the same as a hardcore gaming laptop, but you'll get similar framerates on both. Of course this doesn't apply if you're the gamer-and-traveller type.
 
Basically Nvidia is using poorly binned chips as mobile solutions, giving them the same names as the "good" Turing parts to deceive customers. $3000+ notebooks with a 2060-class GPU aren't a good solution at all. I realized that going from an i7-9750H + 2070 Max-Q to an i5-9600K + 2070 Super: gaming performance almost doubled. To be transparent, Nvidia should use the M suffix.
 
I personally am not a fan of hardcore gaming laptops. Sure, I understand the appeal: I was a university student too many years ago. But you're simply paying way too much money for a reduced experience.

It is my opinion that you're actually better off buying two separate devices: either one proper gaming PC and one lightweight laptop, or two midrange PCs to put in both your home and dorm. Buying these two devices will probably cost the same as a hardcore gaming laptop, but you'll get similar framerates on both. Of course this doesn't apply if you're the gamer-and-traveller type.
This.
I bought a €1800 notebook last year and realized it wasn't portable at all, and gaming performance was good but not stellar. I ended up buying a light and slim notebook with great battery life, plus a gaming desktop.
 
Once again, this article was written without the actual top-of-the-line GPUs... the 2080 and 2080 Super (non-Max-Q)...

I understood the first time, kind of, as it was fairly new... but come on, TechSpot, they've got to be available somewhere by now!

A true "desktop replacement" will clearly have one...
 
Looking at the first few games, it was extremely disappointing how even the better mobile chips can struggle at 1080p. It's hard to cram that much performance into a small package while keeping heat and power down in a laptop.
Older gaming laptops were bricks: heavy and very thick due to the cooling system. Now gaming notebooks are sleek and light, full of RGB lighting, but manufacturers aren't telling the truth about the gaming power available. Some of them are crippled by a sustained TDP of 45W, which is ridiculous while gaming, and even the best usually don't go higher than 60W after a few seconds at 90W. And that's only if the cooling system is good enough to keep temperatures under control (most of the time they're hanging in the 90s Celsius).
This means CPUs throttling to about 3 GHz and power-limited GPUs.
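If you want to check those sustained figures on your own machine, here's a minimal sketch, assuming an NVIDIA GPU with nvidia-smi on the PATH (and that it reports power draw; some GPUs return "[N/A]"). It logs power, SM clock and temperature once per second so you can see where the GPU settles after the initial boost window:

```python
# Minimal GPU telemetry logger built on nvidia-smi's query interface.
import csv
import subprocess
import time

QUERY = "power.draw,clocks.sm,temperature.gpu"

def sample_gpu():
    """Return (watts, sm_mhz, temp_c) for GPU 0 via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + QUERY,
         "--format=csv,noheader,nounits", "-i", "0"],
        text=True,
    )
    watts, mhz, temp = (float(v) for v in out.strip().split(", "))
    return watts, mhz, temp

def log_session(seconds=300, interval=1.0, path="gpu_log.csv"):
    """Log one sample per second while you game; the CSV shows where
    power and clocks settle after the first minute or so."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_s", "watts", "sm_mhz", "temp_c"])
        start = time.time()
        while time.time() - start < seconds:
            writer.writerow([round(time.time() - start, 1), *sample_gpu()])
            time.sleep(interval)

if __name__ == "__main__":
    log_session()
```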
 
Once again, this article was written without the actual top-of-the-line GPUs... the 2080 and 2080 Super (non-Max-Q)...

I understood the first time, kind of, as it was fairly new... but come on, TechSpot, they've got to be available somewhere by now!

A true "desktop replacement" will clearly have one...
I'm glad someone else is on board with this. I have been mentioning it in threads such as this for quite some time now. My laptop has a 2080 OC with a 200-watt TDP, and performance is on par with, or slightly above, a 2070 Super desktop GPU. It doesn't make sense not to include it in so-called Desktop vs Laptop articles.
 
It is confusing. And it's about to get more confusing as DLSS takes off, since performance will be affected by the tensor core count too.

Speaking of which, can TechSpot do an article about DLSS 2.0 in Death Stranding, which every other tech outlet is saying looks better than native? Would love to hear this publication's thoughts on that revolutionary feature.
 
Basically Nvidia is using poorly binned chips as mobile solutions, giving them the same names as the "good" Turing parts to deceive customers. $3000+ notebooks with a 2060-class GPU aren't a good solution at all. I realized that going from an i7-9750H + 2070 Max-Q to an i5-9600K + 2070 Super: gaming performance almost doubled. To be transparent, Nvidia should use the M suffix.

Huh, mobile chips, and especially the Max-Q variants, are the ones binned for higher efficiency, meaning a higher ASIC score. If you reduced the power limit on the desktop GPU to match the Max-Q variant, the Max-Q variant would win. Just try dragging your power limit all the way down to 50% in Afterburner and see how your desktop 2070 Super scores in Time Spy Graphics (my 2070 Super Max-Q at 90W scores 7700 in Graphics without any overclocking); the same experiment is sketched after this post.

Oh well, I think adding the M suffix is even more confusing. What if Nvidia just decided to slap the 2080 Ti M moniker on their highest-end laptop GPU, even though it uses a TU104 chip? Same chip, same name as desktop is just better. It would be a little less confusing if Nvidia got rid of the Max-Q variants having two power limits (80W vs 90W, which the manufacturers won't disclose).
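For anyone who'd rather script that power-limit experiment than use Afterburner, here's a rough sketch using nvidia-smi. It assumes a card that supports software power limits and needs administrator/root rights; the 90W budget is an illustrative Max-Q-like guess, and the benchmark step is a placeholder:

```python
# Cap a desktop card's power limit, run a benchmark, restore the limit.
import subprocess

def set_power_limit(watts):
    # nvidia-smi -pl requires elevated privileges on most systems
    subprocess.run(["nvidia-smi", "-pl", f"{watts:.0f}"], check=True)

def default_power_limit():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.default_limit",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip())

stock = default_power_limit()
try:
    set_power_limit(90)       # cap to a Max-Q-like budget (assumption)
    # ... run Time Spy (or any benchmark) here and note the score ...
finally:
    set_power_limit(stock)    # always restore the stock limit
```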
 
I'm glad someone else is on board with this. I have been mentioning it in threads such as this for quite some time now. My laptop has a 2080 OC with a 200-watt TDP, and performance is on par with, or slightly above, a 2070 Super desktop GPU. It doesn't make sense not to include it in so-called Desktop vs Laptop articles.
I've been mentioning it as well... my assumption is that the laptops that have them (like the Area-51m from Dell) cost a fortune, and no one is giving them to TechSpot as freebies to review...
 
Thank you for this public service reporting! People might quibble over minor details, but it gets the main point across.
 
Since desktops generally use 24-32 inch monitors compared to laptops' 13-17 inch screens, it'd be interesting to compare a desktop's performance at 1440p against a laptop's performance at 1080p.
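For a sense of the gap such a comparison would have to bridge, the pixel-count arithmetic is straightforward:

```python
# 1440p pushes about 78% more pixels per frame than 1080p, so the
# desktop card would carry a much heavier per-frame load in that test.
def pixels(width, height):
    return width * height

ratio = pixels(2560, 1440) / pixels(1920, 1080)
print(f"1440p vs 1080p pixel ratio: {ratio:.2f}x")  # -> 1.78x
```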
 
I'm glad someone else is on board with this. I have been mentioning it in threads such as this for quite some time now. My laptop has a 2080 OC with a 200-watt TDP, and performance is on par with, or slightly above, a 2070 Super desktop GPU. It doesn't make sense not to include it in so-called Desktop vs Laptop articles.
Would you mind sharing the model? Because I've never found any. Thank you.
 
Basically Nvidia is using poorly binned chips as mobile solutions, giving them the same names as the "good" Turing parts to deceive customers. $3000+ notebooks with a 2060-class GPU aren't a good solution at all. I realized that going from an i7-9750H + 2070 Max-Q to an i5-9600K + 2070 Super: gaming performance almost doubled. To be transparent, Nvidia should use the M suffix.

Probably more like well-binned chips, which will run at the lowest wattage possible, which is why AMD can't compete at all in the mobile GPU department.
 
@Squid Surprise - But the MSI GS75 Stealth, for example, has an i9-10980HK and a 2080 Super and sells for over $4000.

@thenandback - A lot of people that buy these Desktop Replacements often game with an external monitor.

Often? How do you know? Why pay 3 times the price per FPS for a mobile gaming machine and then hook an external monitor and keyboard/mouse up to it? Might as well build a small ITX gaming rig then, with much better components for less money.

Most people that buy these are gaming on them, in schools etc. You find most of these people at the BACK OF THE CLASS, drinking soda or energy drinks while gaming. Facts from real life.
 
why TechSpot didn't include the 2080 or 2080 Super in their reviews
Steve and Tim can only work with what they have and what can be achieved within publication schedules. Hardware vendors are unlikely to provide a sample that’s not going to receive a full review within a reasonable timeframe, and while going out and buying a motherboard or two as an addition for a comparison article is a viable option, shelling out $1500 or more for a laptop is another matter entirely.

Benchmarking also takes a huge amount of time: tests are run multiple times to produce a more useful data sample, and each of those runs varies a lot in duration too. As an example, it took almost 20 hours of testing just to collate all of the data required for this article.

Then there are the images to produce, and text and videos to write/script and edit. As much as we'd all like every review and article to cover every possible combination of available hardware, there are always constraints that prevent this.
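To illustrate the "more useful data sample" point, here's a minimal sketch of why repeated runs matter; the FPS values are made-up samples, not TechSpot data:

```python
# Single benchmark passes are noisy, so the published figure is an
# aggregate over several runs, ideally with the spread reported too.
from statistics import mean, stdev

runs = [96.4, 98.1, 95.7, 97.3, 96.9]  # hypothetical repeat runs
print(f"avg: {mean(runs):.1f} fps, stdev: {stdev(runs):.2f} fps")
```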
 
Um... did you actually read what I posted? I asked why TechSpot didn't include the 2080 or 2080 Super in their reviews... I gave an example of a laptop (the Area-51m from Dell), but clearly there are others...
Yeah, I got it. What I meant was that the price ($4000+) of a laptop with an i9-10980HK/2080 Super was already covered by the charts they posted, so I didn't get why they wouldn't include one with a 200+ watt power limit.
 
Probably more like well-binned chips, which will run at the lowest wattage possible, which is why AMD can't compete at all in the mobile GPU department.
If you keep a "2 GHz" chip (my 2070 Super constantly holds 1925-1950 MHz) at 1.3-1.5 GHz, you can lower the wattage quite easily.
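A back-of-envelope sketch of why dropping clocks saves so much power, assuming the usual rule of thumb that dynamic power scales with voltage squared times frequency; the voltage figures below are illustrative guesses, not measured values:

```python
# Lower clocks permit lower voltage, and voltage enters squared,
# so modest frequency cuts yield outsized power savings.
def relative_power(volts, ghz, volts0=1.00, ghz0=2.0):
    return (volts / volts0) ** 2 * (ghz / ghz0)

print(f"{relative_power(0.75, 1.4):.0%} of stock power at 1.4 GHz")  # ~39%
```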
 
Often? How do you know?
Because I post on the MSI Titan boards and the Area-51m boards, as well as the Asus Mothership boards. A lot of us use an external monitor when using the laptop at home.
Most people that buy these are gaming on them, in schools etc. You find most of these people at the BACK OF THE CLASS, drinking soda or energy drinks while gaming. Facts from real life.
Man, you don't have a clue what you are talking about.
 