Nvidia GeForce RTX 3080 Ti Laptop GPU Review: Fat GPU, Slim Gains

It's unfortunate you have to spend so much to get a GPU capable of running current games at 1080p above 60 FPS.

The 3060 and 3070 are quite capable, but nothing shines like that 3080 does.
 
In the case of the 12 GB desktop 3080, it often performs so close to the desktop 3080 Ti that it seems the extra SMs in the 3080 Ti (and the 3090, for that matter) just can't find enough work to do, or the memory bandwidth isn't quite enough to fully feed them.
 
I looked up the price, and it's too expensive at €3,277/$3,747. I think it would be OK at $2,500, though. Considering the GPU shortage, it's not easy to purchase a desktop GPU right now.
 
I think it's a case of the extra SMs providing so little additional hardware that it's a fart in the wind.

256 extra cores on a 1,024-core GPU is a much bigger deal than 256 extra cores on a 10,496-core GPU, to use the 3090 Ti over the 3090 as an example. Memory bandwidth could be holding the 3080 Ti back, or it could just be its insane power consumption limiting clock speed. Or both.
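A quick back-of-the-envelope check of that point (a minimal sketch; the 1,024-core GPU is a made-up small example, while 10,496 is the desktop 3090's CUDA core count):

```python
# Relative gain from adding the same number of cores to GPUs of
# different sizes. Core counts are illustrative, per the note above.
def relative_gain(base_cores: int, extra_cores: int) -> float:
    """Extra cores as a percentage of the base core count."""
    return 100 * extra_cores / base_cores

print(f"small GPU:      +{relative_gain(1024, 256):.1f}% cores")   # +25.0%
print(f"3090-class GPU: +{relative_gain(10496, 256):.1f}% cores")  # +2.4%
```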

In the case of the 3080 Ti, the huge increase in CUDA count is misleading, as the increases in texture and pixel fill rate are not nearly as impressive. RT results favor the 3080 Ti across the board, but without RT it doesn't really matter.
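For reference, theoretical fill rates fall out of unit counts times clock speed. This is only a sketch: the TMU/ROP counts and the 1.7 GHz clock below are assumptions for illustration, not confirmed mobile specs.

```python
# Theoretical fill-rate arithmetic: pixel rate scales with ROPs,
# texture rate with TMUs, both multiplied by clock speed.
# Unit counts below are assumptions, not confirmed mobile specs.
def fill_rates(rops: int, tmus: int, clock_ghz: float):
    """Return (GPixel/s, GTexel/s) at the given clock."""
    return rops * clock_ghz, tmus * clock_ghz

# Assumed: mobile 3080 with 192 TMUs / 96 ROPs, mobile 3080 Ti with
# 232 TMUs / 96 ROPs, both at an assumed 1.7 GHz boost clock.
for name, rops, tmus in [("3080 mobile", 96, 192), ("3080 Ti mobile", 96, 232)]:
    pixel, texel = fill_rates(rops, tmus, 1.7)
    print(f"{name}: {pixel:.0f} GPixel/s, {texel:.0f} GTexel/s")
```

If the ROP count stays the same, as assumed here, the pixel fill rate doesn't move at all, which is the kind of gap between CUDA count and fill rate being pointed at.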
 
There are prebuilts at a good price; I think that Dell XPS with a 3080 Ti and an i7-12700 is down to about $3,000... which is not that bad.
 
$3,700 for Laptop 'Gaming'?
Added benefits include thermal throttling, constant temperature monitoring, poor battery backup, and a burnt lap!

No thanks.
I'm honestly surprised people still think this.
I have a nearly three-year-old gaming laptop with a 210 W desktop 2080 and an i9-9900K.

Now, I could go on about how this old thing is still within a few percentage points in Fire Strike graphics score (the ~170 W 3080 Ti laptops are getting around 33,000 graphics scores), but check out my last run from three weeks ago and look at the GPU's average temps. Most desktops would go way over that.

https://www.3dmark.com/fs/27054963

Now check the average GPU temps on the stress test.

https://www.3dmark.com/fsst/1909185
 
I've always found these top-tier gaming "laptops" to be a strange form factor, with most constraints being somewhat artificial in nature. While you do have to design for thermals, at the top-tier gaming level power constraints aren't all that important, because few of these machines ever run on battery alone.

Everyone I've known that had a top-tier gaming laptop bought it because it was easy to take with them for LAN parties, etc., not so they could play games in the park or what have you. So as long as the OEM can keep thermals in line, why not just slap the same silicon as desktop parts into them? IMHO, even with the revised GPU silicon, anyone expecting a good battery experience is going to be SOL.
 
There are two reasons (personal opinion) why you won't see a meaningful improvement in performance here:
1. The main elephant in the room is the power limit and thermal headroom, which we all know are very limiting in a laptop. The mobile RTX 3080 and 3080 Ti operate at a power limit lower than a desktop RTX 3060 Ti's, and in some cases lower than the desktop RTX 3060's. Even if they can increase the power limit, they can't get around the physical limits of the chassis to implement beefier cooling, unless they go back to very thick, heavy laptop chassis, which will not appeal to most.

2. The CUDA core increase is too small to make a meaningful impact. The product stack between the RTX 3090 and 3080 shows there is no meaningful performance gain from a small increase in CUDA cores. While Ampere and Turing are quite different, with the supposed 2x increase in CUDA cores from the RTX 2080 Ti to the RTX 3080 we see only around a 20-30% performance uplift on average, and that's with a significant increase in power limit, from around 250 W to 320 W based on Nvidia's TDP numbers (rough arithmetic sketched below the list). One can argue that we can undervolt the RTX 3080 and still get very decent performance, but I feel the same can be said about undervolting the RTX 2080 Ti.
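A rough sketch of the arithmetic in point 2, using the post's own numbers (4,352 vs 8,704 CUDA cores, 250 W vs 320 W TDP, and 25% as the midpoint of the 20-30% uplift range):

```python
# Scaling check: 2x the CUDA cores and ~1.28x the power budget for
# a ~1.25x average uplift, per the figures quoted above.
cores_2080ti, cores_3080 = 4352, 8704
tdp_2080ti, tdp_3080 = 250, 320
uplift = 1.25  # midpoint of the 20-30% range

core_scaling = cores_3080 / cores_2080ti   # 2.00x
power_scaling = tdp_3080 / tdp_2080ti      # 1.28x

print(f"perf per core vs 2080 Ti: {uplift / core_scaling:.2f}x")   # ~0.62x
print(f"perf per watt vs 2080 Ti: {uplift / power_scaling:.2f}x")  # ~0.98x
```

By these numbers, each Ampere "core" delivers well under two-thirds of Turing's per-core throughput and perf-per-watt barely moves, which is why piling a few more SMs into a power-starved laptop part buys so little. (Ampere doubled the FP32 units per SM, so Turing and Ampere core counts aren't directly comparable, which point 2 already concedes.)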
 