But with the death of Moore's Law, the only way we're going to see major performance gains in the future is either through increased power consumption or by abandoning silicon lithography entirely.
You forgot a third way: Marketing graphs and slides praising the newest AI solutions
Also, imho, it's not Moore's Law that is dead - rasterization is.
The next generation of GPUs could "save" Moore's Law by adding another 200% yearly increase of AI-generated frames to the mix. There is nothing to stop this now. And we could get the GPUs cheaper, as a future GPU won't need heaps of VRAM, die size and general silicon complexity to compute one frame and generate the other 99% of them. The next 1-slot, 1-fan RTX 6030 4GB with an improved DLSS 5.0 could be "faster" than the 5090 32GB, if they solve the latency problem and generate the textures on the fly. At least they will claim it is, like they did with the 5070 and the 4090.
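To make the arithmetic behind those marketing claims concrete, here is a minimal sketch (my own illustration, not anything from a vendor SDK). It assumes "Nx frame gen" means N displayed frames per rendered frame, and that input latency stays bounded by the rendered frame rate - which is exactly the "latency problem" mentioned above:

```python
def framegen_stats(rendered_fps: float, gen_ratio: int):
    """Return (displayed_fps, generated_share, frame_interval_ms).

    gen_ratio = total displayed frames per rendered frame
    (e.g. 2 for classic frame gen, 4 for multi-frame gen).
    Responsiveness is still tied to the rendered frame rate,
    which is why a bigger displayed-fps number on a slide does
    not make the game feel faster.
    """
    displayed_fps = rendered_fps * gen_ratio
    generated_share = (gen_ratio - 1) / gen_ratio   # fraction of AI frames
    frame_interval_ms = 1000.0 / rendered_fps        # one rendered-frame interval
    return displayed_fps, generated_share, frame_interval_ms

# 30 fps rendered with 4x frame gen: 120 fps on the slide,
# but 75% of frames are generated and the rendered-frame
# interval stays at ~33 ms.
print(framegen_stats(30, 4))

# Generating "the other 99%" of frames would need a 100x ratio:
print(framegen_stats(30, 100))
```

The point of the sketch: the headline fps scales linearly with the generation ratio while the silicon only has to render the base frames, so the slide numbers can grow every generation without the underlying hardware getting faster.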
There is nothing to stop this development. Unlike the data problems LLMs face now, there is no real data peak for training frame-gen models - you could generate training data endlessly with randomized user agents, so there is no limit in sight for this transition.
I hope you see my cynicism - I do not want this, but I fear this is what the future will hold for us: more and more "improvement" from the mixture of AI upscaling + frame gen + low-latency tricks.
And it's an industry-wide transformation of GPU technology. Just watch the other two (consumer) GPU companies - they are executing the exact same strategy. Intel has started selling the newest XeSS 2.0 as a perfect frame-gen software package locked to its newest generation of GPUs. Even their slides with doubled (or more) fps mirror the Nvidia CES slides and narrative.
AMD will do the same with FSR 4 (even locking the technology to the AI cores of its newest GPUs, which is unusual for them). Also, AMD's CEO has said she wants AMD to become a software company, so expect more emphasis on software, APIs and AI.
For me, it's a clear shift towards proprietary AI software solutions by Nvidia, AMD and Intel. Someday Nvidia will laugh in our faces, telling us (again) that 8 GB of VRAM is enough for Ultra resolutions with everything maxed out - just use generated frames and textures. And while old gamers will see the lie for what it is ("AI came to their rescue"), new gamers will accept the trinity of upscalers, frame generators and latency fixes as the standard.
Not what I want, but I fear the development is unstoppable now, as Nvidia leads the way and the other two companies will follow. So every new generation, we will see doubled or quadrupled frames bundled with proprietary software packages, and slides referencing the newest games from friendly studios that support those technologies.