I still don't get why everyone is so obsessed with VRAM these days. Games have been exceeding 8GB of textures for years now. Whatever doesn't fit in VRAM spills over into system RAM for a latency penalty so nominal that it's only noticeable if you're already CPU bottlenecked. Even if your system RAM is limited, upgrading it is far cheaper than the arm and a leg a higher-tier graphics card costs. If you're really that worried about VRAM, AMD offers 12GB for $400, 16GB starting at $550, and 20GB for $800. Even the Arc A770 offers 16GB for $350, as opposed to the $1,200 Nvidia expects.
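If you want to put rough numbers on that spillover penalty, here's a quick back-of-envelope in Python. The bandwidth figures and the 50MB spill size are assumptions on my part (round numbers for PCIe 4.0 x16 and a mid-range GDDR6 card), not benchmarks:

```python
# Back-of-envelope: per-frame cost of streaming spilled textures from system RAM.
# Bandwidth figures below are assumed round numbers, not measurements.

PCIE_4_X16_GBPS = 32.0   # assumed PCIe 4.0 x16 effective bandwidth, GB/s
GDDR6_GBPS = 448.0       # assumed VRAM bandwidth on a mid-range card, GB/s
FRAME_BUDGET_MS = 16.7   # one frame at 60 fps

def transfer_ms(megabytes: float, gbps: float) -> float:
    """Milliseconds to move `megabytes` of data at `gbps` GB/s."""
    return megabytes / 1024.0 / gbps * 1000.0

# Suppose a scene spills 50 MB of textures into system RAM each frame (assumed).
spill_mb = 50.0
print(f"PCIe fetch: {transfer_ms(spill_mb, PCIE_4_X16_GBPS):.2f} ms of a {FRAME_BUDGET_MS} ms frame")
print(f"VRAM fetch: {transfer_ms(spill_mb, GDDR6_GBPS):.2f} ms")
```

A spill on that scale costs about a millisecond and a half out of a 16.7 ms frame, which is why you mostly only feel it when something else is already eating the budget.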
Rumor is that there's a 16GB variant of the 4060 Ti, but I highly doubt it, as these cards were fully manufactured and distributed before the VRAM panic began. And that's not to mention that Nvidia wouldn't produce a 4060 Ti that out-specs the 4070 and 4070 Ti, knowing it would kill sales of both. This is the company that gave the 3060 12GB of VRAM and then cut the 3060 Ti down to 8GB, after all.
8GB of VRAM and 16GB of system RAM will suffice for a lot longer than people realize. The budget market is flooded with reclaimed office PC models running Haswell procs and 6GB 1660 Super GPUs that can still pass 60 fps at 1440p on medium settings. Consoles and mobiles will have memory issues because they use iGPUs that share system memory. Even then, the only one to have issues so far is the Xbox Series S, which only has 10GB of total shared memory. Other systems with 16GB+ should at least last until they're replaced in 3 years.
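The shared-memory squeeze is easy to put numbers on. The OS reservation figures in this sketch are my own assumptions for illustration, not published specs:

```python
# Rough budget math for a shared-memory console vs a discrete-GPU budget PC.
# OS reservation figures are assumptions for illustration only.

def game_budget(total_gb: float, os_reserved_gb: float) -> float:
    """Memory left for the game after the OS takes its cut."""
    return total_gb - os_reserved_gb

# Xbox Series S: 10GB shared between CPU and GPU, roughly 2GB held by the OS (assumed).
series_s = game_budget(10.0, 2.0)

# Typical budget PC: 16GB system RAM (~3GB for Windows, assumed) plus 8GB dedicated VRAM.
pc_ram = game_budget(16.0, 3.0)
pc_vram = 8.0

print(f"Series S game budget (CPU+GPU combined): {series_s:.0f} GB")
print(f"PC game budget: {pc_ram:.0f} GB RAM + {pc_vram:.0f} GB VRAM = {pc_ram + pc_vram:.0f} GB")
```

Even a budget 8GB card plus 16GB of RAM hands a game roughly double what the Series S can, which is exactly why the Series S is the one choking first.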
Nvidia's neural texture compression looks interesting, but there isn't a single game engine coded to use it. They'll probably charge game devs the same arm-and-a-leg rates to use it, just like DLSS, which will only drive up the cost of AAA titles. Expecting Nvidia to solve the problem when they've already established that they're far too cheap to solder on more memory modules is only going to make PC gaming more expensive. That's why I haven't even considered an Nvidia GPU since the GTX 770.
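For scale, here's the footprint math on a single 4K texture. The BC7 ratio is the standard fixed 4:1 for RGBA8, but the neural compression ratio below is an assumed placeholder, since nothing engines can actually call has shipped yet:

```python
# Memory footprint of one 4K texture under different compression schemes.
# The neural compression ratio is an assumed placeholder, not a published spec.

WIDTH, HEIGHT = 4096, 4096
BYTES_PER_PIXEL_RGBA8 = 4

raw_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL_RGBA8 / (1024 ** 2)
bc7_mb = raw_mb / 4       # BC7 block compression: fixed 4:1 for RGBA8
neural_mb = raw_mb / 16   # assumed ~16:1, illustrative only

print(f"Raw RGBA8:        {raw_mb:.0f} MB")
print(f"BC7 (4:1):        {bc7_mb:.0f} MB")
print(f"Neural (assumed): {neural_mb:.1f} MB")
```

Multiply that by a few hundred textures per scene and the ratios obviously matter, but until an engine actually ships with it, those are paper savings.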