All in all:
- most games can be played perfectly well on a mid-range 20- or 30-series card with 8 GB of VRAM at mid-to-high graphics settings, unless people really have high demands and go hunting for the differences
- manufacturers, with a push ($$$) towards game companies, try to artificially time-limit the usefulness of a graphics card so that you feel like you own a 12-year-old card.
Do people notice a difference, on a 65" 4K TV, between a 4K JPEG at 85% quality and a 100-megapixel RAW image downscaled to 4K, when the picture changes 60 times a second? No.
The same effect shows up with Dolby Vision vs HDR10, 1440p vs 4K, DLSS 1440p upscaled to 4K vs native 4K, a good JPEG vs RAW, and so on. People playing games hardly notice any difference; it's not the same as examining still images. (A rough way to put a number on the JPEG comparison is sketched after the list.)
- I will replace my RTX 3060 Ti 8 GB when I find a 4070 Ti equivalent with 16 GB of VRAM for 500 € or less, and that may happen with the 50 or 60 series. I'll wait; I still have fun at 1440p or 4K.
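
If anyone wants to sanity-check the JPEG claim themselves, here is a minimal sketch assuming Pillow and NumPy are installed; "frame.png" is a hypothetical 4K source frame, and quality 85 and PSNR are just illustrative choices, not a rigorous perceptual metric:

```python
# Minimal sketch: compare an original frame with its quality-85 JPEG re-encode.
# Assumes Pillow and NumPy; "frame.png" is a hypothetical 4K source image.
import io
import numpy as np
from PIL import Image

original = Image.open("frame.png").convert("RGB")

# Re-encode at JPEG quality 85, the kind of compression discussed above.
buffer = io.BytesIO()
original.save(buffer, format="JPEG", quality=85)
buffer.seek(0)
compressed = Image.open(buffer).convert("RGB")

# PSNR as a rough proxy for visible difference; roughly 40 dB and above is
# already hard to spot on a still, let alone at 60 frames per second.
a = np.asarray(original, dtype=np.float64)
b = np.asarray(compressed, dtype=np.float64)
mse = np.mean((a - b) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"PSNR of quality-85 JPEG vs original: {psnr:.1f} dB")
```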