At these price points I don't care what the card's name is. The 3070 had a 256-bit bus and a larger die size. I'm also not saying you have to max out the game, but if you want high-resolution textures at 4K, then a 192-bit bus is going to be a bottleneck.

Sorry, we can't agree here. A standard 4070 does well at 4K, so the "Super" variant will also do well.
And when I say "does well", I mean 4k with reasonable settings, customized and tailored for personal taste, NOT defaulted to "ULTRA MEGA MAXIMUM!!" settings.
That's really one of the reasons people are so angry about this generation of cards: not as many people care about high-end gimmicks as nVidia's marketing material would have us believe. There is now a large group of PC gamers who want an immersive experience playing games on their 4K TV, and something like a 192-bit bus becomes an issue there. It's one thing if you're paying $300 for a 7600 XT 16GB, but not $600 for a 12GB card. There are more households with 4K TVs than there are with high-refresh-rate monitors. While I have both, I find myself lying in bed playing single-player games on my 65" more than I do sitting at my desk using my 27" 1440p.
The 4070S (yes, I'm calling it the 4070S now), as I see it, is mostly just a 1080p card meant for competitive FPS games. The 4070 TiS will be a powerhouse.
That 192-bit bus would be fine, but once you start messing with shaders you start to see a real impact in frame stuttering and 1% lows, which is entirely unacceptable on a card that is marketed as a 70-class card. Which, we all know, it's not. They shrank the die size and the memory bus from the 3070. The 4070S is truer to a 70-class card, but I still can't help being angry about it.
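For what it's worth, bus width alone doesn't tell the whole bandwidth story, since bandwidth is bus width times the per-pin data rate. A quick sketch using the commonly cited memory speeds for these cards (treat the Gbps figures as approximate, and note that Ada's much larger L2 cache also changes effective bandwidth):

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_bits / 8 * gbps_per_pin

# Commonly cited spec figures -- approximate, not official measurements
print(bandwidth_gbs(256, 14))  # RTX 3070, 256-bit GDDR6 @ 14 Gbps -> 448.0 GB/s
print(bandwidth_gbs(192, 21))  # RTX 4070, 192-bit GDDR6X @ 21 Gbps -> 504.0 GB/s
```

So the narrower bus is partly offset by faster GDDR6X, which is why the stutter and 1% low behavior under heavy shader/texture loads matters more than the headline bus width on its own.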