Well, NOW you can get much better than 2080 Ti performance for $699. What's the problem?
"Titans are supposed to use around 250W which is important to the pro-sumers that use them. There is no Titan this time around, so they called what would have been the Ti version of the 3080, the 3090."
Sources, please?
"Here we have nVidia playing a tune and you're dancing to it"
Ahem... what "tune", exactly? Buy it, or don't.
"You'll see that the performance delta between the 2080 and 2080 Ti is about the same as the projected difference between the 3080 and 3090 which means that there isn't really enough room between them to fit another card without cannibalising sales from the two other cards."
You're missing the fact that there's an $800 price difference ($700 vs $1,500) between the 3080 and 3090. They could easily release a 3080 Super/Ti with 16 GB or 20 GB of VRAM for $1,000-1,100. Plenty of room for an in-between card here.
Um, it seems I triggered you and that wasn't my intent, so I apologise for that. I was only trying to show that it's not quite as good as it seems. I honestly don't know how they'll wedge a "Ti" version into this mix, but looking at the metrics of price and performance, they already have: the 3090 (which is pie-in-the-sky expensive, just like the 2080 Ti was). Is this better than before? Sure it is, but that's not exactly setting the bar high, eh?
I wanted to point out that this isn't as good as nVidia makes it out to be, because a lot of people have short memories. Is this good? Yes and no, because "good" is relative. Pascal was a much better launch for its time, and it's not even close. The GTX 1080 Ti was perhaps the best high-end video card value I've ever seen. That thing is still a monster today, and Pascal should be the benchmark against which all nVidia launches are compared, not Turing.
I just want people to temper their euphoria (and believe me, I get it) and remember that this isn't really all that great. Sure, it's great compared to how horrible Turing was, but it's still pretty bad compared to how great Pascal was. Pascal was a GIGANTIC LEAP from Kepler and the prices remained THE SAME, as they should have.
How long did we suffer through Intel's sandbagging because AMD's FX wasn't up to scratch, while we were too busy being happy about the little incremental improvements that Intel made? We were so preoccupied with how much better Devil's Canyon was compared to Ivy Bridge that we didn't realise what Intel was doing to us until after Ryzen launched. We needed AMD to wake us up because we couldn't see the forest for the trees. We need to be a little more cynical and a little less naive, even though that's not always fun (actually, it never really is). Yes, Ampere is better than Turing, and nVidia wants us to focus on that because it's to their advantage if we do. However, if we forget how good it can be, or fail to notice the bigger picture and the path we seem to be on, do you know what that makes us?
Boiled frogs, my friend. Boiled frogs. I don't want that for any of us, even if I have to be a bit of a killjoy to point it out.
As for my source on the prosumers and the 250W Titans, it's AdoredTV. Love him or hate him (I personally think Jim's awesome), nobody can really say he's wrong, because he backs up everything he says. I checked, and he was 100% right about the 250W Titans. I didn't even know they were 250W cards before Jim pointed it out, but it makes sense.