I want to see Nvidia release a Titan X x2 and set the price at $3K or above, so people can comment for days about the price being out of their range and recommend AMD's poor performance/watt cards, all because AMD is cheaper. Yep, that's what I want to see.
To get the maximum enjoyment, the merits of a $3K card versus a $1.5K card need to be argued over by a bunch of people with a three-generations-old midrange card or a laptop with integrated graphics.
The discussion needs to centre on FPS in a console port built on a 5-10 year old game engine, one that will inevitably be discounted by 70% in the next Steam sale.
As an aside, I don't see any gamers* looking at this card, but the graphics forums seem to have more than a few people queuing up to buy it for the other push Nvidia is making with the card thanks to its 12GB of vRAM: machine learning (and the new growth area of deep neural networks in particular), and of course rendering, which is now moving to 4K and higher and necessitating larger framebuffers.
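For a rough sense of the framebuffer side of that, here's a back-of-envelope sketch (my own illustrative numbers, not anything from Nvidia; the 4 bytes/pixel and four-target deferred pipeline are assumptions):

```python
# Back-of-envelope render-target sizing. Assumptions (mine, illustrative):
# 32-bit colour (4 bytes/pixel) and four render targets, as in a typical
# deferred-shading G-buffer; double buffering and MSAA are ignored.

def render_target_mib(width, height, bytes_per_pixel=4, num_targets=4):
    """Approximate VRAM consumed by render targets alone, in MiB."""
    return width * height * bytes_per_pixel * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "5K":    (5120, 2880)}.items():
    print(f"{name}: ~{render_target_mib(w, h):.0f} MiB in render targets")
```

The targets themselves stay well under a gigabyte even at 5K; what actually fills 12GB is everything stacked on top of them, namely MSAA, high-resolution textures, scene data for GPU renderers and, on the compute side, models and datasets.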
No doubt when the card proves to be more valuable in compute situations**, the mainstream gaming forums will remain as perplexed by its sales as they were with the original Titan, which became the de facto choice for 3D rendering.
For gaming this card has little practical application aside from not being vRAM limited and being a single GPU. A couple of R9 290s or GTX 970s will provide the same basic horsepower for a lot less... and in less than three months the 390X should debut, and if it lives up to the hype, so should an unrestrained GM200 with everything this card is missing (voltage control options, dual 8-pin power, custom cooling, higher clocks, etc.).
The raging arguments against the Titan range are exactly the same as those levelled against the G80-powered 8800 GTX: too expensive, with gaming performance largely equalled by a second-tier offering (the G92 8800 GTS). Yet that tight focus doesn't begin to scratch the surface of what the G80 achieved in creating a whole GPGPU industry. I suspect the same might one day be said of cards with sufficient resources to code and run simulations for deep learning.
* People with the wherewithal to actually purchase the hardware
** Yes, AMD does compute, but they have little presence in CG rendering and are exactly nowhere in machine learning.