Sounds awful.
If this card comes in priced under $699, I'd be surprised.
I don't think that surprised is the word... "shocked" would be more my reaction to it.
Considering the 4080 12GB (a.k.a. the 4070 Ti) lands in that 3090/3090 Ti range, which is only about 15-20% faster than the 3080, I'd venture to guess that the 4070 will be around 5% faster than a 10GB 3080. I say this because Nvidia needs to leave a gap there to fill with a future 4070 Super that'll release just before they drop their 5090 at $2100.
Only $2100??? What a BARGAIN!
Might be a hit because of the 200-watt TGP, but that will depend on price. I doubt it will be 200 watts, though; more like 225. They are upping the clock speed slightly compared to the 4070 Ti, meaning less headroom for overclocking and higher power usage at stock, core for core.
I don't think that anyone actually cares about power draw anymore. Just look at the people who bought RTX 30-series cards despite the higher power draw compared to the RX 6000 cards. Hell, just look at the people buying Intel 13th-gen CPUs despite some of them drawing more power than video cards!
Just being 200W won't be enough to sell this card. Fortunately for nVidia, the fact that it's in a green box is enough for a lot of clueless noobs to buy it.
At least they're not gimping the bus to 160-bit and putting 10GB on it.
Sure, but 12GB is still not enough. A level-7 card of this generation should have 16GB because, as we've seen, GPUs with RTX 3080/RX 6800 XT levels of performance can use 16GB of VRAM and will surely need to in the future. Putting 12GB on a level-7 card in the RTX 40 generation is just nVidia's way of saying "LOOK, we're giving you more VRAM!" and people will fall for it simply because the level-7 of the last generation set the bar so low with only 8GB (the same as an RX 6600). It is already generally accepted that 10GB was nowhere near enough VRAM for the RTX 3080, and I honestly believe that 12GB on this card will still be a limiting factor in the mid-to-long term. Sure, it will outlive the RTX 3080, but it still won't live as long as an RX 6800 XT, despite being newer and more expensive.
However, I can't really fault nVidia for doing it, because if people were dumb enough to buy a level-8 card with 10GB of VRAM, they'll certainly clamour to buy a level-7 card with more. After all, a lot of these clueless noobs don't realise that as GPUs get faster, they need more VRAM to ensure that the end user can leverage that speed for as long as the card is usable. We've already seen that the effects of insufficient VRAM can be catastrophic, especially when trying to use RT, something that nVidia pushes like crazy.
I expect 3080 performance from the 4070, but at much lower wattage and with slightly more VRAM.
I agree, and while it will look much better on paper to the clueless noobs, its longevity will still be severely hampered. Of course, the noobs who buy this stuff won't know that cards like the 6800 XT will still be viable when their card is finally unusable, so they won't have learnt anything.
It will sell just fine at $499-599 in this market. $499-549 would actually be great compared to what you can get for this money today... (last-gen and power-hungry cards).
I have a term for the thought of nVidia selling the RTX 4070 for $500-$600, and that term is "PDP" or "Pipe Dream Pricing", because I 100% expect that it will have an MSRP of no less than $700 USD. Count on it!
AMD, where is the 7800 series...? Do you want to compete or not?
Yeah, it's just insane what has been going on at AMD. The releases of Zen 1-3 and RDNA 1-2 were incredibly well done. Sure, Zen 3 and RDNA 2 got ruined by all the crap that followed, but that wasn't their fault. Their marketing was honest, forthcoming and amazingly transparent. I truly believe that the trust they built led to far more success than they ever would have achieved by over-hyping and under-delivering like they did with Bulldozer and RDNA 3. And don't get me started on their decision to release R9 X3D CPUs while not having an R5 X3D CPU.
However, when I think of the current conduct of AMD executives, I keep getting videos like this in my head: