GPU Pricing Update, April 2023: Is the Nvidia RTX 4070 Another Flop?

The 4070 uses 200 watts in gaming (the 3080 and 6800 XT draw 50-60% more) while delivering the same performance. I don't want one, but I understand why it's selling.

It's a good card for small-case builders. You don't want a 300-350 watt GPU in a small case...
Also, it has good availability, after ~2 years of bad availability and high prices.

Where are AMD's 7800 XT and 7800? Price cuts on ~3-year-old GPUs are not really what most people want. Most want new and shiny stuff with good performance per watt (and dollar). This is especially true for AMD. If AMD cards don't have way superior value, they won't sell. People will simply go Nvidia and get superior features and drivers if AMD doesn't undercut Nvidia a lot more than it does right now (with the 7900 XT and 7900 XTX).

The 7900 XT should have been $699 from day one.
The 7900 XTX should have been $899 to $949, tops.

They will drop eventually. AMD hardware always drops in price over time, and it's only a matter of time before a 4080 Ti hits at $1,200-1,300, which means the 4080 drops to the $899-999 bracket.
 
I'm starting to think Intel Arc could serve a large % of gamers who, like me, still game at 1080p. They are priced the way things should be priced.

When Intel actually moves away from using a 3rd party to make their GPUs, they can probably get highly competitive on perf per dollar. This is what they need to eat up dGPU market share.

Intel 4, 20A and 18A should allow them to make GPUs themselves while still being competitive in the low- to mid-range market.

Low to mid-range is AMD's prime segment when it comes to GPUs. Intel could easily eat up a lot of this while AMD is forced to pay TSMC's milking prices for its chips.

Most PC gamers still use 1080p or less. Cheap GPUs will always sell.
75% of Steam users are at 1080p or below.

High-end GPUs are a niche market, really. This is probably why AMD doesn't really want to compete here: making GPUs takes away from their CPU output, since both use the same nodes, and CPUs = more money per wafer.

Even Nvidia is moving away from its "gaming GPU" focus; less than 50% of its revenue now comes from gaming GPU sales. They are focusing on AI, enterprise, etc., where the true money is.
 
AMD is unlikely to be truly competitive in dGPUs until things change, as it has a better use for its silicon allocation. But if demand tanks thanks to the looming recession, the strategy might change, with a bigger focus on dGPUs and/or gaining market share.

Whatever happens with AMD, I am looking forward to Arc 2.0. Intel or AMD, I don't care, as long as the card is good and the Linux drivers are good and built into the kernel.
 
The 70 series is more for guys who wanted mid-to-high-end at $400. Now that everything is back to normal, no one is spending $600 on mid-tier.

Same with Intel and the 13700K: it isn't selling as well as its predecessor because it's just too damn much.

The 3060 Ti at $400 is what they are buying... there is literally no market for the 4070.
 
Prices will drop eventually? I can see Nvidia keeping prices up even if the cards are not selling. AIB partners may discount to the point of at least recovering their invested money. Some of them will go the EVGA way, leaving Nvidia to sell alone.
AMD and Intel may drop a few percent along with their AIBs, but don't expect a miracle.

P.S. I want to see whether prices change after the H1 financial results, or maybe Q3.

 
The big news for April is, of course, the launch of Nvidia's GeForce RTX 4070, which has been met with mixed reviews overall. The new GPU offers somewhat better value than the previous-gen GPUs it's replacing, but it's hardly an exciting uplift.
This and other comments make TechSpot's 90/100 score for the RTX 4070 all the more perplexing. 90/100 implies "strong value/good buy". Reading the comments here and on GamersNexus' YouTube review of the 4070, it seems like the community overwhelmingly feels that this product is not great value at $600 compared to prior gens (which I concur with). I think the 90/100 score should be revised accordingly, similar to other recent reviews.

As an aside, how is TechSpot arriving at these scores? Steve commented recently in the forums that he doesn’t apply the number, and value relative to prior generations doesn’t seem to factor into this as highly as it should.

The reviews themselves are well done; my criticism is focused on the scoring.
 
"Such low level of interest has surprised us a little as the 4070 is one of the better value cards on the market right now, but it seems that the PC community isn't interested in a new GPU just being slightly or somewhat better value than existing options; that isn't good enough and it's going to take a really compelling product offering outstanding value for things to pick up again."

Because it's still a terrible value overall. I paid $600 for my EVGA 1080 Ti Hybrid back in the day - and I thought that was a massive splurge at the time. Now they want that for an xx70-series, air-cooled card? No thanks.
 
When Intel actually moves away from using a 3rd party to make their GPUs, they can probably get highly competitive on perf per dollar. This is what they need to eat up dGPU market share.

Intel 4, 20A and 18A should allow them to make GPUs themselves while still being competitive in the low- to mid-range market.
20A and 18A won't be ready for mass production for a couple more years, and given that those fabs are going to be heavily used for Intel's primary products (and its hawking for foundry business from outside customers), that doesn't leave much scope for its dGPUs. Even if Intel has earmarked 20A/18A output for such dies, it's not going to make any headway in dGPU market share until then, at which point it may well be too late.

Besides, Intel needs to seriously rethink its dGPU design before it can truly compete in the mass-volume sector -- the current ACM-G10 is huge (406 mm2) for the performance it offers. The closest GPU to that, on roughly the same node, is the Navi 22 (335 mm2), and the likes of the RX 6700 XT noticeably outperforms the best Arc model using the G10.
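To put rough numbers on the perf-per-area gap, here's a quick sketch. The ~10% performance lead for the 6700 XT is an assumed, illustrative figure, not something measured in this thread:

```python
# Rough perf-per-mm2 comparison: ACM-G10 (Arc A770) vs Navi 22 (RX 6700 XT).
# The performance ratio is an assumption for illustration only.

G10_AREA = 406       # mm^2, ACM-G10 die
NAVI22_AREA = 335    # mm^2, Navi 22 die
PERF_RATIO = 1.10    # assumed: 6700 XT ~10% faster than the A770

# Normalize A770 performance to 1.0 and compare performance per unit of die area.
g10_eff = 1.0 / G10_AREA
navi22_eff = PERF_RATIO / NAVI22_AREA

print(f"Navi 22 delivers ~{navi22_eff / g10_eff:.2f}x the performance per mm^2")
# -> ~1.33x, i.e. Intel is spending ~33% more silicon per unit of performance,
#    which is why the design needs a rethink before it can compete on cost.
```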
 
20A and 18A won't be ready for mass production for a couple more years, and given that those fabs are going to be heavily used for Intel's primary products (and its hawking for foundry business from outside customers), that doesn't leave much scope for its dGPUs. Even if Intel has earmarked 20A/18A output for such dies, it's not going to make any headway in dGPU market share until then, at which point it may well be too late.

Besides, Intel needs to seriously rethink its dGPU design before it can truly compete in the mass-volume sector -- the current ACM-G10 is huge (406 mm2) for the performance it offers. The closest GPU to that, on roughly the same node, is the Navi 22 (335 mm2), and the likes of the RX 6700 XT noticeably outperforms the best Arc model using the G10.
It won't take years before 20A is ready for mass production; Intel said it is ahead of schedule and ready for 2024.

Intel 4 Q4 this year
Intel 20A next

Intel owns the iGPU market; they would not enter the dGPU market if they thought it was impossible. TSMC needs to be cut off, though - if not, Intel will not be able to make good money on this.

TSMC is a big part of why AMD chips are more expensive than ever. TSMC raised prices because it could: without TSMC, AMD would be out of business. AMD relies 100% on TSMC to deliver.

Without Intel in competition with TSMC, TSMC would just increase prices more and more over time. Samsung probably won't be up to the task.

Samsung 8nm = more like 10nm TSMC
 
Just don't read the TS score, only the article; you can score it yourself after reading/viewing 4-6 reviews.
I for one don't make purchases based on TS scoring; I hope you guys don't either.
100% agree, but I think a lot of people will still be influenced by the score.

I think TechSpot needs to review their scoring methodology; either eliminate the score, or provide some context for how it was derived (e.g., value, features, performance, efficiency, etc.).
 
100% agree, but I think a lot of people will still be influenced by the score.

I think TechSpot needs to review their scoring methodology; either eliminate the score, or provide some context for how it was derived (e.g., value, features, performance, efficiency, etc.).
As prices fluctuate over time, they would need to update scores a lot, because a bad-deal GPU can drop in price and become a good deal, so it's probably just better to remove the score. It never made any sense to me.
 
Nvidia is leading the charge in AI datacenters, and I think we're going to find out that they are not feeling these soft GPU sales nearly as much as we would like them to. Unfortunately, I just think that is going to be the case, and we might as well resign ourselves to the fact that gamers are not Nvidia's primary focus going forward. It doesn't mean we should buy ridiculously priced products. We might just need to consider moving on to AMD, and maybe, hopefully, Intel will stay in the GPU game too.
 
It won't take years before 20A is ready for mass production; Intel said it is ahead of schedule and ready for 2024.

Intel 4 Q4 this year
Intel 20A next

Intel owns the iGPU market; they would not enter the dGPU market if they thought it was impossible. TSMC needs to be cut off, though - if not, Intel will not be able to make good money on this.

TSMC is a big part of why AMD chips are more expensive than ever. TSMC raised prices because it could: without TSMC, AMD would be out of business. AMD relies 100% on TSMC to deliver.

Without Intel in competition with TSMC, TSMC would just increase prices more and more over time. Samsung probably won't be up to the task.

Samsung 8nm = more like 10nm TSMC
I think you need to go do your own research on the TSMC pricing subject.

Yes, they raised prices, but nowhere near the hikes you've recently seen.

I can dig it out, but wafer pricing for each TSMC node does get published. All said and done, a 7950X costs around £80 to make. They sell for £750.

I'm just saying TSMC isn't fully to blame here; the likes of AMD and Nvidia blamed their massive price hikes on TSMC, but that's mostly a lie.
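For anyone curious, here's the back-of-envelope math behind a figure like that. All the inputs (wafer prices, die sizes, yield) are assumptions pulled together for illustration, not official TSMC or AMD numbers:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-dies-per-wafer approximation, accounting for edge loss."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = (math.pi * wafer_diameter_mm) / math.sqrt(die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Assumed inputs (illustrative only):
N5_WAFER = 17_000   # USD per 300 mm N5 wafer (a commonly cited estimate)
N6_WAFER = 10_000   # USD per 300 mm N6 wafer (assumed)
CCD_AREA = 71       # mm^2, Zen 4 compute die (N5)
IOD_AREA = 122      # mm^2, client I/O die (N6)
YIELD = 0.80        # assumed fraction of good dies

ccd_cost = N5_WAFER / (dies_per_wafer(300, CCD_AREA) * YIELD)
iod_cost = N6_WAFER / (dies_per_wafer(300, IOD_AREA) * YIELD)

# A 7950X is two CCDs plus one IOD; packaging, test and binning are excluded.
print(f"CCD ~${ccd_cost:.0f}, IOD ~${iod_cost:.0f}, "
      f"silicon per 7950X ~${2 * ccd_cost + iod_cost:.0f}")
```

That lands in the same ballpark as the £80 claim, and the gap to the £750 retail price is margin, R&D, packaging and the rest of the stack - not the wafer.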
 
What's really lacking at the moment is an AMD competitor in the $500 range.
That's not really true, though. Annoyingly, the 6800/6800 XT still make a lot of sense in the current market. Maybe they'll tick down by another $20 when the 4060 cards come out, but they're already priced where the market says they should be for the performance.
 
That's not really true, though. Annoyingly, the 6800/6800 XT still make a lot of sense in the current market. Maybe they'll tick down by another $20 when the 4060 cards come out, but they're already priced where the market says they should be for the performance.
Maybe people are holding out to see what the 7800 XT/7700 XT has to offer. Seems like there has been dead silence from AMD, though. Not even any recent rumors on the 7800 XT. TechPowerUp has the 7800 XT at 60 CUs and only 12GB of VRAM, though (sounds like those should be the 7700 XT specs, not the 7800 XT). If that is the case, you'd probably be better off with a 6800 XT. I really hope that is not the case. Hopefully it gets 72 CUs and 16GB of VRAM like the previous generation.
 
The 4070 uses 200 watts in gaming (the 3080 and 6800 XT draw 50-60% more) while delivering the same performance. I don't want one, but I understand why it's selling.

You're saying the 4070 is "selling"?? Are you really reading this article??

It plainly states: "The bad news for Nvidia, which is good news for consumers, is that the RTX 4070 is selling badly even at the MSRP."

Try reading the article again before posting.
 
I'm starting to think Intel Arc could serve a large % of gamers who, like me, still game at 1080p. They are priced the way things should be priced.

I have been considering getting an Intel card just to tinker with and try out. Ultimately it could replace my daughter's RX 580, which still does fine at 1080p in the games she plays.
 
Really hope these things rot in a warehouse.
$600 for mid-range - just no.
Nvidia and AMD are doing a fine job of building up the console market.
Games-wise, beyond online multiplayer, what does PC offer that can't be had from a PS5 or Xbox Series?
 
You're saying the 4070 is "selling"?? Are you really reading this article??

It plainly states: "The bad news for Nvidia, which is good news for consumers, is that the RTX 4070 is selling badly even at the MSRP."

Try reading the article again before posting.
I know at least 5 people IRL who have picked one up so far. I bet the 4070 will show up on the Steam HW Survey in bigger numbers than the entire AMD 7900 series within 3-6 months.

The 4070 is selling just fine; availability is just high, which is why Nvidia is reducing supply now. They wanted to deliver a bunch at release - better than AMD's paper launches, yeah?
 
Really hope these things rot in a warehouse.
$600 for mid-range - just no.
Nvidia and AMD are doing a fine job of building up the console market.
Games-wise, beyond online multiplayer, what does PC offer that can't be had from a PS5 or Xbox Series?
PC is not just a stupid locked ecosystem where you have to buy overpriced games from a single store. Also, you can actually do work on a PC.

If you go all-in on console, you will get milked badly over time as well. Software is much more expensive, and accessories too. The console itself is the cheap part, and it needs to be cheap to lure people in.

Think printers. Sell cheap, then milk.

A console is not a PC. Locked ecosystem. Little to no modding. Very limited settings in games. The current gen already has a bunch of games locked at 30 fps, which is just sad. Expensive games. Paid online multiplayer. Paid cloud storage.

Over time, a console always turns to garbage. Devs are pushing games harder and harder, and we are closing in on the halfway point of this console generation. PS5 Pro rumours are out, with a probable launch in 2024, and so far the PS5 has had very few exclusive games.

Yeah, let's just forget consoles. Most people I know who got a console, including myself, have a PC too, and that is the primary platform.

I have an XSX, PS5 and PS4 Pro, plus an emulator box for pretty much all the other consoles. I won the XSX, or I would not have gotten that one. Waste of money when you have a PC.

PC + PlayStation for me is the perfect combo. I'd never settle for just the PS, that is for sure...
 