Nvidia and AMD Keep Dropping GPU Prices After New Products Flop

All of the low and mid-tier cards are DOA because a $500 console will deliver better performance.
No, it won't. The consoles are roughly equivalent to a $230 RX 6600 XT, or a $270 RTX 3060. The 4060 Ti is awful value, but it is still quite a bit faster (~30%) than the console GPUs.
 
They dropped prices a few dollars under MSRP for some products, and only in some regions (mostly the US). So there has been no change in prices here in Europe. There is no real value in current-gen products, so skipping this generation looks like the way to go.

Later edit: oh, I forgot, they also bundled some games. No value for me, since it's not a coupon/voucher that lets me get the game I want rather than whatever they choose to include.
 
No, it won't. The consoles are roughly equivalent to a $230 RX 6600 XT, or a $270 RTX 3060. The 4060 Ti is awful value, but it is still quite a bit faster (~30%) than the console GPUs.
Which is balanced out by consoles having no Windows performance penalty, and by the generally half-assed effort put into porting most games that were created for consoles.
Or quite simply compare actual game performance: in The Last of Us Part 1, it takes a 3070 to match the quality the PS5 delivers.
 
Inflation?
I don't see much inflation in PC components outside of GPUs. Have you seen RAM and NAND prices? Even CPU prices have only nudged up a bit. The GPU is eating half of the entire build cost now.
He didn't mean inflation of hardware components, based on what he wrote. He was talking about general cost of things: food, clothing, gas, electricity and so on.

Biggest one a lot of people feel is food prices. 24 months ago my typical trip to the grocery store was maybe $180 on average; today it's $280. I generally visit the grocery store about every 11 days on average. That's at least 33 visits a year, which means if I'm spending an extra $100 on average for every grocery visit, my cost has gone up about $3,300 a year for groceries.

2 years ago, I averaged just shy of $6,000 a year for groceries.
Today I average just about $9,300 a year - that's a 56% jump in cost overall.

2 years ago (June 2021) gas was roughly $2.80 a gallon in my area.
Today it's $3.50.
That's a 25% price increase, and the kicker of it is...
Oil prices in June 2021 were around $72 a barrel.
The oil price right now is roughly the same, yet gas has stayed higher.
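
For what it's worth, here is a quick back-of-the-envelope check of those figures, using the rounded numbers above (visit interval, per-trip costs, and gas prices as stated), which lines up with the totals quoted:

```python
# Sanity check of the grocery and gas figures above (rounded inputs from this post).
DAYS_PER_YEAR = 365
visit_interval_days = 11
trip_cost_before, trip_cost_now = 180, 280   # $ per grocery trip
gas_before, gas_now = 2.80, 3.50             # $ per gallon

visits_per_year = DAYS_PER_YEAR // visit_interval_days   # ~33 visits
yearly_before = visits_per_year * trip_cost_before        # ~$5,940
yearly_now = visits_per_year * trip_cost_now              # ~$9,240
extra_per_year = yearly_now - yearly_before               # ~$3,300
grocery_jump = (yearly_now / yearly_before - 1) * 100     # ~56%
gas_jump = (gas_now / gas_before - 1) * 100                # 25%

print(f"{visits_per_year} visits/yr: ${yearly_before} -> ${yearly_now} "
      f"(+${extra_per_year}/yr, {grocery_jump:.0f}% grocery jump, gas +{gas_jump:.0f}%)")
```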

As for hardware components, some stuff hasn't really gone up much, but the main pricing issue we see here is GPUs being marked up so high compared to where they were 2-3 years ago. Hopefully their prices will continue to come down.
 
"so the low consumer interest in new GPUs might be considered... normal?"

It is as simple as this:

- people for whom money is no object (a minority) only care about one thing: which is the best? I'll buy it. That's why the 4090, and to some extent the 4080, sold relatively well. These buyers don't buy because it makes sense; it's just "I want".

- people who are very budget-limited have a price target, like 200€, and will buy the best they can for that money, which is generally either a second-hand card from last gen or leftover old-gen stock.

- all the others (most buyers) buy conscientiously, and most try to stay in the 200 - 500€ range, which is the price of a new console. Last gen, prices were heavily inflated by mining (on the sellers' side); with this new gen, Nvidia and AMD changed the game, inflating prices themselves while cutting down what they offer. Well, buyers are not stupid...

All in all, both brands tried to milk the cow even when there was no milk left, and they reached a limit (more or less like Apple). 2023 cards are too expensive and too limited; it's outrageous that a "mighty" 4060 Ti only has 8 GB in 2023, or that a 4070 Ti has anything south of 16 GB of VRAM.

In 2023 I would expect a 4060 with 8 or 10 GB, a 4060 Ti with 12 GB, a 4070 with 12 GB, the 4070 Ti and 4080 with 16 GB, then a 4080 Ti with 20 - 24 GB, the 4090 with 24 GB, and a 4090 Ti with 32 GB.
 
Which is balanced out by consoles having no Windows performance penalty, and by the generally half-assed effort put into porting most games that were created for consoles.

No, it isn't. Digital Foundry has done several dozen tests of games running on PCs vs consoles, and they found the consoles perform about the same as an RTX 2070 Super on average (which performs the same as an RX 6600 XT), or like an RTX 2080 at most in a few absolute best-case scenarios. And those are tests running actual, released games in real time with a framerate counter on the screen. Any "console optimization" or "Windows performance penalty" (which is complete nonsense to begin with) is already accounted for in those comparisons.

In The Last of Us Part 1, it takes a 3070 to match the quality the PS5 delivers.

Complete and utter nonsense with no source. You can see from reviews all over the web, including right here on TechSpot, that even a 6700 XT already runs the game significantly better than the PS5 does at 1440p (which is the resolution the PS5 uses in performance mode).

Also, using one single game (which is a questionable port with VRAM issues) to compare PCs and consoles, instead of the myriad of tests done by DF, is just hilarious.
 
Wrong, stop spouting anti-console gibberish.
My 1080 cannot play Cyberpunk 2077 at all. Well, 15fps but frequently dropping into the single digits at 1080p isn't 'playing'. The PS5 can run it at 2240x1260 and mostly maintains 60fps. This is how I knew I needed an upgrade.
[Attached benchmark chart: CP2077_1440p.png]

Oh lookie here. Running at similar resolutions to what the PS5 is actually outputting, and only at 'high' settings, it appears that the 3070 is getting about 60fps too.
 
My 1080 cannot play Cyberpunk 2077 at all. Well, 15fps but frequently dropping into the single digits at 1080p isn't 'playing'.

LMAO

My RTX 2060 (similar performance to a 1080) can run Cyberpunk using Digital Foundry's optimized settings at around 60 FPS (ranging from 50 to 80) at 1080p. And before that I had a GTX 1060 (half of the performance of the 1080) and could still do 30 to 40 FPS. You're either blatantly lying, or have a very defective 1080.

Oh lookie here. Running at similar resolutions to what the PS5 is actually outputting, and only at 'high' settings, it appears that the 3070 is getting about 60fps too.

Except the consoles run Cyberpunk with a mix of low, medium and high settings. So this image you linked shows those GPUs running it not only at a higher resolution than the consoles' floor, but also at higher settings than the consoles use. Are you under the delusion that consoles run games with maxed out graphics settings?

And again, you don't need to do all of this (incompetent) guesswork to try to compare PCs and consoles, Digital Foundry already did it for you. You can go see on their videos that, for example, consoles match a RTX 2080 in AC Valhalla, Death Stranding and Gears 5 (the best case scenario), they match a 2060 Super in Watch Dogs Legion (the worst case scenario), and they match a RTX 2070, RTX 2070 Super, or RTX 3060, which is in-between the two extremes, in most other tests they did. And again, that's with carefully matched settings (or in some cases, like Hitman 3, the devs provided them the list of console settings) and with a FPS counter on the screen.
 
Consoles are not equal to PCs. The minimalist OS on a console cannot be compared to the full OS on a PC.
Optimisations are not the same between the two. I'm not talking about hardware; that's a different story. Yeah, both run the same game, but that's about all they have in common. If a console is what makes you happy, then stick with it.
Me, I had one console and will never get one again; I'm not going to pay twice to use the Internet.
 
Enough of the consoles argument, thank you. Stick to the actual content of the article.
 
IMHO, both Nvidia and AMD launched these products a tier too high.

The 4060 would have been well received if labelled and priced as a 4050. The 4060Ti would probably have been well received as a 4060 (or 4050Ti/4050 Super). In fact, they may even have done well enough with those labels if launched at prices only slightly lower than they did.

Buyers have become used to significant performance bumps each generation for the same level of card. Price increases could be put down to inflation (most of this generation's cards have launched at higher prices than the last). But to see the 4060 models only a tiny amount faster than the 3060s, especially when you can now pick up a 3060 Ti for £100 (25%) less than a 4060 Ti... it makes little sense to choose the latest-generation card right now.
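
As a rough illustration of that gap, here is a tiny sketch using hypothetical prices implied by the "£100 (25%) less" comparison above (roughly £400 for the 4060 Ti and £300 for the 3060 Ti, not exact street prices):

```python
# Hypothetical prices implied by the "£100 (25%) less" comparison above;
# actual UK street prices vary.
price_4060_ti = 400.0   # assumed £
price_3060_ti = 300.0   # assumed £
saving = price_4060_ti - price_3060_ti
print(f"£{saving:.0f} saved, i.e. {saving / price_4060_ti:.0%} less than the 4060 Ti")
```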
 
I've had my eye on the Sapphire 7900 XTX since they came out. It will just barely fit in my HAF XB, and it's very quiet. However, I just wouldn't pay the European price of €1,400+ for any GPU, and I said that if it drops below €1,000 I'd think about it again. Well, they are slowly dropping and getting there. But my current thought is that we're heading for the next gen at this stage, so I may just wait it out some more until then...
 
I just can't see the 406X series doing well at any price. I suppose there has to be a price they can eventually drop to in order to sell, but I suspect that at that price Nvidia would prefer to throw them into landfills instead, as has happened in the past in the video game industry.

I still think the 40 series is going to be short-lived. Nvidia overplayed their hand; they will come back with the 50 series, probably around when Battlemage launches, and if that's any good whatsoever it will cannibalize the mid-range. Once the mid-range is completely gone, Nvidia will struggle to regain mind-share in the consumer space. If they really have moved on to enterprise and AI, that's fine, I suppose.
 
With a shaky economic reality, I'd expect sales to keep plummeting when even the GTX 1060 from 5 years ago edges out the RTX 3060 (as does the GTX 1650), and even 2012/13-era 2/3/4 GB GPUs can play any game at 720p or 1080p resolution.

CPUs might as well follow suit at this point, as they are even less critical in terms of upgrades; even a 2008 Bloomfield-era i7/Xeon with 4c/8t or more can still play current games fairly well. Certainly, upgrading from anything 3rd-gen i7 or newer, or any AMD Ryzen with 4c/8t, is very much unnecessary for gaming.
 
Inflation, low wages, bad value of the GPUs (high prices) with a smaller-than-expected uptick in performance, virtually no change in VRAM, and, where applicable, a smaller bus.

Given I'm still using my Acer Predator Helios PH517-61 with a Ryzen 2700 and Vega 56, jumping to a Zen 4 Dragon Range CPU with 12 or 16 cores and an Nvidia RTX 4060 or 4070 would be a rather huge jump for me... and therefore worth my while (on a laptop, that is - I need a portable workstation for content creation/productivity and some gaming, and I'm waiting for the Acer Nitro 17 for that because Asus and others are just ridiculously overpriced for what they offer).

But yeah, in the desktop space, current GPUs just aren't worth it compared to last generation in terms of value, performance, and efficiency.
If a person already has RDNA 2 or RTX 3xxx series GPU, they should probably skip this generation and wait for the next, or just wait 3 or 4 years before they consider upgrading to start with.

The only ones who should probably consider upgrading are those on RDNA 1 and RTX 2xxx series GPUs - and even then, it will depend on their usage and overall needs (plus budget constraints).

If people keep refusing to buy this latest generation of GPUs, that will only force the prices to come down.

My guess is that RDNA 2 and RTX 3xxx still offer better bang for the buck, and that AMD and Nvidia are clearing those inventories to make room for the RDNA 3 and RTX 4xxx series.

 
Considering I know a few people who bought a liquid-cooled OEM RTX 3080 for $1,800 just over a year ago... the Radeon XT is a steal for any 1440p gamer.


nVidia just can't justify their prices without crypto.
That is why RTX flopped: CUDA is not needed in gaming, and most RTX sales were for crypto and creation, not gaming.
 
Welp, I just got infected by the FOMO bug, and the fever caused me to bite on that Amazon deal on the Red Devil 7900 XTX for $840 (with a $100 instant rebate and 5% back from the Amazon Prime card). It arrives this week, and I'm already planning on isolating just to make sure... you know, for society's sake.
 
Thanks to miners, the economy, and maybe Intel? I can't wait to upgrade from my tired RX 580, which I have tuned so finely that in theory it could be marketed as an RX 581, ha ha.
 
Well...
Another flop is coming our way in the form of the RTX 4060, reviewed today by many sites (with TS coming soon, I am sure), with nearly all reviews showing it costing 20% more than the Radeon 7600 without being much faster...

Both cards should be priced as $200 cards, not $250 and $300.
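
A quick check of that pricing gap, using the roughly $300 (RTX 4060) and $250 (RX 7600) figures from this post (actual street prices vary):

```python
# Rough check of the price gap between the two cards, using the ~$300 and ~$250
# figures mentioned above; actual street prices vary.
rtx_4060_price, rx_7600_price = 300, 250
premium = (rtx_4060_price / rx_7600_price - 1) * 100
print(f"RTX 4060 costs about {premium:.0f}% more than the RX 7600")  # ~20%
```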



 
Most of these cards are sadly out of stock in my country, the used market is insane (people asking like 340€ for a used 6700 XT when a new one goes for 370€), and the only new cards below/around 500€ are the 6700 XT and then just Nvidia with plenty of 8 GB garbage. I am not paying 300€ for a 12 GB 3060, which is 2.5-year-old technology, nor am I paying 370€ for a 6700 XT. I guess my 4790K will push on toward the end of Windows 10 after all.
 