Nvidia might cut the supply of RTX 4070 GPUs in response to slow sales

nanoguy

Posts: 1,355   +27
Staff member
Why it matters: The average selling price for a GPU at retail has more or less doubled in the past three years for both Nvidia and AMD. Intel's first Arc GPUs have yet to pose any significant competition for either company, so it shouldn't surprise anyone if Nvidia went out of its way to prevent prices from dropping.

Earlier this month, Nvidia launched the GeForce RTX 4070 – a highly anticipated mid-range graphics card based on the Ada Lovelace microarchitecture. Unfortunately, it arrived at a time when gamers are much more reluctant to pay a premium for a product that only looks like good value because GPUs, in general, are still overpriced.

The silver lining for this release is that it prompted AMD to respond with discounts on last-gen graphics cards like the Radeon RX 6800 and RX 6800 XT. After a few years of sky-high prices for the most crucial component in a gaming PC, it's nice to see the price wars heating up again.

Rumors from China say Nvidia may temporarily restrict the supply of RTX 4070 cards in response to slower-than-expected sales. The company flooded retailers with its newest GPU hoping that positive reviews would nudge gamers toward an upgrade, but the $600 MSRP turned out to be a bit too optimistic.

According to supply chain insiders, Nvidia recently informed AIB partners that it would pause production of RTX 4070 GPUs for at least one month to prevent an overstock issue and allow retailers more time to clear existing inventory. Some, like Novatech and Overclockers in the UK, have already started discounting models, including the Palit RTX 4070 Dual and the Inno3D RTX 4070 Twin X2, but the cards are still far from flying off the shelves.

If anything, we're glad to see gamers voting with their wallets when GPU manufacturers push their luck with a premium pricing strategy in the middle of an economic storm. We can only hope they take a more sensible approach with future products like the RTX 4060 Ti and mid-range RDNA 3 models from AMD, which should break cover later this year.


 
Now that the scarcity and crypto hype are over, they can't even give these things away. Beautiful, beautiful, beautiful. Twenty years ago I built a very competent PC for 500€. Now a video card alone costs more than that. Yeah, bye. My income really hasn't doubled in those 20 years. It is still approximately the same. --edit-- say, 20% more, to compensate for inflation.
 
"we can't drop the price, cut supply!"
*same unsold cards are still sitting there for the same reasons*
"okay, so the 12gb 4070 for $600 didn't sell, I think it's time to drop the 8gb 4060ti for $450 now"
*also sits on shelves unsold for the same reasons*

I'm going to call any card in the $1000+ range "ultra-high end". Ultra-high-end buyers didn't want the 4080; people in that price range can afford the best, will buy the best, and don't want second best. Whether the best costs $500 or $2000, they will buy it. The amount of money you can charge for a graphics card isn't some slider you can adjust. There are the ultra-high-end buyers, and then there is everyone else. To "everyone else", a $600-700 graphics card is ultra-high end. Changing the name isn't going to get people willing to spend $600 on a 3080 to spend $1200 on a 4080. They aren't going to go, "Oh well, 80-series cards are $1200 now, guess I have to spend $1200."

That's not how the spread of what people are willing to spend works.

There is nothing physically wrong with the 40 series. Take away the price tag and they're a really great line of products. Add the price tag back and people start seriously scratching their heads. What did nVidia's marketing team think releasing a 4070 at $600 was going to do? Have people cheering, "Our savior card is here! They finally heard us, us plebs can finally afford graphics cards again! Hail nVidia!"?

The 4070 MIGHT have been a $600 card if it came with 16 gigs of VRAM.

The worst part of all this is that cheap AMD 6000 series cards might start selling out soon and prices could go back up with nVidia acting like this. If you're on the fence about buying a 6000 series graphics card, now might be the time to do it.
 
Um… the 4070 was the top selling card that week at Mindfactory.

Although it would be interesting to compare that to sales numbers for other cards after release.
 
If The Last of Us hadn't tanked on 8GB cards, these would be selling much quicker. That and Hogwarts Legacy. Suddenly (yeah, I know) 12GB for $600 looks like a bad buy, unless you want to upgrade your GPU again in two years once all the new UE5 titles start churning through textures.
 
Remember the guy who posted here who bought a 3090 at the height of the crypto fever for over $2.5K and thought he got a bargain, too? NVIDIA and the Leatherman think most gamers are like that guy.

Two years of crypto madness made NVIDIA forego all logic and detach completely from reality.

They really, really thought there were millions of mindless, drooling gamer drones out there who would rush to pay over $600 for a -70 series card that ought to sell for around $350.

Time to rethink your pricing, NVIDIA. In the meantime, I happily game on my 1080 at 3440x1440 while the Leatherman turns red with rage.
 
Um… the 4070 was the top selling card that week at Mindfactory.
It still is, I believe, though AMD is selling more cards combined than Nvidia -- sales revenue for the two is pretty much the same.
 
Funny thing is how I see the fanboys spout inflation numbers.

Hey fanboys, can you tell me what $0 inflation adjusts for in 2023?
 
They're not referred to as nGreedia for nothing.

At this point, only AMD and Intel can save the gaming community because nGreedia sure ain't going to.
Nope, AMD just follows Nvidia (look at their overpriced 7000-series crap) and Intel has yet to make a decent product. I think the market itself will punish them for the extreme greed, and only then will they all rethink their strategy. Sure, for now they will revert to their old trick of manufacturing scarcity, but if people still won't give a fck about their cards, then they'll have no choice.
 
I'm proud of every consumer who said no to this card at $599.

I'm curious what will happen after a few more generations, when mainstream cards have caught up to 4K. I think desktop monitors first started creeping over 1080p primarily for non-gaming reasons, and that 4K will eventually become standard mostly because that's where the volume for TVs is. This in turn creates an opportunity for GPU makers to sell catch-up capability for the new, higher resolutions.

But will this go on forever? Is 8K going to be a standard in the living room or in monitors? If not, is there a large audience of people who will pay for increasingly higher framerates at 4K beyond 30/60/120, depending on the types of games played?

I wouldn't mind if developers and gamers decided that there is a point that is "enough" as far as pixel pushing goes, and invested game dev budgets in other, more exciting aspects of the medium.
 
But will this go on forever? Is 8K going to be a standard in the living room or in monitors? [...] I wouldn't mind if developers and gamers decided that there is a point that is "enough" as far as pixel pushing goes.
I've been wondering this myself. It's either pixel pushing or achieving true photorealism. The former was always easier, but now it's becoming redundant, because few people want to play on a TV-sized monitor at a PC desk, so 8K will be a niche. I've been dreaming of photorealism forever, and path tracing looks great and is hopefully a step in the right direction. Unfortunately, I fear there is consistent opposition to photorealism, to games becoming too real, so it may never happen even if we have the technology to do it. On the other hand, what else is there? Who's gonna buy the 20th-series Nvidia, as in a 20080 Ti, for a million fps at 4K or 8K?
 
But will this go on forever? Is 8K going to be a standard in the living room or in monitors?
I don't see 8K gaming becoming a thing, or at least not a big one. Going from 1440p to 4K, for me, is like going from 60Hz to 120Hz. I haven't gamed at 8K, but I've spent plenty of time comparing 4K and 8K displays in person, and I can't tell the difference between 4K and 8K on a screen smaller than 75 inches, and that's with me looking for it. If I walked into a room and someone told me a screen was 8K, I wouldn't be able to tell without walking up to it and actually looking at the pixels.

There are so many cool things coming that we could be using GPU power for that going from 4K to 8K is pointless. The biggest problem is that you have a really hard time telling the difference. The other is that people have started working on more advanced NPC AI in games using graphics cards; to me, having more interesting AI in, say, Skyrim or Cyberpunk is way more interesting than going from 4K to 8K. Another limiting factor is that there currently isn't a display, to my knowledge, that supports 8K at 120Hz, and 120Hz is a pretty important feature for people spending that kind of money.
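For anyone curious, here's a rough back-of-the-envelope sketch (assuming 8-bit RGB and ignoring blanking intervals and Display Stream Compression, so real links differ a bit) of why 8K at 120Hz is such a stretch: 8K has four times the pixels of 4K, and the raw signal alone would blow past HDMI 2.1's nominal 48 Gbps.

```python
# Rough pixel-count and raw-bandwidth comparison for 4K vs 8K at 120 Hz.
# Assumptions: 8 bits per channel RGB (24 bpp), no blanking overhead,
# no Display Stream Compression -- a simplification, not a spec calculation.

RESOLUTIONS = {
    "4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)": 7680 * 4320,
}
BITS_PER_PIXEL = 24       # 8-bit RGB
REFRESH_HZ = 120
HDMI_2_1_GBPS = 48        # nominal maximum link rate

for name, pixels in RESOLUTIONS.items():
    gbps = pixels * REFRESH_HZ * BITS_PER_PIXEL / 1e9
    verdict = "fits" if gbps <= HDMI_2_1_GBPS else "exceeds"
    print(f"{name}: {pixels / 1e6:.1f} MP, ~{gbps:.1f} Gbps raw at 120 Hz "
          f"({verdict} HDMI 2.1's {HDMI_2_1_GBPS} Gbps without compression)")
```

That works out to roughly 24 Gbps for 4K120 versus roughly 96 Gbps for 8K120, which is a big part of why 8K/120Hz panels are so rare.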
 
I just don't see where the value is at $600 for the RTX 4070. NVIDIA is pricing out the rest of us, customers who value our money and don't have a lot of it to throw at this stupid price point. I shudder to think how stupidly priced the RTX 4060 will be, and it will have even less VRAM, apparently making it useless for modern games as they eat up more and more of it for textures, etc.
 
Funny thing is how I see the fanboys spout inflation numbers.

Hey fanboys, can you tell me what $0 inflation adjusts for in 2023?
To understand inflation, you don't have to be a "fanboy", you just have to have brains.

Nevertheless, NV can take this card up their arse. 8GB just won't cut it in this range anymore.
 
I think desktop monitors first started creeping over 1080p primarily for non-gaming reasons, and that 4K will eventually become standard mostly because that's where the volume for TVs is.
Though back in the days of the GeForce 3/4 and the Radeon 8500 to 9700, you already had gamers using 1600x1200 monitors. I guess 4K and upscaling tech will be around for a while, and then geometry, textures, and RT/path-traced lighting will just keep getting more detailed.
 
You've got this all wrong. Nvidia is limiting production of consumer graphics cards because of the high demand for its H100 professional AI products, which are harder to make and take longer to produce. Elon Musk just ordered 10,000 H100s, and he isn't even one of their biggest customers. That's $350 million in sales, and even at $600 Nvidia would have to sell 583,333 4070s to generate that kind of revenue, which just isn't going to happen in a consumer PC market that is down by over 30% across the board. So why waste time making a product that isn't going to sell no matter the price when you can make a high-demand, high-margin product like the H100?
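As a quick sanity check on that arithmetic (taking the implied figure of roughly $35,000 per H100 from the numbers above as an assumption, not a confirmed price), here's a minimal sketch:

```python
# Back-of-the-envelope revenue comparison based on the figures above.
# Assumption: ~$35,000 per H100, implied by 10,000 units -> $350 million.

h100_units = 10_000
h100_revenue = 350_000_000              # dollars, figure quoted above
h100_price = h100_revenue / h100_units  # ~$35,000 per card (implied)

rtx4070_price = 600
rtx4070_units_needed = h100_revenue / rtx4070_price

print(f"Implied H100 price:        ${h100_price:,.0f}")
print(f"H100 order revenue:        ${h100_revenue:,}")
print(f"RTX 4070s needed to match: {rtx4070_units_needed:,.0f}")  # ~583,333
```

The 583,333 figure checks out: $350 million divided by $600 per card.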

AMD discounting its 2+ year old cards may be nice for consumers, but it's bad news for AMD's GPU division, because it means they have a lot of leftover stock they desperately need to sell at near cost before they can release anything new that can compete with the 4070 and 4070 Ti or the upcoming 4060 and 4060 Ti. In the long run, that just weakens AMD as a competitor to Nvidia.
 