Desktop graphics card sales reach lowest point since 2005

Daniel Sims

Staff
The big picture: Shipments in 2022 showed declines across the entire tech industry, but the slump in desktop GPU sales might be historic. In a year with multiple major product launches, the dramatic drop in shipments likely stems from multiple factors. Despite the market-wide fall, Nvidia managed to significantly increase its market share.

In Jon Peddie's 2022 GPU market summary, Q3 shows the lowest desktop graphics card sales total since at least 2005. Like many other products, GPUs received a sales bump during 2020 and 2021 from remote-working customers, but the hangover looks especially hard for this market.

Intel, AMD, and Nvidia combined shipped just under 6.9 million discrete desktop graphics cards in Q3 2022, a 47 percent year-over-year drop. For comparison, the 2008 economic collapse caused a similar 46 percent decline from Q4 2007 (an all-time market high coinciding with the launch of Nvidia's legendary 8800 GT) to Q4 2008.

The 2022 plunge hit team red harder than team green. While Nvidia suffered a 40 percent year-over-year fall in discrete desktop GPU shipments, AMD fell 74 percent.

The post-lockdown slump that has affected PCs, tablets, smartphones, and other devices is a primary factor, but another is unique to GPUs – the end of crypto mining. Jon Peddie's data includes users who bought GPUs for mining Ethereum, which inflated sales and prices in recent years. 2022's crypto winter, combined with the Ethereum merge that ended GPU-based mining, significantly disrupted the market.

Nvidia and AMD launched new flagship GPUs this year, debuting the RTX 4000 and Radeon RX 7000 series, respectively, but only in the high-priced enthusiast segment. 2023 could see somewhat better sales depending on how the companies price the upcoming mainstream entries in those graphics card lines.

Intel's troubled 2022 entrance into the discrete graphics arena with the Arc Alchemist series earned the company a 4 percent market share. Meanwhile, Nvidia's Q3 share rose 10.2 percentage points year-over-year to 86 percent, squeezing AMD down to 10 percent market share – a 52 percent decline.

Most of the 2020 and 2021 spike in GPU shipments went to notebook graphics, which also saw the sharpest drop. Dedicated desktop graphics sales have gradually declined over the last two decades as laptop and integrated GPUs became good enough for growing numbers of consumers.


 
"Shipments in 2022 showed declines across the entire tech industry, but the slump in desktop GPU sales might be historic."

well duh .. 2021 and 2020 forced everyone to buy "something" .. now no one needs a new card except the Jones 4090ti uber wide butt edition for sweaty grinds
 
If you have a 3070 or better, you've got everything you need. Frankly, I'm not sure there's going to be any really good reason to upgrade for a very long time. For most of us 4K is overkill and demands a costly CPU and ample RAM to keep from bottlenecking your graphics. QHD (2560x1440) is more than adequate for most of us and your mid-range chip will handle it just fine, plus there are a lot of really nice, affordable monitors in that resolution. The vast majority of gamers are way more interested in FPS and smooth loading of world data than near-photorealism.

Remember when the Nvidia CEO said
"price discounts on graphics cards are a thing of the past"?

Well, let's just say he couldn't have been more wrong...

I'm not sure you can call it a discount when it's really just "a bit less gouging". We're still way above a realistic MSRP for graphics silicon.
 
If you have a 3070 or better, you've got everything you need. Frankly, I'm not sure there's going to be any really good reason to upgrade for a very long time. For most of us 4K is overkill and demands a costly CPU and ample RAM to keep from bottlenecking your graphics. QHD (2560x1440) is more than adequate for most of us and your mid-range chip will handle it just fine, plus there are a lot of really nice, affordable monitors in that resolution. The vast majority of gamers are way more interested in FPS and smooth loading of world data than near-photorealism.

I'm not sure you can call it a discount when it's really just "a bit less gouging". We're still way above a realistic MSRP for graphics silicon.

This ^ ... unless you are a sweaty hardcore RTX gamer on 4k .. 3070 is fine
 
Nvidia has lost touch with reality with their pricing. They conflated "rich gamer" and "cryptominer" with "mass market." Not everyone has the disposable income to spend over $200 on a card, let alone $1,500, especially with inflation disproportionately impacting everyday things like food, fuel, and energy.
 
This ^ ... unless you are a sweaty hardcore RTX gamer on 4k .. 3070 is fine


and like I said about discrete GPUs being a cyclical, terror-filled market: if it's already hard for established players like AMD to sustain market share, you can imagine the pain new entrants like Intel are forced to endure!

it's going to take a consistent sea of discounts for Arc to make any double-digit headway. I expect the "new magical never-ending growing desktop market" for discrete cards to be just as weak!
 
OH NO! Anyway…

P.S. Is it me, or do these JPR reports draw what people or corpos want to see rather than real data? Like wut? Peak sales in 2007? I mean, I know the 8800 GT was great, but...
 
Do people actually notice the minimum requirements for new games? 15 years ago, "minimum requirements" were taken seriously because you needed a recent card (DirectX and pixel shader support) to play the latest titles.

Today, if you have a six-year-old GTX 1060, you can play almost everything at low settings. It's crazy how often we used to upgrade back in the day. Thank god for DirectX feature level 12, huh?
 
Honestly, there is little reason to upgrade from a high-end 2015 GPU onward at this point for 1080p; even 2013-era high end like a GTX 780 or R9 290 can do fine on low-medium settings. Likewise, a 2018-onward GPU works great for 1440p. Ray tracing to me is just an unnecessary gimmick/fad like the overblown bloom of the 00s; not realistic, overly done... and, in the case of ray tracing, ridiculously performance-degrading.
 
Not really surprised people are tired of paying $1,000+ for video cards. And to the 100k people that bought 4090s: with Nvidia having so much market share, you are the problem, and prices will never go down on the NV side.

Looks like being a hardcore RTX guy is bad for the general market.
 
Economic circumstances are such that many people need to think twice or longer about big purchases.

It is encouraging that Nvidia is still able to grow market share in such a challenging environment.
It makes sense that more discerning and better educated buyers go for higher quality products like Nvidia.
 
The prices of Nvidia's 4080 and 4090 are outrageous and I won't be buying either of them. Shame on Nvidia (and on AMD for taking advantage and following the trend).
 
Economic circumstances are such that many people need to think twice or longer about big purchases.

It is encouraging that Nvidia is still able to grow market share in such a challenging environment.
It makes sense that more discerning and better educated buyers go for higher quality products like Nvidia.
I don't think Nvidia "grew" market share as much as AMD and Intel lost ground. Likewise, I think a lot of people (like me) are in a holding pattern because not all of these GPUs came out at the same time. Once we get the 4070 Ti benchmarked, I think people will then make their purchase decisions. And, frankly, at $800+ I doubt any of these GPUs will sell well. Prices for a high-end card should be $700-800, and a mid-range card maybe $550-650, but it seems like this isn't happening, and until both companies feel a little pain over lack of sales I doubt any of these cards will get major discounts.
 
Well, just hold your fire and ignore the hype; if we can do that for this whole GPU gen, we might see magic in prices next gen. After AMD's pricing of the 7900 XTX, I'm seeing another good sign here.
 
Honestly, there is little reason to upgrade from a high-end 2015 GPU onward at this point for 1080p; even 2013-era high end like a GTX 780 or R9 290 can do fine on low-medium settings. Likewise, a 2018-onward GPU works great for 1440p. Ray tracing to me is just an unnecessary gimmick/fad like the overblown bloom of the 00s; not realistic, overly done... and, in the case of ray tracing, ridiculously performance-degrading.

Dunno about that... Playing Control or Bright Memory: Infinite with ray tracing effects on (and DLSS 2 too) has been a blast...
 
Things that don't help GPU sales *besides* the price (which doesn't really help any):

1) I went to get a GPU to pop into my Ivy Bridge desktop. Problem? The ONLY cards on the market that run off the 75 W PCIe slot power and don't require extra power connectors are some 20-year-old GeForce2s that for some reason are still on the market, and the Nvidia GTX 1650 (and not all of those – some need the extra power connector and some don't!). Popping a video card in there is easy; figuring out what monstrosity of a power supply an HP Compaq Elite 8300 CMT uses, so I can replace it with a higher-wattage one? I have no idea. I also have no idea whether some 150 W GPU would get enough airflow or just melt down.

2) If you get a new system, the AMD Ryzens have lovely GPUs built in. The newer Intel GPUs are great too. I had a Ryzen 3500U recently and that ran every game I threw at it; I now have an 11th gen Intel Core system (with 12th gen GPU) and that also runs everything I throw at it. The Linux drivers are fast and compatible. I'm not worried about 4K gaming, or running games at 120 or 240 Hz. In general I can run games at high settings, but if I threw Cyberpunk 2077 or something on there, I'm not going to sweat it if I have to run a game at medium every now and then.

So the sale of cards for older systems is limited by lack of product availability, and the sale of cards for newer systems is limited by onboard GPUs being way better than they were 5 or 10 years ago; it's now perfectly reasonable to expect to run games at decent settings and frame rates using the onboard GPU.
 
If you have a 3070 or better, you've got everything you need. Frankly, I'm not sure there's going to be any really good reason to upgrade for a very long time. For most of us 4K is overkill and demands a costly CPU and ample RAM to keep from bottlenecking your graphics. QHD (2560x1440) is more than adequate for most of us and your mid-range chip will handle it just fine, plus there are a lot of really nice, affordable monitors in that resolution. The vast majority of gamers are way more interested in FPS and smooth loading of world data than near-photorealism.



I'm not sure you can call it a discount when it's really just "a bit less gouging". We're still way above a realistic MSRP for graphics silicon.

Yeah, the thing is the 30 series, more specifically the 3080 and 3090 cards, actually unlocked reliable high frame rate 4K gaming, even without DLSS. It was a major leap over the 10 and 20 series because of that. And with DLSS their useful lives will be extended further. I still don't see RTX as that big of a selling point and I'm sure many people would forego frame generation, so the 40 series offers no significant improvement over what came before, especially considering what you're paying. As a bonus, AIB 30 series cards still use standard 8-pin connectors.
 
I don't think Nvidia "grew" market share as much as AMD and Intel lost ground.
It didn't. Those numbers are units shipped, not sold. It's a known fact that shops have plenty of old Nvidia stock sitting unsold, which means Nvidia will ship far fewer GPUs in the near future.
 
I feel that the prices are getting too high for an individual to justify the purchase, given the current state of the U.S. economy. There just isn't enough "fun money" around for most people. There would be more GPUs sold if they were cheaper.
 
Graphics cards are much more expensive than they used to be. Spend $300 and you still feel like you've bought a bit of a slow lemon.
 
The volume-sales model is dead anyway. The established players know the ceiling is much higher for enough consumers that simply charging more, knowing fewer can or will pay, is indeed sustainable. All the more reason everyone should be rooting for Intel to succeed in the discrete graphics market.
 