Global chip shortages expected to last into 2022

midian182

Staff member
Forward-looking: Are you hoping the worldwide chip shortage will come to an end soon? Sadly, the current situation of demand far outstripping supply is predicted to last for another three or four quarters, followed by a further one or two quarters before inventories reach normal levels. If that's accurate, it'll be sometime in 2022 before the industry gets back to normal.

Pretty much every market whose end products use chips has suffered in the global shortage, with automobiles, PC hardware, and games consoles hit hardest.

The biggest issue has been Covid-19 and the stay-at-home orders. With most of the world suddenly switching to remote working, and home-based entertainment becoming the norm as everything from bars to cinemas paused trading, demand for PCs, laptops, TVs, consoles, etc. reached unprecedented levels.

Exacerbating the issue are the increasingly complex manufacturing processes making chips more difficult to produce, the larger number of chips in every device, logistical problems, and packaging shortages. The China/US trade war also saw companies trying to stockpile chips in advance, putting further pressure on manufacturers.

Wafer capacity leaders

Company              Monthly wafer capacity   Share of global capacity
Samsung              3.1 million              14.7%
TSMC                 2.7 million              13.1%
Micron Technology    1.9 million+             9.3%
SK Hynix             ~1.85 million            9%
Kioxia               1.6 million              7.7%
Intel                884,000                  ~4.1%
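As a quick consistency check on the table above, each company's monthly capacity divided by its share should imply roughly the same global total. This is a back-of-the-envelope sketch using only the article's figures; the implied ~20-21 million wafers/month total is my own inference, not a quoted statistic:

```python
# Sanity-check the capacity table: monthly wafers / global share
# should imply approximately the same worldwide total for every row.
# Figures are from the article; the implied total is an inference.
rows = {
    "Samsung": (3.1e6, 0.147),
    "TSMC": (2.7e6, 0.131),
    "Micron Technology": (1.9e6, 0.093),
    "SK Hynix": (1.85e6, 0.090),
    "Kioxia": (1.6e6, 0.077),
    "Intel": (884e3, 0.041),
}

for name, (wafers, share) in rows.items():
    implied_total = wafers / share
    print(f"{name:>18}: implies ~{implied_total / 1e6:.1f}M wafers/month globally")
```

All six rows land between roughly 20M and 21.6M wafers/month, so the table's shares and capacities are internally consistent.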

With production already struggling to meet demand over the previous two years, the problem came to a head in 2020.

MarketWatch writes that normality is still a long way off. "We believe semi companies are shipping 10% to 30% below current demand levels and it will take at least 3-4 quarters for supply to catch up with demand and then another 1-2 quarters for inventories at customers/distribution channels to be replenished back to normal levels," said Harlan Sur, an analyst with J.P. Morgan.
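Working the quoted timeline forward makes the "sometime in 2022" headline concrete. A minimal sketch, assuming the clock starts in Q1 2021 (the starting quarter is my assumption, not stated in the quote):

```python
# Quarter arithmetic for the J.P. Morgan timeline: 3-4 quarters for
# supply to catch up with demand, then 1-2 more for inventories to
# refill. The Q1 2021 starting point is an assumption.

def add_quarters(year: int, quarter: int, n: int) -> tuple:
    """Advance a (year, quarter) pair by n quarters; quarter is 1-4."""
    total = (year * 4 + (quarter - 1)) + n
    return total // 4, total % 4 + 1

start = (2021, 1)
best = add_quarters(*start, 3 + 1)   # 3 quarters supply + 1 inventory
worst = add_quarters(*start, 4 + 2)  # 4 quarters supply + 2 inventory
print(f"Best case:  Q{best[1]} {best[0]}")   # Best case:  Q1 2022
print(f"Worst case: Q{worst[1]} {worst[0]}")  # Worst case: Q3 2022
```

Either way the arithmetic lands in 2022, matching the article's headline estimate.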

Susquehanna Financial analyst Christopher Rolland said chip shortages would worsen heading into spring as economies reopen on the back of easing lockdowns and continuing vaccine rollouts. Lead times—the length of time between an order being placed and delivery—are entering a "danger zone" of above 14 weeks, the longest they've been since 2018.

"We do not see any major correction on the horizon, given ongoing supply constraints as well as continued optimism about improving demand in 2H21," wrote Stifel analyst Matthew Sheerin. "We remain more concerned with continued supply disruptions, and increased materials costs, than we do an imminent multi-quarter inventory correction."

While all this might be good news for chipmakers who have seen their stock prices soar, it's leaving consumers frustrated as attempts to buy the latest products prove fruitless. Wafer producers increased their output 40 percent in December, and the Biden administration is getting involved, but this problem isn't going away anytime soon.


 
Of course, a cynic might say that when these chipmakers can make more profit selling their in-short-supply products at inflated prices than they could ever make at normal margins and production levels, there really isn't much incentive for them to turn the production taps back on too quickly, is there?
 
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
 
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
could be their supply chain that is highly impacted?
 
Protective gear is there to protect against carcinogens and trap dust, not viruses. Outside of a full pressure suit or active respirator, there is no way to stop viruses from spreading. And the company would have to abide by government guidelines limiting the number of people in an area, etc.

Not to mention even if TSMC isn't affected, their suppliers may be. Substrates have been in chronic short supply for months now.
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
 
Of course a cynic might say that when these chip makers can make more profits selling their in short supply products at the inflated prices as they could ever make selling them at normal profit margins and production levels, there really isn't much incentive for these companies to turn the production taps back on too quickly, is there.
The production "taps" are already at 100% so your conspiracy theory is silly.

And I know this is shocking but you cannot build a new factory that creates bleeding edge technology at nanometer dimensions overnight. It takes *years* to build. Google the new fabs going up in Austin if you don't believe me.
 
Glancing at the Wafer Capacity Leaders chart quickly, I thought for intel it said:

-4.1% (negative 4.1%) and I was thinking that's bad...Intel is so bad at this they're actually having a negative impact on overall wafer capacity and bringing the numbers down.
 
Yeah, the number of people in here who think increasing production levels is as simple as 'make conveyor go faster' is astounding. They would fit right in with upper management at some of these companies who probably are raging that they can't increase their supply 'now' to 'take advantage' of the higher prices (ignoring the fact that if they succeeded in doing that, prices would fall again in fairly short order).
 
It's hardly surprising. Demand is going to continue to be high for another few quarters until normal life resumes. New capacity can't come online overnight, and while some sectors (e.g. automotive) will have continued increased demand due to market shifts, I would not be surprised if consumer demand falls in 2022 and 2023, as people clamor for more real-world interactive activities as opposed to virtual ones. Thus if you invest too much in increasing capacity now, you risk being caught holding the bag when demand falls to previous levels, or potentially lower for a time.

The crypto boom is exacerbating this; we saw a similar situation last time where nVIDIA and AMD knew demand was likely to crater, and were thus sensibly hesitant to invest more in production.
 
Glancing at the Wafer Capacity Leaders chart quickly, I thought for intel it said:

-4.1% (negative 4.1%) and I was thinking that's bad...Intel is so bad at this they're actually having a negative impact on overall wafer capacity and bringing the numbers down.

It's a tilde (notation for approximate), not a minus sign.
 
I'm starting to think that TSMC's stance that they're not going to increase production capacity may not have been as good an idea as they originally thought. They're leaving a crap-tonne of money on the table that disappears every time someone who was going to buy an AMD product ends up with something else due to a lack of choice in the matter.
 
I don't think supply is going to improve for years.
China controls the rare-earth metals, and many of their most profitable companies have been on the receiving end of US inspired tariffs. So far they have only turned the screw a tiny bit...
 
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
Because supply chains across the world are heavily hampered. You need raw materials etc. The ports in Australia are wayyyy behind where they would be without covid. Stuff is delayed months more than usual timelines.
 
Because supply chains across the world are heavily hampered. You need raw materials etc. The ports in Australia are wayyyy behind where they would be without covid. Stuff is delayed months more than usual timelines.

Yup, this is exactly why I stopped my hankering for a new video card before next year!

The added costs of each port's COVID routine, combined with the reduced number of ships on the ocean means nothing short of worldwide vaccination is going to fix this.
 
Yup, this is exactly why I stopped my hankering for a new video card before next year!

The added costs of each port's COVID routine, combined with the reduced number of ships on the ocean means nothing short of worldwide vaccination is going to fix this.
My vid card blew up last November. Still waiting on my 3080. Got a pre-order in but the queues have been insane.
 
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
It is far more complex than that.
The demand rose dramatically while production slowed down due to many logistic issues.
It is not like the foundries are closed. It is just difficult to transport everything around the world, including raw materials needed to keep the industry going on.
 
Why does COVID have anything to do with this shortage? Don’t these people normally work in a giant clean room dressed from head-to-toe in protective gear everyday anyway? Should be just like a hospital ....they didn’t shut down, they ramped-up. If anything we should have an oversupply of chips and an undersupply of boards.
Welcome to the world of just-in-time ordering, where no one keeps more than a couple of weeks' worth of supplies in stock. It doesn't matter WHERE in the supply chain a problem occurs; if it lasts more than two weeks, production will come to a halt.
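The just-in-time point above can be sketched with a toy model: a factory holds a fixed buffer of parts, and any supplier outage longer than that buffer stops the line one-for-one. All numbers here are illustrative assumptions, not industry data:

```python
# Toy just-in-time model: a factory holds a fixed buffer of parts and
# otherwise receives exactly what it consumes each week from its
# supplier. All figures are illustrative, not industry data.

def weeks_of_lost_production(buffer_weeks: int, outage_weeks: int) -> int:
    """Weeks the line sits idle when the supplier stops shipping."""
    # The buffer absorbs the first `buffer_weeks` of the outage;
    # anything beyond that halts production one-for-one.
    return max(0, outage_weeks - buffer_weeks)

# A two-week buffer hides a two-week outage entirely...
print(weeks_of_lost_production(buffer_weeks=2, outage_weeks=2))  # 0
# ...but a six-week outage costs four weeks of output.
print(weeks_of_lost_production(buffer_weeks=2, outage_weeks=6))  # 4
```

The asymmetry is the point: short disruptions are invisible to customers, while anything past the buffer shows up as missing product almost immediately.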
 
This is a false shortage because it's a result of consumerism. We've seen from those scaling tests that even Ryzen 1 CPUs are perfectly fine for gaming and as I've discovered personally, modern games that are graphics-intensive can still run decently on my old R9 Fury.

So, if I'm able to play Godfall, a game that is known for its intense graphics requirements just fine on that old R9 Fury, then anyone with a GTX 980 Ti or GTX 1070 or better is just whining for the sake of it because they're both faster cards than the Fury.

People's ability to play games hasn't been hampered by this "shortage". Our society is just full of marketing-programmed babies that have this idea that they're somehow "missing out" because they don't have the "latest and greatest" hardware. Corporate marketing has put it in their heads that unless they're playing a game at 1440p ultra or better at 120fps, their experience is just terrible despite the fact that an R5-1600X or R7-1700 with a GTX 1070 will give them an enjoyable gaming experience that is easily better than any console previous to the current generation. How on Earth did these whiny-babies survive the past four years?

Sending my RX 5700 XT back for RMA showed me that I was enjoying Godfall and Assassin's Creed: Odyssey just as well on my Fury as I was on my 5700 XT. The way I knew this is that when I was playing, I completely forgot that I was using my Fury because even though I had turned some things down (like shadows), both games still looked glorious at 1080p. I realised just how well the Fury was working when my replacement card arrived and it caught me by surprise on a Tuesday. So good in fact, that I didn't bother installing the 5700 XT until the next weekend and I didn't go back to 1440p. Since I couldn't see much of a difference between them, if any at all, I decided that I'd rather play at 1080p and reduce the work of my card by about 50%.

Anyone who has a GTX 1070 or better GPU and a R5-1xxx or better CPU and feels some dire need to upgrade is suffering from what I've decided to call CMS, Consumerism Marketing Syndrome. It's the same illness that makes people think that paying over $1000 for a fracking cellphone is reasonable.

Now, don't get me wrong, I get it. I grabbed my RX 5700 XT because I knew that my R9 Furies were aging and I wanted to see what playing a modern game at 1440p was like. It was nice and all but I really didn't feel like it was THAT much better so I tried running games first at 1080p and then upping it to 1440p. I honestly wondered if there was something wrong with me because I didn't really see a difference.

So, I ran the Far Cry 5 benchmark at 1080p, 1440p and 2160p on my 55" TV and stood right in front of it, trying to see a difference. As strange as it sounds, I ultimately had to accept the fact that I was unsuccessful in seeing any difference. Everything was crisp and clear regardless of which resolution I had it set to, and I was standing close to a 55" display. A screen that large would make telling the difference easier, not harder, and I was standing right in front of it with my face only inches away, looking at things like water quality, edge pixellation, foliage quality and overall appearance. If I were doing a blind test, I wouldn't be able to tell one from the other on a 55" display, which means that nobody else would be able to see a difference on a 30" or smaller no matter how good their eyes are.

My advice, listen to Sheryl Crow's song "Soak Up The Sun" and pay attention to one of the most important lines that I've ever heard in music history about the secret of happiness:
"It's not having what you want, it's wanting what you've got."

The key to weathering this "shortage" is to just be satisfied if your hardware runs a game smoothly and lose yourself in it because regardless of what nVidia's marketing BS says, that's the way games were meant to be played.
 
So if the chip shortage is expected to last until 2022, then really it will last until 2023 or 2024, as the new chips/CPUs/GPUs are due out in 2022.
 
This is a false shortage because it's a result of consumerism. We've seen from those scaling tests that even Ryzen 1 CPUs are perfectly fine for gaming and as I've discovered personally, modern games that are graphics-intensive can still run decently on my old R9 Fury.

So, if I'm able to play Godfall, a game that is known for its intense graphics requirements just fine on that old R9 Fury, then anyone with a GTX 980 Ti or GTX 1070 or better is just whining for the sake of it because they're both faster cards than the Fury.

I can play Godfall on my 980Ti. In fact my 980Ti OCs well enough that it outperforms the 1070 by a small margin.

I'd really like to get a new GPU because my soon to be 6 year old 980Ti is starting to act up with random fan spikes to max RPMs. This happens while gaming or even when sitting idle.

I do plan on pulling her all apart next week or soon after once I get some thermal pads in since I've had a few suggestions it could be old, dried up TIM causing some odd heat readings and the fans spin up fast for a few seconds.....or perhaps the fans are failing....or perhaps the fan controller on the card is failing.....

I'd much rather get a replacement, new GPU before my 980Ti takes a dump and I'm left without a gaming system. Sadly, though, there are no viable GPUs available to replace my current one with because they're all sold out "due to supply constraints".

Then the only other ways to pick up a new, acceptable GPU at the moment are:
* hunt websites and hope you are extremely lucky at getting one before it instantly vanishes
* get on a waiting list, which doesn't seem to move because of a lack of retail inventory, or because AIB manufacturers aren't making enough to sell to folks on their lists and are instead routing GPUs to pre-built companies and miners/scalpers that buy on the backend in massive quantities
* hit up ebay/amazon to find a GPU that was originally $400, now priced at a 250% markup.....if not more.

Yes, folks will complain. Some have legit reasons and others complain just to complain because they can.

I was all ready with money set aside for a new GPU - only to watch them all fly off the shelves in record speeds and then barely anything comes in for the higher-end cards. All while you see swaths of GPUs being used in crypto farms or scalpers botting up the remaining few that do hit the retail end and resell them for 2/3/4x their initial price. It's gross. It's been like this since October when the 3070 launched.

It's not your place to tell people they shouldn't complain about new GPUs and to use old GPUs to run their games just fine. You don't know everyone's situation or need. If your old hardware is still functional and you can play games on them, good for you. Keep doing that. Let others do what they want; if it's complaining or gaming, it's on them.
 
I bought a 2080S mid June last year because I was getting so tired of all the SLI issues I was having with my 980s. Then Nvidia dropped their bomb "better performance at the same price" with the 3080 and I was kind of pissed at myself for not waiting. Now? Well overall I'm happy with my purchase, an MSI waterhawk, so my system is nice and quiet even with an overclock. So far its performance has been great. More importantly, I'm not stuck waiting and waiting for a possible compromise card because I can't get the one I really want. So I'm good...
 