Nvidia reportedly spent $10 billion for a chunk of TSMC's 5nm manufacturing capacity

nanoguy

Posts: 1,355   +27
Staff member
Why it matters: Securing enough manufacturing capacity is going to be critical to the success of Nvidia’s RTX 4000 GPUs, which are expected to land as soon as this summer. The RTX 3000 series was widely regarded by gamers as a paper launch, but the company is said to be paying through the nose to make sure that won’t be the case for its upcoming GPUs.

Back in November 2021, the rumor mill was abuzz with hints that Nvidia was planning to use TSMC’s 5nm process node for its upcoming GeForce RTX 4000 series (Ada Lovelace) GPUs. These are widely expected to be significantly faster and more power-hungry than the current Ampere lineup, but a much bigger issue for the Jensen Huang-led company is securing enough manufacturing capacity.

According to a report from Hardware Times, Nvidia is prepared to pay dearly for the ability to meet growing demand for increasingly powerful graphics cards. Specifically, it may be offering up to $10 billion to TSMC for a significant chunk of its 5nm manufacturing capacity. Other companies tapping TSMC’s N5 process node include Bitmain, AMD, and Apple.

Nvidia previously used Samsung’s 8nm process node to make its Ampere GPUs, which are notorious for being power hungry. The company may also have dealt with relatively poor production yields on its RTX 3000 series, so moving to a smaller process node could help in that regard. However, TSMC’s 5nm wafers are expected to cost quite a bit more, which could translate into more expensive graphics cards.

With Intel’s Arc (Alchemist) GPUs on the horizon, it makes perfect sense for Nvidia to ensure availability issues won’t plague the launch of its upcoming RTX 4000 series GPUs. The tech supply chain is on the road to recovery, but most industry leaders believe the ongoing shortage of chips and passive components will persist until 2023. When it comes to graphics cards, Nvidia and its AIB partners are more optimistic, but we’ll have to wait and see.


 
All this paying "dearly" and "through the nose" will of course be passed on to consumers, but it is at least a sign of economics working the way it's supposed to: buyers are expressing their demand via their wallets, and that demand is resulting in a shift of global resources toward meeting it. This is something GPU companies can do when they charge high prices directly, and that scalpers cannot and do not do when they siphon resources out of the middle of the transaction.

Of course until there's net additional capacity online, any additional wafers for Nvidia are coming at the expense of something else.
 
That‘s good news for the GPU market regardless of which company you buy from, and with their large dies, Nvidia might need the capacity.

I still find it odd that Samsung fab capacity was the problem this gen - what else is mass-produced on that node?

How much would it cost to just build your own fab versus 10 bill for 5% of another? (Minus the timing constraints).

Don‘t think that‘s a good idea (a fab by itself does little good), but otoh asking TSMC to build a fab for you might be an option.
 
I can literally see the Scalpers sharpening their blades as we speak and the Miners subscribing to the most popular bots there are.

It will be a metaphorical bloody massacre when RTX 4K series hit, not that any gamer in his right mind would buy GTX 480 descendants with a 500-700W TDP.
 
I know a lot of people are hoping that this next gen will bring down prices (like they thought Ampere would with Turing), but everything is saying the opposite.

Manufacturing costs have skyrocketed, shipping costs have increased multiple times (perhaps for a long time), raw material prices have gone up, labor costs have gone up, and other inflationary pressures are pushing up the prices of everything.

It seems like people are willing to pay about $600 for a midrange x60 card, so I wouldn't be surprised if they charged that for the 4060. $999 for the 4070, $1300 for the 4080, $2500 for a 4090.
Think this sounds insane? It's the pricing that's already here; it's what retailers are already asking for. Remember when we thought prices would go down with each generation? That's over. People accepted the Turing price increase, and we can't argue Nvidia into going back.
 
I can literally see the Scalpers sharpening their blades as we speak and the Miners subscribing to the most popular bots there are.

It will be a metaphorical bloody massacre when RTX 4K series hit, not that any gamer in his right mind would buy GTX 480 descendants with a 500-700W TDP.
Yeah, it's getting pretty expensive. GPUs are starting to rival electric stoves or air conditioners, and if you live in a hotter climate you would need to run an air conditioner to help alleviate the heat the GPU pumps out, further increasing your bill.
 
How much would it cost to just build your own fab versus 10 bill for 5% of another? (Minus the timing constraints).

Pretty sure I read not too long ago that Intel is going to spend $4.5 billion to build one. However, it takes a few years to go from making that decision to actually having chips to use or sell. I think the article said 2024-2025 to have it running, and that is probably if everything goes right and on schedule, which is probably not going to happen.

That is not going to help with the current shortage of chips.
 
Ampere GPUs are not power hungry. Once undervolted, they are very energy efficient while losing less than 5% performance versus stock.
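
For the curious, undervolting proper is done through the voltage/frequency curve in a tool like MSI Afterburner; a rough command-line stand-in is simply lowering the board power limit, which has a similar effect on efficiency. A minimal sketch using NVIDIA's NVML Python bindings (assuming pynvml is installed, the card is GPU 0, you have admin/root rights, and the 80% target is my assumption, not a measured sweet spot):

```python
# Rough approximation of taming an Ampere card's power draw via NVML.
# True undervolting (editing the V/F curve) needs tools like MSI Afterburner;
# this only lowers the board power limit.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
print(f"Default power limit: {default_mw / 1000:.0f} W")

# Assumed target: ~80% of stock power, roughly where efficiency curves flatten out.
target_mw = int(default_mw * 0.8)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)  # requires admin/root
print(f"New power limit: {target_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```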



 
Yeah, it's getting pretty expensive. GPUs are starting to rival electric stoves or air conditioners, and if you live in a hotter climate you would need to run an air conditioner to help alleviate the heat the GPU pumps out, further increasing your bill.
I, in fact, do run a standalone AC unit in my game room during the summer. I live in the South. It's a rather small room that becomes nearly unbearable with extended use. I've installed every cooling method I can fit into a case (excluding custom water cooling) and on the chip for my 3080, and during the summer it can bring the ambient air in that room up to 90°F within an hour.
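
To put rough numbers on that (assumed figures, not measurements from that room): essentially all the electrical power a PC draws ends up as heat, and converting watts into the BTU/h ratings air conditioners use shows why a small room warms up so quickly.

```python
# Back-of-the-envelope heat math with assumed figures.
WATTS_TO_BTU_PER_HOUR = 3.412

gpu_watts = 320          # e.g. an RTX 3080 at its stock board power
rest_of_pc_watts = 180   # assumed CPU + rest of the system under gaming load
total_watts = gpu_watts + rest_of_pc_watts

heat_btu_h = total_watts * WATTS_TO_BTU_PER_HOUR
print(f"~{heat_btu_h:.0f} BTU/h of heat dumped into the room")  # ~1,700 BTU/h

# A small window AC unit is typically rated around 5,000 BTU/h,
# so one gaming PC uses up roughly a third of what such a unit can remove.
print(f"Share of a 5,000 BTU/h unit: {heat_btu_h / 5000:.0%}")
```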
 
I know a lot of people are hoping that this next gen will bring down prices (like they thought Ampere would with Turing), but everything is saying the opposite.

Manufacturing costs have skyrocketed, shipping costs have increased multiple times (perhaps for a long time), raw material prices have gone up, labor costs have gone up, and other inflationary pressures are pushing up the prices of everything.

It seems like people are willing to pay about $600 for a midrange x60 card, so I wouldn't be surprised if they charged that for the 4060. $999 for the 4070, $1300 for the 4080, $2500 for a 4090.
Think this sounds insane? It's the pricing that's already here; it's what retailers are already asking for. Remember when we thought prices would go down with each generation? That's over. People accepted the Turing price increase, and we can't argue Nvidia into going back.
No.
 
"However, TSMC’s 5nm wafers are expected to cost quite a bit more, which could translate into more expensive graphics cards."
Good luck buying one. Lost cause.
 
Nvidia reportedly spent $10 billion for a chunk of TSMC's 5nm manufacturing capacity
Here's to hoping the RTX 4000 series won't be a paper launch

And that's how you create and maintain a nice drone army...

I wonder if, at this point, Nvidia is spending more than Intel on bribes... err, sorry, loyalty coupons... err, sorry again, marketing expenses!
 
I can literally see the Scalpers sharpening their blades as we speak and the Miners subscribing to the most popular bots there are.
This is true only in proportion to how badly Nvidia mismatches their original pricing to what the market will support. If they price correctly, they'll leave little additional margin for the scalper to pursue, and inventory and shopping processes will work just like they do for any other correctly-priced product.

On the other hand, if that price is really mostly set by and for miners, it will be interesting theater to watch Nvidia pretend the pricing is attractive to gamers.
 
This is true only in proportion to how badly Nvidia mismatches their original pricing to what the market will support. If they price correctly, they'll leave little additional margin for the scalper to pursue, and inventory and shopping processes will work just like they do for any other correctly-priced product.

On the other hand, if that price is really mostly set by and for miners, it will be interesting theater to watch Nvidia pretend the pricing is attractive to gamers.
Nvidia moved away from gamers a long time ago.

They only pay lip service.

The real problem is that you have all these drones blindly throwing money at them and lashing out against anyone who dares to state the truth.
 
Pretty sure I read not too long ago that Intel is going to spend $4.5 billion to build one. However, it takes a few years to go from making that decision to actually having chips to use or sell. I think the article said 2024-2025 to have it running, and that is probably if everything goes right and on schedule, which is probably not going to happen.

That is not going to help with the current shortage of chips.
Intel is spending twenty billion dollars on new fabs in both Ohio and Arizona, so at least $40 billion in plants that should be "completed" in 2024. Expect another year to get to full speed, as processing a 5nm wafer takes over two months with an incredible number of steps. Also, the new Ohio facility will be expandable; Intel announced it may spend up to $100 billion on that site over time.
 
Of course until there's net additional capacity online, any additional wafers for Nvidia are coming at the expense of something else.

Well, we know it's not going to be Apple, simply because Apple has a sweetheart deal as the biggest TSMC customer of them all.

Which means that, naturally, at least one of them will be AMD. From their perspective, AMD knows this is coming: we have seen them voluntarily leave market segments like lower-end chips, with no offering below the 5600G and no desktop Zen 3+ announced beyond a single chip, the 5800X3D. That suggests they want to get as much allocation as they can for Zen 4 chips, for both the enterprise and consumer segments.

What I think has a good chance of happening is that AMD's eventual RDNA 3 launch of a 7000 series will redefine what a paper launch is, maybe even releasing only the top SKU and its derivatives (think 7800 XT, 7800, and 7700 XT, all the same chip with cut-down, partially defective versions for the lower tiers) and pretty much nothing else.

I think we run a real risk of the GPU race turning into just Nvidia vs. Intel for a good chunk of the next generation. And if Nvidia was already at the very least equal to AMD in hardware and ahead in software support and features, then it's going to be a bloodbath against Intel, which will probably not be able to match AMD in software support and features at all, let alone stay competitive with Nvidia.

This is... overall terrible news for AAA PC gaming.
 
Nvidia wouldn't do this if it didn't expect supply constraints ahead. They want to make sure, whatever the situation might be, that their products are available. If they exist basically alone in the market in terms of availability, they can price their cards as they like. I wouldn't call this investment a gamble, because with that amount of money they will, in the worst case, be the only GPU brand with something to offer. And if mining were to suddenly end after such an investment, Nvidia's GPUs would be cheaper than usual, so I disagree with the notion that Nvidia will never again price its GPUs lower after Turing; looking at the history of their releases, prices have varied once you account for inflation (inflation is the real culprit). With this investment they are just playing it safe, since anything they produce will sell anyway, that's for sure. The reason AMD might not do the same is likely that they don't have that kind of cash lying around.

So, if there are going to be any GPUs available in a bad situation, they will be from Nvidia, and surely they are going to be fine GPUs. The high power consumption numbers are for the stupidly expensive models, since Lovelace is said to be much like what Ampere was after Turing: a node shrink with better efficiency, which translates into even more CUDA cores. A tried and true approach that can't go far wrong. That said, RDNA 3 might introduce greater generational improvements, or at least I expect so.
 
The 1050 Ti is still $270.
The GTX 1650 (non-Super or Ti) is almost $320 to $380, depending on the model.
The cheapest RTX 3050 starts from $510.
(Every shop has plenty of stock on display.)

The current GPU pricing model, even after the recent decline, is not sustainable in the long run. It's not just the companies manufacturing them that will be affected, but the whole associated DIY market. The lack of reasonably priced GPUs will ultimately steer customers to other valid options like prebuilts and laptops. Yes, it's not ideal, but when you can save some money, especially with discounts on those, then why not?

Also, in the long run it will deter upgrades to WQHD and higher resolutions, which are the main reason so many upper-midrange cards sell.

And current market prices are not just due to manufacturing cost increases or demand; they're also due to the bundling of motherboards, PSUs, etc. with the GPU, forced onto retailers by distributors. Now retailers are charging 200% of MSRP, saying they need to "recover" the cost.
So are they giving away the bundled products for free, since they obviously choose to double the price of the GPUs?
Couldn't the cost be recovered by selling those products individually, or by selling the whole bundle at a reasonable price?
 
This is true only in proportion to how badly Nvidia mismatches their original pricing to what the market will support. If they price correctly, they'll leave little additional margin for the scalper to pursue, and inventory and shopping processes will work just like they do for any other correctly-priced product.

On the other hand, if that price is really mostly set by and for miners, it will be interesting theater to watch Nvidia pretend the pricing is attractive to gamers.

So essentially an honest/realistic MSRP? On one hand, they could do that, since enough customers have shown they will pay the higher prices.
On the other hand, their fantasy-land MSRP worked fine for duping reviewers into giving better reviews by comparing their cards against the wrong counterparts and reviewing them based on the MSRP.
 
Imagine the jump in efficiency from Samsung 8nm, which is more like TSMC 10nm, to TSMC 5nm. We can finally hope for some efficient mobile GPUs.
 
So essentially an honest/realistic MSRP? On one hand, they could do that, since enough customers have shown they will pay the higher prices.
On the other hand, their fantasy-land MSRP worked fine for duping reviewers into giving better reviews by comparing their cards against the wrong counterparts and reviewing them based on the MSRP.
This doesn't sound like a hard trade to me.

Choice A: Price realistically, *keep all that revenue for themselves* (a big chunk of which they've already promised to TSMC), while selling 100% of manufacturing capacity, and yes, having to endure reviews that lament the high market prices.

Choice B: Give away hundreds of dollars in value per card to scalpers, sell the same 100% of capacity anyway, enjoy some nicer words from reviewers as to price, but also take lots of negative flak from consumers who could not buy at those prices.

Nvidia's fantasy MSRP was not them being nice guys, or placating reviewers as their #1 goal. The issue was that they could not accurately predict the impact of miner demand, nor the pandemic and its effect on the supply chain. The original 3080 pricing (including a few hundred dollars of markup on AIB premium models), for example, was clearly based on being able to make many more cards and having to price them to move in volume to a market that was primarily gamers. That didn't happen.

The ongoing challenge here is the high variability of mining demand. What miners will pay is based on what crypto will be worth, and the possible ranges are far wider than what's involved in trying to model what gamers will pay for more sparkles. They also have to consider the strategic possibility that mining demand may not be stable or perpetual, and that they therefore need to keep a foot in the gaming market even if it is temporarily less financially attractive.
 
I'm on an RTX 2080, and right now I can get practically any game running at decent frame rates with RT on at 1440p. However, it is starting to show its age, and I think the 4xxx series will be the time to upgrade. I don't expect to pay less than £1000. In fact, I would imagine a 4080 will cost £1200+. I'm OK with this. Hopefully I can get a nice 4K OLED to pair it with by the time it comes out.
 
Much more.
It's not just about buying the land, machinery, etc.
Developing a cutting-edge process node takes a lot of time, lots of high-end engineers, and lots of experience.

I feel people forget how hard it has become for companies to keep up in process nodes.

GlobalFoundries started facing problems even a decade ago; it was one of the biggest reasons AMD got out of the fab business. It is a money sinkhole if you run into issues.

Intel is a good example of this, having struggled for the last decade to advance its nodes.

Sure, you can build a fab and brute-force it with money. That doesn't mean the output will be cost-effective, let alone comparable to your competitors'.


TSMC has a pretty large lead on its competitors ATM, and I don't see that changing any time soon.
 