Google IT hardware manager says Moore's Law has been dead for 10 years

Daniel Sims

Staff
In context: Nvidia CEO Jensen Huang has repeatedly proclaimed the demise of Moore's Law in recent years. Although his counterparts at AMD and Intel hold differing opinions, a recent presentation from Google appears to align with Huang's perspective. It might also help explain TSMC's wafer pricing trends over the past several years.

The tech industry frequently discusses how much time, if any, Moore's Law has left. Google's head of IC packaging, Milind Shah, recently supported a prior assertion that the trend, which has served as a crucial guidepost for the tech industry, ended in 2014.

In 1965, the late Intel co-founder Gordon Moore observed that the number of transistors on an integrated circuit was doubling roughly every year, a pace he later revised to about every two years. The observation, which came to bear his name, has mostly held in the nearly six decades since but has recently faced turbulence.
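To put that cadence in concrete terms, here is a minimal sketch of the math; the starting transistor count is an illustrative assumption, not a real product figure.

```python
# Minimal sketch of what a Moore's Law cadence implies.
# The starting count is an illustrative assumption, not a real chip's figure.
def projected_transistors(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Transistor count predicted after `years` if it doubles every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# A hypothetical 1-billion-transistor chip would be expected to reach
# about 32 billion transistors a decade later on a two-year doubling cadence.
print(f"{projected_transistors(1e9, years=10):.2e}")  # 3.20e+10
```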

In 2014, MonolithIC 3D CEO Zvi Or-Bach noted that the cost per 100 million gates, which had previously fallen steadily with each new node, bottomed out at the then-recent 28nm node. Semiconductor Digest reports that Shah, speaking at the 2023 IEDM conference, backed Or-Bach's claim with a chart showing that the price per 100 million gates has remained flat ever since, indicating that transistors haven't gotten any cheaper in the last decade.

Although chipmakers continue to shrink transistors and pack more of them onto increasingly powerful chips, prices and power consumption have risen. Nvidia CEO Jensen Huang has tried to explain this trend by proclaiming the death of Moore's Law multiple times since 2017, arguing that more powerful hardware will inevitably cost more and require more energy.

Some have recently accused Huang of making excuses for the rising prices of Nvidia graphics cards. Meanwhile, the heads of AMD and Intel admit that Moore's Law has at least slowed down but claim they can still achieve meaningful performance and efficiency gains through innovative techniques like 3D packaging.

However, the analysis from Or-Bach and later Shah might align with TSMC's wafer price hikes, which sharply accelerated after 28nm in 2014. According to DigiTimes, the Taiwanese giant's cost-per-wafer doubled over the subsequent two years with the introduction of 10nm in 2016. The outlet estimated that the latest 3nm wafers could cost $20,000.
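For a rough sense of what a $20,000 wafer means per chip, here is a back-of-the-envelope sketch; the die size, yield, and the simple area-based dies-per-wafer approximation are assumptions for illustration, not TSMC or customer figures.

```python
import math

# Back-of-the-envelope die cost from a wafer price.
# Die area, yield, and the area-based dies-per-wafer approximation are
# illustrative assumptions, not TSMC or customer data.
def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross dies per wafer (usable area minus an edge-loss term)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

wafer_price = 20_000   # USD, the reported 3nm estimate cited above
die_area = 300         # mm^2, a hypothetical large GPU die
yield_rate = 0.7       # assumed fraction of dies that work

good_dies = dies_per_wafer(300, die_area) * yield_rate
print(round(wafer_price / good_dies))  # roughly $145 per good die with these assumptions
```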

As TSMC and its rivals aim toward 2nm and 1nm in the coming years, further analysis indicates that most of the semiconductor industry's recent growth comes from rising wafer prices. Despite wafer sales falling in the last couple of years, the average price of TSMC's wafers kept increasing.


 
People still buy overpriced stuff, heck yeah, they paid 3 times the MSRP during the pandemic. Can't wait for 2N prices... What do you say, 2 times the current MSRP?
People will still buy because "the more you buy, the more you save"

Cheers!
 
ITT: people whining that GPUs don't cost $300 like during the depths of the 2008 recession, not understanding inflation, margins, or percentages, malding and seething.
 
Don't need Jensen or this guy to tell anyone that. Just see how a midrange PC from 10 years ago is still perfectly usable for basic tasks and even some light indie gaming. In 2014, a midrange PC from 2004 was nearly useless, and in 2004, a 1994 PC was nothing but recycling material.

It's still no excuse for the price gouging that's been going on, though. Or for the forced obsolescence companies are pushing to make people keep replacing perfectly usable hardware, like the Windows 11 TPM hardware requirement, not to mention component durability and reliability taking a nosedive, again to keep people consuming since manufacturers can't rely on Moore's Law anymore.
 
This is a weird comment section. The whole article is about how transistor pricing has flattened out, and new GPUs use more and more transistors, so they get more and more expensive.

And then everyone comes in and complains about pricing.

Yeah, that's the whole point of the article: a 100mm² GPU on a modern architecture is going to cost substantially more to produce than a 100mm² GPU on 28nm, because the modern GPU packs in 7-8x the transistors in the same space.

Granted, post-pandemic greedflation pricing does factor in, but it's not where 100% of the price increases are coming from.
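To put toy numbers on that (purely illustrative; the transistor counts and the flat per-transistor cost are assumptions, not foundry data):

```python
# Toy comparison of two 100 mm^2 dies when the cost per transistor stays flat.
# All figures are illustrative assumptions, not real foundry or product numbers.
cost_per_transistor = 1e-9                      # assumed flat cost in dollars
transistors_28nm = 2e9                          # hypothetical 100 mm^2 die on 28nm
transistors_modern = 7.5 * transistors_28nm     # same area, ~7-8x the density

print(transistors_28nm * cost_per_transistor)   # 2.0  -> ~$2 of silicon per die
print(transistors_modern * cost_per_transistor) # 15.0 -> ~$15: same size, much pricier
```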
 
so, because it's not a law, that means they can raise prices according to their heart's desire, huh..?
It's called the free market, Fidel.

This is a weird comment section. The whole article is about how transistor pricing has flattened out, and new GPUs use more and more transistors, so they get more and more expensive.
A very good point. And, according to some estimates, some iterations of the N2 node may actually have a *higher* per-transistor cost than N3.
 
If this is the case, then we are stuck at about current performance, because increasing the price can only go so far before sales diminish too much. This might not be a completely bad thing. We could keep the same hardware for longer if it weren't for the fixed-function bits in new hardware. That is probably why we see GPU makers explore new avenues of increasing performance like upscaling and frame generation.
 
Can silicon chips keep doubling in complexity and getting cheaper? Not anymore. Is that a good reason for Nvidia to price gouge? Hell no.
 
If this is the case, then we are stuck at about current performance, because increasing the price can only go so far before sales diminish too much. This might not be a completely bad thing. We could keep the same hardware for longer if it weren't for the fixed-function bits in new hardware. That is probably why we see GPU makers explore new avenues of increasing performance like upscaling and frame generation.

-Yep

NV is trying to tackle the problem with software (DLSS/Frame Gen, etc.), while AMD, besides their software knock-off versions, is trying to tackle the issue with chiplets: keep the logic on the bleeding-edge process and the I/O on a cheaper, older process while maximizing yields.
 
NV is trying to tackle the problem with software (DLSS/Frame Gen, etc.), while AMD, besides their software knock-off versions...
Actually, Jensen recently said that, while their GPUs have seen only a roughly 2.5x performance increase from new process nodes over the last 10 years, they've gotten more than ten times that from architectural changes and improvements.

Can silicon chips keep doubling in complexity and getting cheaper? Not anymore. Is that a good reason for Nvidia to price gouge? Hell no.
Surprisingly enough to some, price gouging is not defined as "any price higher than I personally wish to pay".
 
Came here to post this. But just because the law is dead is no excuse to overcharge for PC hardware.


Literally everything else surrounding and driving these increasingly advanced chips is also increasing in complexity and cost. Material costs, memory capabilities, PCB complexity, cooling, and power delivery all have to meet higher and higher demands. And software features these days easily require 50x-100x more engineers than they did 15-20 years ago. To further exacerbate the issue, thanks to today's hot new workloads like AI, consumers have suddenly been thrust into competition with the enterprise segment for limited wafer allocation. All because we are well into using GPUs for more than just gaming.

Time to face it: things are rapidly getting to the point where it simply won't make sense for nVidia and AMD to cater to gamers anymore, given that their halo compute products sell for 10-20x more than their halo gamer parts at the same if not greater volumes. Realistically, what value is the 4090 to nVidia if they manage to sell 100k-200k, or let's even assume 1 million, 4090s over the product's lifespan, maybe netting nV $1,000 per sale, while their Hx00 SKUs are being bulk-ordered by FB, M$, Google, and/or Amazon in 100k-300k+ batches and come with a much higher profit margin per sale (easily $20k+ per unit)?
 
Literally everything else surrounding and driving these increasingly advanced chips is also increasing in complexity and cost. Material costs, memory capabilities, PCB complexity, cooling, and power delivery all have to meet higher and higher demands. And software features these days easily require 50x-100x more engineers than they did 15-20 years ago. To further exacerbate the issue, thanks to today's hot new workloads like AI, consumers have suddenly been thrust into competition with the enterprise segment for limited wafer allocation. All because we are well into using GPUs for more than just gaming.

Time to face it: things are rapidly getting to the point where it simply won't make sense for nVidia and AMD to cater to gamers anymore, given that their halo compute products sell for 10-20x more than their halo gamer parts at the same if not greater volumes. Realistically, what value is the 4090 to nVidia if they manage to sell 100k-200k, or let's even assume 1 million, 4090s over the product's lifespan, maybe netting nV $1,000 per sale, while their Hx00 SKUs are being bulk-ordered by FB, M$, Google, and/or Amazon in 100k-300k+ batches and come with a much higher profit margin per sale (easily $20k+ per unit)?
When these companies post record-breaking profits, which is public knowledge for any publicly traded company, I don't want to hear that their costs have gone up. Yes, their costs have gone up, but so have their profit margins. I'm not complaining about the 4090 either. I'm disgusted by the fact that if I want to play a game at 1080p medium settings without upscaling tech, I have to spend at least $500 on a GPU. I've used a 4K TV to game on for years now, and I see no point in upgrading from my 6700 XT.

I'll likely buy a 7900 XT when RDNA 4 comes out. I was wholly prepared to buy an 8700G and just be happy with that, but that's really a disappointing product all around.
 
Transistor pricing flattened out in 2014, and Dennard scaling died back around 2005. But Moore's Law is about the number of transistors you can put on a single chip. That has continued to increase; however, it is no longer doubling every two years (or every 18 months), as Moore's Law was previously understood to say.
Arguing over whether Moore's Law is alive or dead isn't very useful; the important thing is that progress in semiconductor logic has slowed down significantly, even if it has not yet stopped entirely.
 
When these companies post record-breaking profits, which is public knowledge for any publicly traded company, I don't want to hear that their costs have gone up. Yes, their costs have gone up, but so have their profit margins. I'm not complaining about the 4090 either. I'm disgusted by the fact that if I want to play a game at 1080p medium settings without upscaling tech, I have to spend at least $500 on a GPU. I've used a 4K TV to game on for years now, and I see no point in upgrading from my 6700 XT.

I'll likely buy a 7900 XT when RDNA 4 comes out. I was wholly prepared to buy an 8700G and just be happy with that, but that's really a disappointing product all around.

I used the 4090 as an example; the lower-end stuff is even worse for the company in terms of margin per unit sold. And be careful not to be distracted by "record profits". The actual margin percentage for each segment is what you are interested in. Inflation is technically driving everyone's numbers up, but not necessarily their margins. Ultimately, even if actual dollars are up, the GPU makers simply aren't making much money off gamers anymore; comparatively speaking, they're making their money off their data center offerings. That's why it's important to look at margins in each sector.

I'll leave you with perhaps the most succinct explanation for your problem that I care to come up with. If you play the same games today as you did 12 years ago, today's GPU offerings are obviously going to be more capable and performant than their predecessors. There is a measurable performance and/or efficiency increase in such an apples-to-apples comparison.

But, given what we know about how shockingly poorly today's games are made (a trend that is unfortunately continuing in the wrong direction), combined with the fact that games are genuinely becoming more complex, you suddenly realize the comparison is apples to oranges. You're pushing 4 times the pixels and 10 times the polygons, with much more complex visual effects and much higher data-processing demands than ever. Everything is scaling up. You're no longer driving the go-kart from your youth at the county fair; you're driving a full-on sports car on a far more challenging track. I hate to say it, but literally nothing about that experience is going to be cheaper than before.

As an aside, "medium settings" is really just an arbitrary descriptor of visual fidelity (I can't stand the low-medium-high-ultra nonsense; it should have died a painful death years ago!!).
 
Coincidentally, it failed at the same time that Google shed the last pretence at morality, and with that, any ability to innovate.
 
Moore's Law is a sideshow. It's there to reassure businesses and consumers that "somehow" their new CPU is better than the last. However, it's been a long time since a doubling of transistors equated to double the performance. You can trace this back to NetBurst, but it really sank in when Intel stalled its node progression at 14nm in late 2014, which is why so many people date the death of Moore's Law to about a decade ago.
 
This is just corpo speak for "we want you to accept this so we can give you less and less but charge you more and more"
 
Google's statement is most likely true when it comes to Google. They've been stagnant for years as evidenced by all the recent layoffs.
 
You can thank mining and AI for GPU prices: corporations can buy at much higher prices than consumers, and while TSMC is adding capacity, supply ultimately can't increase as fast as demand, which is why prices go up. If Google or Facebook is paying $50k for a 4090-equivalent GPU, it doesn't make sense for Nvidia to sell it to us for $1k or even $2k.
 
If a "law" states a doubling in a set time frame, how is "slowing down" not an end to such a "law"?
 