Aiming for Atoms: The Art of Making Chips Smaller

Although the likes of Intel and Nvidia have come under fire for noticeably increasing the prices of their chips, the article goes a fair way toward explaining why a mid-range video card or CPU costs a lot more than it did, say, 10 or 15 years ago.

Inflation is one obvious reason, but the cost of developing and manufacturing ever smaller processes has risen rapidly. Simply going from 28nm to 7nm has doubled the cost of a wafer. On an immature process for a new, smaller, ever more difficult to develop node, yields will often be lower. This also affects memory fabrication, although innovations in that field have helped. All told, it easily costs more than twice as much to tape out a 7nm chip as a 28nm one.
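As a rough back-of-envelope sketch (the doubled wafer cost and both yield figures below are illustrative assumptions, not foundry data), a pricier wafer combined with a lower early yield compounds into well over double the cost per good die:

# Back-of-envelope only: normalized wafer cost, same die count per wafer assumed
# (in reality a shrink fits more dies per wafer, which claws some of this back).
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    return wafer_cost / (dies_per_wafer * yield_fraction)

mature_node = cost_per_good_die(wafer_cost=1.0, dies_per_wafer=400, yield_fraction=0.90)
young_node = cost_per_good_die(wafer_cost=2.0, dies_per_wafer=400, yield_fraction=0.60)
print(round(young_node / mature_node, 1))  # ~3.0x per good die, before design and tape-out costs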

All this in the space of just seven years. Yes, Nvidia and Intel have exploited their technology leads by raising prices. Welcome to the world of capitalism. However, a large proportion of those price rises is still down to the underlying steep increase in the costs of developing and manufacturing the parts.

That's why what is often classed today as a mid-range video card (RTX 2060) is now $350 and not the $200 one (7600 GT) generally cost in, say, 2006.

And yet Intel has a 63% profit margin and just shaved $1,000 off its top HEDT processor. Certainly costs have gone up, as they always do with a new node, but the companies involved have gotten a lot bigger. Economies of scale work in their favor. If they didn't, Apple, Intel, and Nvidia would not have the fat profit margins they have today.


Look at the financial facts: Intel and Nvidia are not raising prices to keep costs under control; they are raising prices to increase margins. It's the same reason AMD's new Threadrippers are more expensive.

"Welcome to the world of capitalism."

Only in those countries that allow their citizens to be preyed upon by unscrupulous capitalists. Capitalism without checks is a self-consuming system. Writing off price increases as the way of the world is just an excuse to let it continue.
 
Agreed. Intel pushed the $2k boundary because they could, and now it's AMD's turn; Intel paved the road, so now AMD can do it too.

One thing I found interesting in the node size/cost discussion is that transistor density increases with smaller nodes, essentially enabling the same computing power in a smaller package. It might cost more, but what comes out is many times as powerful in a comparable space. Except at the high end, manufacturers seem to toe the line on processor pricing: today's $300 CPU is far more powerful than yesterday's $300 CPU.
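As an ideal-case illustration of that density scaling (a naive sketch: it assumes dimensions actually shrink in line with the node name, which real processes no longer strictly do), area per transistor falls with the square of the node ratio:

# Naive, ideal-case scaling: area per transistor ~ (feature size)^2,
# so density scales with the inverse square of the node ratio.
def ideal_density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(28, 7))  # 16.0x density in the ideal case
print(ideal_density_gain(14, 7))  # 4.0x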

Most of that extra cost probably goes into R&D, then tooling. Those costs will be made up by sales.

Another interesting article on the state of current research - https://techxplore.com/news/2019-11-highly-wafer-scalable-aligned-carbon-nanotube.html
 

I'd definitely agree, sales in particular. Compare the number of parts the big chip companies sell today with back in the day: it's an order of magnitude greater, which makes larger and larger R&D budgets possible. R&D is the number one expense for chip companies like Intel, AMD, and Nvidia.
 
Nearing the limits of photolithography! The next 10 years should be interesting, as the fight against electromigration is accelerating.
 
"That's why what is often classed today as a mid-range video card (RTX 2060) is now $350 and not the $200 one (7600 GT) generally cost in, say, 2006."


No, it's not. The reason video cards have more than doubled in price is the vastly higher complexity of the components (GDDR3 -> GDDR6 is vastly more expensive to make, even in mass production). We have also hit a memory density wall at 10nm (which is why we end up with strange crossover devices like a 12GB RTX 3060 released late in a generation).

New cards also use about 4x as much power, which still adds cost even in the world of mass-produced, multi-fan, heat-pipe-cooled radiators plus stacks of video card VRMs.

The additional process node cost for the GPU chip is mostly lost in the noise. It still adds a cost, but when big players like Apple and Qualcomm are willing to shoulder the high cost of new process nodes, Nvidia and AMD have all the time in the world to wait for prices to fall (as yields peak, costs drop, and they find other optimizations to cut costs).
 
Thanks for the article.
I wonder if we can really go sub-atomic. Is there a limit we cannot surpass?

Well, you can go down to quarks and that's about it. They haven't discovered anything smaller. At least not in this simulation we call our world. Maybe the outer simulation has smaller particles.
 

When your design literally costs 10 or 15x more to create and tape out on 7nm today than it did on, say, 65nm fifteen years ago, and wafer production itself is several times more expensive, it's going to push up costs significantly.

A 7nm TSMC 300mm wafer was well over $9k, whereas a 40nm or 65nm equivalent was maybe $3k, and 5nm is projected to be over $16k. These prices are BEFORE the market went stupid and TSMC completely filled their order books. Not to mention the die sizes of chips these days keep growing: a 7600 GT was just 125mm², whereas a GTX 1060 is what, 200mm²? The RTX 2060 is binned, but it's still using dies that were originally 445mm², and you don't get to pay TSMC only for the bits of it that work!

So multiple factors go into the cost increases, including expensive memory and other components. Memory is a big factor, but mainly because its cost has doubled this past year given the current situation.

GDDR6 wholesale has been above $80 for 8GB this year, which is a significant chunk of what should be a $400 MSRP card. In a 'normal' situation it could be $45 or less on bulk orders. While that's more expensive than previous generations, it still isn't the primary cost of a GPU compared to the die itself.

The actual die cost on 7nm would still be higher for a higher-end GPU - perhaps around $90 if you have reasonable yields. Remember, that is production cost alone; it's much more once you factor in your $1Bn design and tape-out costs....
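To sanity-check that ballpark, here is a rough sketch using the figures mentioned above (a ~$9k 300mm wafer and a 445mm² die); the defect density and the simple Poisson yield model are illustrative assumptions, not TSMC data:

import math

WAFER_COST = 9000.0     # approx 7nm 300mm wafer price, per the discussion above
WAFER_DIAMETER = 300.0  # mm
DIE_AREA = 445.0        # mm^2, full-size die before binning
DEFECT_DENSITY = 0.001  # defects per mm^2, an assumed figure

wafer_area = math.pi * (WAFER_DIAMETER / 2) ** 2
gross_dies = int(wafer_area * 0.85 / DIE_AREA)          # crude: ~15% lost to edges and scribe lines
yield_fraction = math.exp(-DEFECT_DENSITY * DIE_AREA)   # simple Poisson yield model
good_dies = int(gross_dies * yield_fraction)

print(gross_dies, good_dies, round(WAFER_COST / good_dies, 2))
# -> 135 gross, 86 good, roughly $105 per fully working die; in the same ballpark
#    as the ~$90 above, and binning salvages some defective dies on top of that.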
 
It's like the episode of Seinfeld where Kramer gets a meat slicer and slices the meat so thin, you can't even see it!
 
The first comment is dated "Jun 24, 2019" but the article is dated "August 31, 2023".

Why is TechSpot reposting old articles as new? You guys have plenty of old articles that are a joy to read, and I wouldn't mind something on the front page labeled "from the vault", but I don't understand this particular way of going about it.

Everything in this article is nearly as relevant as when it was first posted, and just as interesting.
 
Now if we could actually get memory density up. I feel like memory has been stagnant compared to CPU improvements over the past 10 years.
 

We used to tag articles like this with a #tbt label; we just stopped doing that for the last few we have bumped because I personally thought nobody really cared about it (as long as the content is still relevant).

Now, to be clear, we are not simply changing the date of the article and reposting. We read, revise, update and in most cases improve the flow of the original article, and then we post it again.
 
"To grasp the sheer minuteness of 6 nm, consider this: the silicon atoms, which constitute the majority of a processor, are spaced about 0.5 nm apart, with each atom being approximately 0.1 nm in diameter. Hence, as a rough estimate, TSMC's facilities handle transistor elements that span fewer than 10 silicon atoms in width." - is this accurate? What are the features in the 6 or 7 nm process that match the process name?
 
So, if you're building your dream hardware, you're at the assembly stage, and you drop one on the floor ..... how will you ever find it???
 