Apple possibly saving billions by moving Macs to Apple Silicon

nanoguy

Bottom line: By the time Apple finishes transitioning the entire Mac lineup to Apple Silicon -- about 18-20 million Macs are sold worldwide every year -- the company will have saved billions of dollars in production costs. Users will get better performance and energy efficiency, but not necessarily lower prices on their shiny aluminum computers.

It's been speculated that by switching to its own M1 SoC, Apple would save a significant amount on component costs. Analyst Ming-Chi Kuo previously said the bill of materials could be reduced by 40 to 60 percent by using Apple Silicon instead of Intel's latest CPUs.

Sumit Gupta, IBM's vice president of AI strategy, recently ran the numbers based on estimated Mac shipments and the M1's estimated production cost of $40 to $50 per unit. He compared that to the estimated cost of the dual-core Intel Core i5 in the MacBook Air, somewhere between $175 and $200, as well as the quad-core Intel Core i5 found in the MacBook Pro, which costs between $225 and $250.

The resulting savings would be around $2.5 billion in a year like 2020, when the supply chain took a hit as a result of the pandemic. That's assuming Gupta's estimates of 5.4 million MacBook Air units and 8.6 million MacBook Pro 13 units align with Apple's internal sales numbers. Based on the component price estimates, Apple spent around $3.2 billion on Intel processors alone for those 14 million Macs, whereas the same number of M1 chips would cost about $697 million. And that doesn't include Mac minis.
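For the curious, Gupta's arithmetic is easy to reproduce. Here is a minimal sketch in Python, using the top of each estimated range quoted above (which is roughly what it takes to land on his totals); every input is an analyst estimate, not an Apple figure:

```python
# Back-of-the-envelope check of Gupta's savings estimate. All inputs are
# the analyst's figures quoted above (top of each range), not Apple data.
units = {"MacBook Air": 5.4e6, "MacBook Pro 13": 8.6e6}      # est. 2020 shipments
intel_cpu_cost = {"MacBook Air": 200, "MacBook Pro 13": 250} # est. $ per Intel CPU
m1_cost = 50                                                 # est. $ per M1

intel_total = sum(n * intel_cpu_cost[model] for model, n in units.items())
m1_total = sum(units.values()) * m1_cost

print(f"Intel spend:  ${intel_total / 1e9:.2f}B")               # ~$3.23B
print(f"M1 spend:     ${m1_total / 1e9:.2f}B")                  # ~$0.70B
print(f"Est. savings: ${(intel_total - m1_total) / 1e9:.2f}B")  # ~$2.53B
```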

Granted, this is just an estimate, but according to Morgan Stanley analyst Katy Huberty, lower-tier Macs made up 91 percent of all Mac shipments in the past twelve months, which could be part of the reason why Apple transitioned these machines first.

Something we can safely assume for now is that the company won't be making its Macs more affordable, save for the M1 Mac mini, which is now $100 cheaper than the last generation. This was made clear when the company explained that the new M1 Macs would come with a familiar design as well as a familiar price tag, even though the new machines can only be configured with 8 or 16 GB of RAM.

Before you get out your pitchforks, it's important to understand that it may take a while before Apple's higher margins can materialize in a meaningful way. This is because the M1 is manufactured using TSMC's 5 nm process node, which is still relatively immature and possibly prone to yield issues, to say nothing of the EUV tooling costs. Essentially, the M1 could be far more expensive than the $40 to $50 estimate, potentially north of $100.

Back in September, it was revealed that a single 5 nm wafer from TSMC could cost as much as $17,000. That's more than what Apple, AMD, and other companies pay for the 7 nm wafers carrying the chips used in their latest products. TSMC claims it's getting better yields on the 5 nm node than it did on 7 nm, but we don't know if that necessarily translates into lower prices once you factor in the demand.
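To see how a wafer price turns into a per-chip cost, here is a rough sketch. Only the $17,000 wafer price comes from the article; the die area and defect density are illustrative assumptions, since neither Apple nor TSMC discloses those figures:

```python
import math

wafer_cost = 17_000      # $ per 300 mm wafer (from the article)
wafer_diameter = 300     # mm
die_area = 120           # mm^2, roughly M1-sized (assumption)
defect_density = 0.005   # defects per mm^2 (assumption)

# Standard dies-per-wafer approximation, accounting for edge losses.
r = wafer_diameter / 2
dies_per_wafer = (math.pi * r**2 / die_area
                  - math.pi * wafer_diameter / math.sqrt(2 * die_area))

# Simple Poisson yield model: probability a die has no killer defects.
yield_rate = math.exp(-die_area * defect_density)
good_dies = dies_per_wafer * yield_rate

print(f"Candidate dies per wafer: {dies_per_wafer:.0f}")   # ~528
print(f"Assumed yield: {yield_rate:.1%}")                  # ~54.9%
print(f"Cost per good die: ${wafer_cost / good_dies:.0f}") # ~$59
```

At these assumed numbers the bare die lands between the $40-$50 estimate and the $100+ pessimistic case, and packaging, test, and the on-package DRAM would push the real figure higher still.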

It's worth noting that, based on early reviews and first impressions, Apple's new M1-based Macs rival previous-gen Intel-based machines that are more expensive and have shorter battery life. This upends the notion that there are lower-end Macs and higher-end Macs, since the M1 machines have been shown to perform surprisingly well against existing models costing up to three times more, such as the 16-inch MacBook Pro.

In an interview with Om Malik, Apple software engineering chief Craig Federighi, hardware technologies chief Johny Srouji, and marketing chief Greg Joswiak explained that when it comes to the M1, it's "not about the gigahertz and megahertz, but about what the customers are getting out of it."

Federighi noted "the specs that are typically bandied about in the industry have stopped being a good predictor of actual task-level performance for a long time. Architecturally, how many streams of 4k or 8k video can you process simultaneously while performing certain effects? That is the question video professionals want an answer to. No spec on the chip is going to answer that question for them."

Professionals are no doubt curious to see what Apple can do to scale up Apple Silicon for the iMac, MacBook Pro 16, and the Mac Pro. Federighi says "their day will come. But for now, the systems we’re building are, in every way I can consider, superior to the ones they’ve replaced."

Masthead credit: iFixit


 
It's not like those chips developed themselves. That also wasn't free.
Exactly. Talking only about production costs while leaving out the R&D needed to design a chip used by a single brand with only ~6.5% share of PC sales doesn't give the whole picture. You'd think the economics would not work out at all, with Intel and AMD able to spread their R&D budgets across the remaining 93.5% and end up at a much lower overall cost.

While they gave these chips different names from the A-series in their phones & tablets, there must be a lot of shared investment and reusable engineering effort, which must be where the volume to support going it alone comes from. (Or, alternatively, they may have plans to sell their chips to others and gain volume that way, although that doesn't sound like Apple.)
 
Exactly. Talking only about production costs while leaving out the R&D needed to design a chip used by a single brand with only ~6.5% share of PC sales doesn't give the whole picture. You'd think the economics would not work out at all, with Intel and AMD able to spread their R&D budgets across the remaining 93.5% and end up at a much lower overall cost.

While they gave these chips different names from the A-series in their phones & tablets, there must be a lot of shared investment and reusable engineering effort, which must be where the volume to support going it alone comes from. (Or, alternatively, they may have plans to sell their chips to others and gain volume that way, although that doesn't sound like Apple.)
Someone correct me if I'm wrong, but isn't the M1 used in the new Macs just a slightly modified version of the chip already used in the iPhone?

At that point, the majority of the chips' development cost is essentially already covered, since it was developed for other products. What they're doing now is just finding a way to increase sales of a product they already had.
 
I would like to see the development cost vs. the savings so we have a more accurate picture. If this is like most businesses, then if the savings aren't 5x the development costs, I doubt they would have made the investment.
 
Someone correct me if I'm wrong, but isn't the M1 used in the new Macs just a slightly modified version of the chip already used in the iPhone?

At that point, the majority of the chips' development cost is essentially already covered, since it was developed for other products. What they're doing now is just finding a way to increase sales of a product they already had.
Sorry to say it, but this argument sounds quite naive. Apple silicon has been under development for, what, 10 years? Obviously at some point Apple realized that Intel was holding them back, plus overcharging for the privilege, and determined that Apple silicon was a strategic investment. At that point they ramped up R&D and developed a long-term strategy. The A14 was developed, then the M1 followed on with additional development. The same will be true of the A15 and M2, etc.

Maybe I misread you, but you make it sound like the M1 was just pulled out of the garbage bin.

The taxes charged by Intel and Qualcomm are ridiculous for the value provided.
 
Thanks to TSMC's 5 nm. I wonder what the M1's yield is like. It has 16 billion 5 nm transistors, which is close to the RTX 3070's 17 billion 8 nm transistors, so the M1 is likely to have a lower yield than the RTX 3070.
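For what it's worth, in the usual defect-density yield models it's die area rather than transistor count that dominates, so the comparison isn't quite that direct. A minimal sketch, with publicly reported approximate die sizes and purely assumed defect densities:

```python
import math

# Poisson yield model: yield ~ exp(-area * defect_density).
# Die areas are public approximations; defect densities are assumptions.
dies = {
    "M1 (TSMC 5 nm)":          (120, 0.010),  # newer node: assume higher D0
    "RTX 3070 / GA104 (8 nm)": (392, 0.005),  # mature node: assume lower D0
}

for name, (area_mm2, d0) in dies.items():
    print(f"{name}: ~{math.exp(-area_mm2 * d0):.0%} estimated yield")
```

Even at double the assumed defect density, the much smaller M1 die can come out ahead, which is why transistor count alone doesn't settle the question.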
 
The taxes charged by Intel and Qualcomm are ridiculous for the value provided.

But somehow other OEMs can offer the same chips in computers at 1/2 to 1/3 the price Apple is charging.


Someone correct me if I'm wrong, but isn't the M1 used in the new Macs just a slightly modified version of the chip already used in the iPhone?

At that point, the majority of the chips' development cost is essentially already covered, since it was developed for other products. What they're doing now is just finding a way to increase sales of a product they already had.

Pretty much. They are developing them for the iPhone and iPad first and foremost, then tweaking them for the laptop family. Probably just opening up the power limit and raising clock speeds. The iPhone can't sustain 6 watts and cool it the way the iPad Pro and MacBook Air can.
 
Sorry to say it, but this argument sounds quite naive. Apple silicon has been under development for, what, 10 years? Obviously at some point Apple realized that Intel was holding them back, plus overcharging for the privilege, and determined that Apple silicon was a strategic investment. At that point they ramped up R&D and developed a long-term strategy. The A14 was developed, then the M1 followed on with additional development. The same will be true of the A15 and M2, etc.

Maybe I misread you, but you make it sound like the M1 was just pulled out of the garbage bin.

The taxes charged by Intel and Qualcomm are ridiculous for the value provided.
I'm currently very sick and so I'm not the best at organizing my words right now. But what I was trying to say is that if 90+% of the R&D is finished because of its use in the iPhone, Apple would be foolish not to try to use it in other products.
 
The new products are so inferior to the previous ones, I don't know who will be buying them...

The new Mac Mini, for example, is some kind of joke now...

Previous max memory: 64 GB; new one: 16 GB
Previous Ethernet: 10GbE; new one: 1GbE

Apple is starting to remind me of modern Mercedes: a pile of cheap, rattling plastic at a huge premium.
 
Apple is absolutely doing this to increase margins. The same Apple moved from Nvidia to AMD years ago because it was cheaper, ignoring how it utterly derailed Mac productivity apps that relied on CUDA.

Remember: Apple wants your money, and has demonstrably proven that it has waves of people who will buy its half-eaten fruit regardless of how garbage it is, and who are willing to pay MORE than previously for the privilege.
 
Someone correct me if I'm wrong, but isn't the M1 used in the new Macs just a slightly modified version of the chip already used in the iPhone?

At that point, the majority of the chips' development cost is essentially already covered, since it was developed for other products. What they're doing now is just finding a way to increase sales of a product they already had.
I get what you mean. Honestly, it just sounds like people defending Apple's (now greater) fashion tax. But then again, it's up to Apple to set its prices.

It would be nice to see them pass the new savings down to their customers, but I'd expect them to embrace Right to Repair long before that...
 
Does anyone believe that Apple - or any other OEM - pays anywhere near list price for their Intel CPUs, particularly 14 nm ones? So I somewhat doubt the billions in savings.

The more important factors are independence, control over their roadmap and features, as well as unifying macOS and iOS.
 
Lol. Loving the comments on all the Apple articles here with people complaining about all the missing features on a Core i3-level CPU, like >16 GB RAM or 10GbE. Apple still sells all that Core i5 and i7 level stuff with those extra capabilities, while "limiting" their M1 (formerly Core i3) low-end stuff to 16 GB with GigE or no Ethernet.

Seems like they know how to spec their products after all.
 
Of course you're not going to see cheaper Macs. What you will see is the same price, but with a CPU that's not nearly as good as anything Intel or AMD could produce.
 
16GB RAM limit on M1.
We'll see how this goes.
Seems like it's going very nicely. 8 GB is quite OK for dealing with 8K material. I would still invest in 16 gigs if you want to render or handle more live ProRes video material.

I was patiently waiting for ARM Macs, and they're finally here. Intel was garbage with all that 100C crap. Just waiting for the Blender (beta) support that was mentioned during the M1 presentation to be confirmed.
 
Seems like it's going very nicely. 8 GB is quite OK for dealing with 8K material. I would still invest in 16 gigs if you want to render or handle more live ProRes video material.

I was patiently waiting for ARM Macs, and they're finally here. Intel was garbage with all that 100C crap. Just waiting for the Blender (beta) support that was mentioned during the M1 presentation to be confirmed.
Wait. Did you just blame Intel for 100C temps in an Apple computer? I think we're done here.
 
Exactly. Talking only about production costs while leaving out the R&D needed to design a chip used by a single brand with only ~6.5% share of PC sales doesn't give the whole picture. You'd think the economics would not work out at all, with Intel and AMD able to spread their R&D budgets across the remaining 93.5% and end up at a much lower overall cost.

While they gave these chips different names from the A-series in their phones & tablets, there must be a lot of shared investment and reusable engineering effort, which must be where the volume to support going it alone comes from. (Or, alternatively, they may have plans to sell their chips to others and gain volume that way, although that doesn't sound like Apple.)
It's Arm/AIM; it's not rocket science. Hardware-wise it's nothing really special. They didn't have to dump billions into R&D and start from scratch; they worked off cookie-cutter designs. Using the Arm license and instruction set means you can only deviate so much in hardware design, which, mind you, they do, but not to the magical levels the marketing wants you to believe.
 
Wait. Did you just blame Intel for 100C temps in an Apple computer? I think we're done here.

The Core i7-9750H is listed as a 45 W processor. If you put a 45 W cooler in your Core i7-9750H laptop, that CPU will run at 100C and throttle down to a slower speed than its all-core Turbo. You well know that is the Intel way of doing things:

List your processor with a nominal wattage and CPU speed, but allow it to run at its all-core Turbo so it looks good in benchmarks, drawing 2X or more of its rated wattage until it runs out of cooling.
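For context, this matches Intel's documented power-limit scheme: a sustained limit (PL1, usually the rated TDP), a short-term boost limit (PL2), and a time window (tau) that the laptop maker can tune. A toy simulation with illustrative numbers, not actual i7-9750H firmware values:

```python
# Toy model of Intel's PL1/PL2 power limiting. The real controller uses an
# exponentially weighted average; this simplified version just drains a
# fixed energy budget. All numbers are illustrative assumptions.
PL1, PL2, TAU = 45.0, 90.0, 28.0  # sustained W, boost W, seconds

budget = PL1 * TAU                # energy headroom above PL1, in joules
trace = []
for second in range(60):
    power = PL2 if budget > 0 else PL1
    budget -= (power - PL1)       # spend headroom while boosting
    trace.append(power)

boost_secs = sum(1 for p in trace if p > PL1)
print(f"Boosted at {PL2:.0f} W for ~{boost_secs} s, then sustained {PL1:.0f} W")
```

With a cooler sized only for PL1, all of that boost heat has nowhere to go, which is how a "45 W" part ends up pinned at 100C.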
 
The Core i7-9750H is listed as a 45 W processor. If you put a 45 W cooler in your Core i7-9750H laptop, that CPU will run at 100C and throttle down to a slower speed than its all-core Turbo. You well know that is the Intel way of doing things:

List your processor with a nominal wattage and CPU speed, but allow it to run at its all-core Turbo so it looks good in benchmarks, drawing 2X or more of its rated wattage until it runs out of cooling.
Intel isn't building the laptops. You need to educate yourself on laptop designs.

I'll give you a push.
Steve Jobs actually said that if the CPU wasn't at 100C, which is a safe limit, believe it or not, performance was being left on the table.

The more you know!
 