Apple has a game changer in the M1 Ultra's UltraFusion chip interconnect

The big picture: Apple is pushing boundaries with its 114-billion-transistor behemoth, the M1 Ultra. It uses state-of-the-art interconnect technology to join two discrete dies into a single SoC. Fortunately, developers will not have to jump through hoops to utilize the Ultra's full potential, as it behaves as a single unit at the system level.

Apple recently announced the M1 Ultra, its new flagship SoC that will power the all-new Mac Studio, a compact yet high-performance desktop system. The company claims the M1 Ultra-powered Mac Studio will deliver CPU performance up to 3.8 times faster than the 27-inch iMac with a 10-core processor.

"[The M1 Ultra is a] game-changer for Apple silicon that once again will shock the PC industry," said Johny Srouji, Apple's senior vice president of hardware technologies.

The M1 Ultra is undoubtedly a powerhouse. It combines two M1 Max dies over what Apple calls the UltraFusion interconnect, which offers 2.5 terabytes per second of low-latency inter-processor bandwidth.

According to Apple, UltraFusion uses a silicon interposer with twice the connection density and four times the bandwidth of competing interposer technologies. Since each M1 Max has a die area of roughly 432 mm², the UltraFusion interposer itself has to span over 864 mm². That puts it in the realm of AMD and Nvidia's enterprise GPUs featuring HBM (High Bandwidth Memory).
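As a quick sanity check on that figure, here is a minimal back-of-the-envelope sketch. The 432 mm² per-die figure is the one cited above; the only assumption is that the interposer must at least cover the footprint of the two dies placed side by side:

```python
# Back-of-the-envelope lower bound on the UltraFusion interposer area.
# Assumption: the interposer must at least span both M1 Max dies side by side;
# the 432 mm^2 per-die figure is the one cited in the article.
m1_max_die_area_mm2 = 432
num_dies = 2

interposer_area_mm2 = num_dies * m1_max_die_area_mm2
print(f"Interposer area (lower bound): {interposer_area_mm2} mm^2")  # 864 mm^2
```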

Another advantage of UltraFusion is that developers won't need to rewrite their code: at the system level, the Mac sees the dual-die SoC as a single processor.

Built on TSMC's 5-nanometer process, the M1 Ultra has 114 billion transistors, a 7x increase over the original M1. It can support up to 128 GB of unified memory with a memory bandwidth of 800 GB/s, made possible by its dual-die design. It includes 16 performance cores with 48 MB of L2 cache and four efficiency cores with 4 MB of L2 cache, while the GPU can have up to 64 cores. It also sports a 32-core Neural Engine that can execute up to 22 trillion operations per second to accelerate machine-learning tasks.
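Most of those headline numbers are simply double the M1 Max's figures, since the Ultra is two Max dies joined together. A minimal sketch of that doubling follows; the M1 Max figures below are assumed from Apple's published specifications, not taken from this article:

```python
# The M1 Ultra's headline specs are essentially two M1 Max chips combined.
# M1 Max figures are assumed from Apple's published specifications.
m1_max = {
    "transistors_billion": 57,
    "max_unified_memory_gb": 64,
    "memory_bandwidth_gb_s": 400,
    "performance_cores": 8,
    "efficiency_cores": 2,
    "gpu_cores": 32,
    "neural_engine_cores": 16,
}

# Doubling every figure reproduces the M1 Ultra numbers quoted above
# (114B transistors, 128 GB memory, 800 GB/s, 16 P-cores, 4 E-cores,
#  64 GPU cores, 32-core Neural Engine).
m1_ultra = {key: value * 2 for key, value in m1_max.items()}
print(m1_ultra)
```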

Digitimes reports that Apple's M1 Ultra SoCs use TSMC's CoWoS-S (chip-on-wafer-on-substrate with silicon interposer) 2.5D interposer-based packaging process. Nvidia, AMD, and Fujitsu have used similar technologies to build high-performance processors for datacenters and HPC (high-performance computing).

Taiwanese chipmaker TSMC has a newer alternative to CoWoS-S in its InFO_LSI (integrated fan-out with local silicon interconnect) technology for ultra-high-bandwidth chiplet integration. It uses localized silicon interconnects instead of large, expensive interposers, similar to Intel's EMIB (embedded multi-die interconnect bridge).

It is believed that Apple chose CoWoS-S over InFO_LSI because the latter was not ready in time for the M1 Ultra. Apple may have played it safe by opting for a proven but more expensive solution over a cheaper, less mature technology.

The M1 Ultra-equipped Mac Studio will be available starting March 18, with a starting price of $3,999 that includes 64 GB of unified memory and a 1 TB SSD.


 
Another advantage of UltraFusion is that developers won't need to rewrite their code: at the system level, the Mac sees the dual-die SoC as a single processor.
Oh yeah? We first saw this "advantage" on PCs in 2005. Pretty typical that Apple makes something the PC has done for 16 years and it feels revolutionary when not-so-tech-savvy morons accept all of Apple's BS propaganda.
 
...Am I missing something?
Besides it being easier to manufacture both the max and ultra (as it's essentially just 1 die design + connections for both now), why is this a "game changer"?

What's the big difference between this, and a chip that is just manufactured 2x bigger?

Or is it just that a chip has the stats listed (and the focus isn't so much that it's interconnected)?
 
...Am I missing something?
Besides it being easier to manufacture both the max and ultra (as it's essentially just 1 die design + connections for both now), why is this a "game changer"?

What's the big difference between this, and a chip that is just manufactured 2x bigger?

Or is it just that a chip has the stats listed (and the focus isn't so much that it's interconnected)?
Like I said, it's not. AMD did it already in 2005 and Intel probably the same year (won't even bother to check).

There is no major difference between that and a bigger monolithic chip.

This news item has pretty much zero value.
 
...Am I missing something?
Besides it being easier to manufacture both the max and ultra (as it's essentially just 1 die design + connections for both now), why is this a "game changer"?

What's the big difference between this, and a chip that is just manufactured 2x bigger?

Or is it just that a chip has the stats listed (and the focus isn't so much that it's interconnected)?
You are definitely missing something lol. Creating larger chips is more expensive and more likely to fail, meaning that yields will always be better when making smaller dies. Almost all chip designers are moving to a multi-die setup or "chiplet" way of making chips to get past performance, yield and supply barriers.

This "UltraFusion" technology is just Apples equivalent to AMD's "Infinity fabric".
 
You are definitely missing something lol. Creating larger chips is more expensive and more likely to fail, meaning that yields will always be better when making smaller dies. Almost all chip designers are moving to a multi-die setup or "chiplet" way of making chips to get past performance, yield and supply barriers.

This "UltraFusion" technology is just Apples equivalent to AMD's "Infinity fabric".
So nothing missed, as I already mentioned it being easier to manufacture......
 
...Am I missing something?
Besides it being easier to manufacture both the max and ultra (as it's essentially just 1 die design + connections for both now), why is this a "game changer"?

What's the big difference between this, and a chip that is just manufactured 2x bigger?

Or is it just that a chip has the stats listed (and the focus isn't so much that it's interconnected)?


Because the GPUs act as one, this has not been seen before!! AnandTech has a good overview of this, check it out!!
 
If Apple's silicon didn't only run macOS natively, then we hardware enthusiasts who build our own systems would have another true option besides Intel, AMD and Nvidia when designing our next rig.
If it didn't only run macOS, it wouldn't exist, and it would be impossible to know how it would perform running Windows. You're just dreaming with that comment.
 
So nothing missed, as I already mentioned it being easier to manufacture......
Not just ease of manufacturing, having chip interconnects like this "UltraFusion" can allow you to bring memory modules (HBM) onto the package for more memory. It's also scalable, so you could make an enormous single chip that would only fit in a large desktop, but it wouldn't work in a tablet or a laptop as it would be too big. With a "chiplet" approach you could just have 1 chip in your tablet, 2 in a laptop, 6 in a larger system etc. It gives you far more flexibility and minimises waste.
 
Been awhile since Apple was first at something.
They arent "first" here and nowhere in the article or anywhere else is anyone claiming they are. Are you just spamming an article about Apple with the usual pre-defined troll posts without actually reading the article?

But you are also wrong: the M1 is very recent, and that's the first time someone has made an ARM CPU capable of performing as well as or better than desktop x86 equivalents. The M1 will go down in history for being the first at what it is. At least, it will amongst us nerds!
 
Not just ease of manufacturing, having chip interconnects like this "UltraFusion" can allow you to bring memory modules (HBM) onto the package for more memory. It's also scalable, so you could make an enormous single chip that would only fit in a large desktop, but it wouldn't work in a tablet or a laptop as it would be too big. With a "chiplet" approach you could just have 1 chip in your tablet, 2 in a laptop, 6 in a larger system etc. It gives you far more flexibility and minimises waste.
They're not using it to connect more than 2 dies for now (which doesn't change the game much). And because of that, we're not sure how well it would practically scale either (as it would add distance and complexity beyond the smaller chiplets).

So, at best it sounds like they're the first to have it available in a consumer device... well, unless it's already made its way out of datacenter-focused chips lol
 
The thing with Apple, Coca-Cola, McDonald's, Red Bull, Disney etc.

is how successful they are at marketing - selling a narrative.
Buy a phone from a plucky little company.
Apple really cares about you.
Apple is NOT gathering info on you - to make billions in selling advertising off you ($35 billion - from advertising companies at least).
Apple is the kindest, greenest company in the world.
Apple cares about its workers.
Apple's policies are all for the consumer.
Apple is an American company - and really cares about America.

I don't mind having a tick (swoosh??) on my shirt - but I'm not going to have a massive NIKE emblazoned across the front. Plus I get it - for many people their phone gives them positive feelings, their RTX 3090 makes them happy - but at least recognise the techniques companies use. The USA is calling out Russia at the moment, stating its techniques and the actions to come. I would guess other magicians get a good feeling seeing 5 techniques combined and done really well (plus they already love magic) - most of us adults expect the magic trick to work and know it's not "magic", so is our pleasure any more than that of those in the know?
We lie to our children because we want to make the experience wonderful - eg making a big effort to show that Santa came - someone has probably put horse droppings next to their chimney. But know when it's done solely to exploit or cheat you.
 
The iPhone was introduced 15 years ago. So, as he said, it's been a while since Apple was first at something.
The iPhone is STILL first though... the 13 came out 6 months ago... the iPad is first in tablets... the latest models just came out...
first in profitability, overall market cap....

you understand now?
 
I love how bent out of shape people get about a minor player in the computer industry. Well, minor except for making hugely profitable products.

Anyway, the value here is to show Qualcomm and any other ARM licensee what's possible within that design space if those players could get off their duffs and actually make something not only competitive, but also forward-looking. Maybe someday we'll see that and have an alternative to Intel-AMD. I doubt it though as QC/Sammy are still making money and seem too complacent to want to excel, or even compete in this space.

For now that just leaves Apple.
 
They're not using it to connect more than 2 dies for now (which doesn't change the game much). And because of that, we're not sure how well it would practically scale either (as it would add distance and complexity beyond the smaller chiplets).

So, at best it sounds like they're the first to have it available in a consumer device... well, unless it's already made its way out of datacenter-focused chips lol
Who's claiming their "UltraFusion" tech is the first?
 
If Apple's silicon didn't only run macOS natively, then we hardware enthusiasts who build our own systems would have another true option besides Intel, AMD and Nvidia when designing our next rig.

I have some good news for you: there are a number of ARM laptops you can buy now, and more are coming out this year.

They're about two to three years behind the M1, so give it time and you'll have high-end desktop i9-class CPU performance in a laptop with all-day battery life, yes, all-day battery life, that costs $700, not $2,000.

Microsoft is working with ARM and Android, so the OS will be better built for it in the coming years.
 