Apple's M1 Ultra is a performance beast, but definitely not an RTX 3090 killer

nanoguy

Bottom line: Apple is likely correct when it claims the M1 Ultra is the most powerful consumer-grade desktop chip, but that's partly because it's hard to compare an SoC with 114 billion transistors against anything currently available in the x86 space. Early benchmarks suggest its performance per watt is stellar, but GPU performance falls short of dedicated GPUs like Nvidia's RTX 3090.

When Apple introduced its M1 Ultra chipset, it made a big deal about its performance and energy efficiency, extolling the benefits of the chiplet design and the UltraFusion packaging and interconnect technology that made it possible.

To be fair, a lot of engineering clearly went into both the hardware and software aspects of the new chipset, as Apple essentially fused together two M1 Max chips and paired them with a bunch of high-bandwidth unified memory. It also made the two chips recognizable in software as a single chip, which will no doubt simplify app development.

However, as is the case with many of Apple's performance claims (and, for that matter, those of any company competing in the hardware space), they don't usually tell the full story. Companies like to cherry-pick benchmark results to make their products look better than the competition's, which is why independent reviews are an important resource to consult before deciding what works for you.

Apple chose to compare the M1 Ultra performance against Intel's Core i9-12900K and Nvidia's RTX 3090, two of the fastest and most power-hungry consumer parts in the desktop space at the moment. The company claimed during the launch event that its new chipset is able to slightly beat the RTX 3090 with a much more modest power consumption, but it didn't say which benchmarks were used.

Now that the first independent reviews are out, things are starting to come into focus. The Verge ran a series of benchmarks including NPBench Python and Geekbench, as well as some Puget and gaming tests, and the results were interesting.

The M1 Ultra's CPU definitely outperforms the M1 Max's, as well as the 28-core Intel Xeon W found in the specced-out Mac Pro, but its GPU doesn't quite reach RTX 3090 levels of compute performance.

The Verge used a PC equipped with an Intel Core i9-10900, 64 gigabytes of RAM, and an Nvidia RTX 3090 GPU and obtained a Geekbench 5 Compute score of over 215,000 points. By comparison, the M1 Ultra in the Mac Studio was only able to muster a little over 83,000 points, or 102,156 points when using Metal.

That's still an impressive result for an integrated GPU, but it falls well short of Nvidia's flagship. The Verge also looked at gaming performance in Shadow of the Tomb Raider. Apple is notorious for not optimizing its hardware for gaming workloads, and the M1 Ultra is no exception: while it achieved a respectable 108 frames per second at 1080p and 96 frames per second at 1440p, Nvidia's dedicated GPU held a lead of 18 to 31 percent.
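As a rough illustration of what that lead translates to, here's a quick back-of-the-envelope sketch in Python. The split of the 18 to 31 percent range across the two resolutions is an assumption on our part, since the review doesn't break it down, so treat the output as a range rather than a measurement.

# Back-of-the-envelope: what an 18-31% RTX 3090 lead over the M1 Ultra's
# Shadow of the Tomb Raider results would imply in absolute frame rates.
# The per-resolution split is an assumption, not a figure from the review.
m1_ultra_fps = {"1080p": 108, "1440p": 96}
lead_low, lead_high = 0.18, 0.31

for resolution, fps in m1_ultra_fps.items():
    low, high = fps * (1 + lead_low), fps * (1 + lead_high)
    print(f"{resolution}: implied RTX 3090 range {low:.0f}-{high:.0f} fps vs. M1 Ultra {fps} fps")

That works out to roughly 127 to 141 fps at 1080p and 113 to 126 fps at 1440p for Nvidia's card under The Verge's test conditions.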

A lot of that difference could come down to the 100-watt power envelope of the M1 Ultra GPU and the way it shares memory bandwidth with the CPU. For reference, the RTX 3090 alone has a TGP of 320 watts, and the Core i9-12900K can add up to 241 watts on top of that. We saw a similar story with other Apple Silicon chipsets such as the M1 Pro, which is great in terms of performance per watt but struggles to keep up with more power-hungry hardware from the x86 space.
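Those nominal power figures make the efficiency gap easy to put into numbers. Below is a minimal performance-per-watt sketch in Python using the Geekbench 5 Compute scores quoted above; the 100-watt GPU envelope and 320-watt TGP are rated values rather than measured draw, so this is an illustration, not a benchmark.

# Rough GPU performance-per-watt comparison using the figures in this article.
# Power values are rated envelopes (not measured draw), so treat the output
# as a ballpark illustration only.
gpus = {
    "M1 Ultra (Geekbench, Metal)":   {"score": 102_156, "watts": 100},  # ~100 W GPU envelope
    "RTX 3090 (Geekbench, Compute)": {"score": 215_000, "watts": 320},  # 320 W TGP
}

for name, gpu in gpus.items():
    print(f"{name}: ~{gpu['score'] / gpu['watts']:.0f} Geekbench points per watt")

On those assumptions the M1 Ultra lands around 1,000 points per watt against roughly 670 for the RTX 3090, which is exactly the "stellar efficiency, but not outright faster" picture the benchmarks paint.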

The key takeaway from the M1 Ultra benchmarks we've seen so far is that Apple created a small desktop computer that approaches the level of performance found in the significantly larger, more expensive Mac Pro. The $6,199 specced-out Mac Studio may seem like an expensive bit of kit, but for professionals who depend on macOS and apps optimized for Apple Silicon it might look like a bargain next to a slightly more powerful, $14,000 Mac Pro.


 
I think Apple is probably not happy that it had to use a GPU as a point of comparison at all: at least for the Studio line, it would like you to forget games even exist for GPUs and just focus on video production or ML workloads instead.
 
That's not how it works... And even if it did, you could also multiply the price by at least 3.

Sure, but I'm not talking about price. I'm just saying that Apple has done amazing work with the silicon architecture, and we are only at the beginning. Wait and see for the Mac Pro. I'm sure they will release something like the M1 Ultra but doubled (like the M1 Max to M1 Ultra move), and they will lead in performance at a fraction of the power consumption of the competition.

Edit: at the price of an arm and a leg.
 
Can't really just compare a single game...

This is the problem the M1 Pro and Max have: they work fine in a few games but are total crap in almost anything else.

The only impressive thing is Apple's GPU scaling between two dies. This is something AMD and Nvidia have been talking about for years as a replacement for the old SLI and CrossFire, but we have yet to see it become reality.
 
The problem is the M1 SoCs aren't made with gaming as a priority; they are for creating. Apple comparing some arbitrary numbers is not a fair representation of what the SoC can do. Apple also didn't say WHAT benchmarks it used to compare against a 3090. I guarantee you it wasn't a game benchmark; it was some form of editing or number crunching, not producing polygons for a tree in Tomb Raider.

Wait until real reviewers (not The Verge, who intentionally put everything about the M1 out of context) do true comparison benchmarks, then you can yell about performance vs. price vs. power consumption.
 
I think there's a target market, already on a Mac-based workflow, who are going to be very pleased with the Mac Studio. There may even be a slice of formerly PC-based users who find the package appealing.

The one piece I'd want to know more about is how modular the internal hardware is. I've seen just a couple of reviews, and neither was able to even open the case. At this price point, I'd want to know I had some flexibility re: upgrade path, re-using components, etc. That may not be very representative of the target market, though.
 
I'm not talking about games but performance
Yet in your first comment you said "the game is over for Nvidia."

You mention the word game. However, let's be a bit more generous and take its general meaning: "game over" implies everything, so therefore not performance alone, but also Nvidia's GPUs' number one reason to exist (no, not mining): gaming.
I think Nvidia is not worried about Apple just yet in the gaming sphere. Its GPUs do more than just FPS, plus they play nearly all PC games.

What will be telling going forward is mobile games, an industry that keeps getting bigger. I'd put Nintendo on that side, as an M1 could easily spit out Nintendo games in an emulator; on the other side are PC GPUs, the PS5, and Xbox.

No serious gamer will buy a Mac just yet (solely for gaming). It's a nice bonus to play compatible games alongside a video encoding/creation day job, and those portable games are getting better and better. But do Mac users even want microtransaction games?
 
I don't understand the point of this article. An Apple CPU and chipset that are optimized for macOS cannot 'beat' an Nvidia GPU in a benchmark or a game that are both optimized for x86... Definitely fanboi clickbait.
 
Apple, as usual, is less than honest about benchmarks. (To quote Gordon's now-famous "In WHAT?!?!") On the CPU side I knew the claims were very much true after seeing the small M1. The 3090 claim was barely believable from the start, and if you make a claim like that, you'd better be darn sure it's 101% on point. OK, in one place the 3090 is worse than the M1 Ultra, but only in power draw.

Of course, if we glue two (already glued) M1 Ultras into an "M1 Extreme", then yes, technically we'll get a 3090, but... at the same power draw and... twice the price. On top of that, Metal vs. CUDA: how can you even compare the two ecosystems? One is like paddling a lone canoe across the Pacific, and the other is the same route done in a fleet of cruise ships with millions of passengers on board combined. There's a reason there are no games on Mac and the 3D space is barely alive outside big studios on C4D. Apple ****ed everybody with Metal; they screwed Nvidia users across the board after High Sierra by dropping all support, so the rest of the computing world decided to return the favor. No self-respecting large game studio will go the Cupertino way for making games. The Fruity Cult makes too many arbitrary decisions (like dropping support), which can turn a huge dev investment into a pile of horse manure overnight.

The Studio is a great product, especially the CPU, but it cannot compete with a PC for the top spot in graphics compute, be it games or 3D. Not yet.
 
The RTX 3090 is 20% faster for 400% more power. Apple just needs to glue two M1 Ultras together and the game is over for Nvidia.

That post is so uninformative it ain't even funny. The M1 Max is already a HUGE, expensive as **ck chip in transistor count. You can fit four 3090s in the same transistor count as the M1 Max. Four of them. Actually, you don't even need 3090s. You can fit around 40 12900Ks (with their iGPUs included). Freaking 40 of them. I'm pretty sure 40 12900Ks would beat the crap out of the M1 Max in both CPU and GPU compute.
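For what it's worth, here's a quick sanity check of that transistor math using publicly quoted counts (Intel doesn't disclose Alder Lake's transistor count, so the 12900K part of the claim can't be checked the same way):

# Sanity check on the transistor-count comparison, using publicly quoted
# figures, in billions of transistors.
transistors_bn = {
    "M1 Max": 57.0,
    "M1 Ultra": 114.0,
    "RTX 3090 (GA102)": 28.3,
}

for chip in ("M1 Max", "M1 Ultra"):
    ratio = transistors_bn[chip] / transistors_bn["RTX 3090 (GA102)"]
    print(f"{chip}: ~{ratio:.1f}x the transistors of a GA102")

That comes out to about 2x for the M1 Max and about 4x for the M1 Ultra, so the "four 3090s" figure lines up with the Ultra's 114 billion transistors rather than the Max's 57 billion.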

Also, something is wrong with the SOTR numbers. My 3090 gets 196 fps at 1440p, yet The Verge manages fewer fps at 1080p :O
 

Yours is uninformative as well. First, you are comparing the transistor count of a fully integrated computer, with GPU, Neural Engine, CPU, memory, and various controllers in ONE CHIP. To be fair, you should compare the transistor count of the CPU part of the M1 against your Intel CPU. Also, I hope you are directly connected to a nuclear plant to feed those power-hungry Intel CPUs :p
Second, you claim the M1 is very expensive; this is false, the M1 Ultra has a manufacturing cost of approximately $350 (I forgot the source, but this number seems accurate).
The Studio M1 Ultra's price has nothing to do with the manufacturing cost; it's about the Apple marketing department targeting creative companies willing to spend that money on a computer.
 
Look at the power-hungry Alder Lake cores: at 35 W it beats the M1 at the same wattage :O
 

Attachment: 12630 35w.png
Second, you claim the M1 is very expensive; this is false, the M1 Ultra has a manufacturing cost of approximately $350 (I forgot the source, but this number seems accurate).
That's only the processed silicon price, assuming 100% yields (and with a chip that large, yields do suck).

For comparison, a 64-core Zen 2 Epyc's manufacturing cost using the same formula is about $160.

Considering the very poor yields, the M1 Ultra is ultra expensive to produce: the estimated cost is at least $500 per chip. A realistic cost for the 64-core Zen 2 Epyc is around $200, and that's a 64-core powerhouse with 256 MB of L3 cache...
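The "same formula" isn't spelled out in the thread, but the standard back-of-the-envelope version is wafer cost divided by good dies per wafer, with yield falling off exponentially with die area. Here is a minimal sketch of that kind of calculation in Python; the wafer price, defect density, and die areas are illustrative assumptions (none of them are published figures), so the point is the shape of the math, not the exact dollar amounts.

import math

# Minimal cost-per-good-die sketch. The wafer price, defect density and die
# areas below are illustrative assumptions, not published figures.
WAFER_COST_USD = 17_000     # assumed price of a leading-edge 300 mm wafer
WAFER_DIAMETER_MM = 300
DEFECTS_PER_MM2 = 0.001     # assumed average defect density

def gross_dies_per_wafer(die_area_mm2):
    """Approximate dies on a round wafer, with a simple edge-loss correction."""
    d = WAFER_DIAMETER_MM
    return math.pi * (d / 2) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2)

def cost_per_good_die(die_area_mm2):
    """Spread the wafer cost over the dies that survive a Poisson yield model."""
    yield_rate = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)
    good_dies = gross_dies_per_wafer(die_area_mm2) * yield_rate
    return yield_rate, WAFER_COST_USD / good_dies

# An M1 Ultra uses two M1 Max-class dies (roughly 430 mm^2 each); a 64-core
# Zen 2 Epyc spreads its cores across eight ~74 mm^2 chiplets plus an I/O die
# made on an older, cheaper node (ignored here to keep the sketch simple).
for label, area, count in (("M1 Max-class die", 430, 2), ("Zen 2 chiplet", 74, 8)):
    y, cost = cost_per_good_die(area)
    print(f"{label} ({area} mm^2): yield ~{y:.0%}, ~${cost:,.0f} each, ~${count * cost:,.0f} for {count}")

With these placeholder numbers the big dies lose roughly a third of their candidates to defects while the small chiplets yield above 90 percent, which is the whole reason large monolithic dies are so much more expensive per good chip than a pile of small chiplets.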
 
I mean, the comment section appears to consist of kiddies who only use their rigs for Call of Duty.

Yes, macOS doesn't have many games. You wanna know why? Because Apple doesn't give a rat's @ss about gaming.

Their GPUs are used by graphic designers, video editors, and AI developers. Not by gamers.
 
The RTX 3090 is 20% faster for 400% more power. Apple just needs to glue two M1 Ultras together and the game is over.

The RTX 3090 is almost two years old! I doubt Nvidia has been twiddling their thumbs instead of working on the RTX 4090! Besides, now that they've managed to secure 5 nm chips for themselves, I can see them pulling off RTX 3080-level performance in a laptop!
 