Inside the Apple M1 is an incredibly quirky GPU

mongeese

In context: Apple keeps the inner workings of the M1 family of processors secret from the public, but dedicated developers have been reverse-engineering it to create open source drivers and a Linux distro, Asahi Linux, for M1 Macs. In the process, they've discovered some cool features.

In her efforts to develop an open source graphics driver for the M1, Alyssa Rosenzweig recently found a quirk in the render pipeline of the M1's GPU. She was rendering increasingly complicated 3D geometries, and eventually arrived at a bunny that made the GPU bug out.

Basically -- and please note that this and everything else I'm about to say is an oversimplification -- the problem begins with the GPU's limited memory bandwidth. It's a powerful GPU, but like the A-series iPhone SoCs it shares an ancestry with, it takes shortcuts to stay power-efficient.

Instead of rendering straight into the framebuffer like a discrete GPU might, the M1 renders each frame in two passes: the first processes the vertices, and the second does everything else. Obviously, the second pass is much more intensive, so between the passes, dedicated hardware segments the frame into tiles (mini-frames, basically) and the second pass runs one tile at a time.
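
To make the tile-at-a-time idea concrete, here's a rough sketch in C of what that second pass looks like conceptually. The tile size, the shade_pixel() stand-in, and everything else here are invented for illustration; this is not Apple's hardware interface or Rosenzweig's driver code.

```c
#include <stdint.h>
#include <string.h>

#define TILE 32  /* made-up tile size */

/* Stand-in for whatever per-pixel work the second pass actually does. */
static uint32_t shade_pixel(int x, int y)
{
    return (uint32_t)(x ^ y);
}

/* Second pass, sketched: walk the frame tile by tile, shade each tile into a
 * small reusable buffer (the analogue of fast on-chip tile memory), then copy
 * the finished tile out to the full framebuffer in ordinary memory. */
void second_pass(uint32_t *fb, int width, int height)
{
    uint32_t tilebuf[TILE * TILE];

    for (int ty = 0; ty < height; ty += TILE) {
        for (int tx = 0; tx < width; tx += TILE) {
            int w = (width  - tx < TILE) ? width  - tx : TILE;
            int h = (height - ty < TILE) ? height - ty : TILE;

            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++)
                    tilebuf[y * TILE + x] = shade_pixel(tx + x, ty + y);

                /* write one finished row of the tile back to the framebuffer */
                memcpy(&fb[(size_t)(ty + y) * width + tx],
                       &tilebuf[y * TILE], (size_t)w * sizeof(uint32_t));
            }
        }
    }
}
```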

Tiling works around the lack of memory resources, but to piece the tiles back together into a full frame later, the GPU needs to keep a buffer of all the per-vertex data produced by the first pass. Rosenzweig found that whenever this buffer overflowed, the render wouldn't work. See the first bunny, above.
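
In driver terms, the bookkeeping looks something like the sketch below: per-vertex data gets appended to a fixed-size buffer between the two passes, and once the scene produces more data than the buffer has room for, the append fails. The struct and function here are made up for illustration, not real driver or hardware interfaces.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Fixed-capacity buffer holding the first pass's per-vertex output until the
 * per-tile second pass can consume it. Sized up front; a complicated enough
 * scene (say, a detailed bunny) can produce more data than it can hold. */
struct param_buffer {
    unsigned char *data;
    size_t         size;  /* capacity chosen when the buffer is allocated */
    size_t         used;
};

/* Append one vertex's worth of data; report failure when the buffer is full,
 * which is roughly the situation that produced the broken bunny render. */
static bool pb_push(struct param_buffer *pb, const void *vertex_data, size_t len)
{
    if (pb->used + len > pb->size)
        return false;  /* overflow */
    memcpy(pb->data + pb->used, vertex_data, len);
    pb->used += len;
    return true;
}
```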

In one of Apple's presentations, it's explained that when the buffer is full, the GPU outputs a partial render -- i.e., half the bunny. In Apple's software, the buffer in question is called the parameter buffer, a name seemingly taken from Imagination's PowerVR documentation.

Imagination is a UK-based company that, like Arm, designs processors that it licenses to other companies. Apple inked a deal with the company at the beginning of 2020 that allows it to license a broad range of Imagination's IP. It's clear that the M1, which was brought to market at the end of 2020, uses Imagination's PowerVR GPU architecture as a basis for its GPU.

Anyway, back to the bunny. As you might have guessed, the partial renders can be added together to create a render of the whole bunny (but with a dozen extra steps in between, of course).
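
The recovery path can be sketched like this: when an append to the parameter buffer fails mid-scene, the driver kicks off a partial render of everything binned so far, resets the buffer, and keeps feeding geometry in, and the partial renders accumulate into the finished frame. Again, the names and types are stand-ins for illustration (pb_push() and struct param_buffer come from the sketch above), not the real interfaces.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

struct param_buffer;  /* as sketched above */

/* Stand-in declarations; their bodies aren't interesting for this sketch. */
bool pb_push(struct param_buffer *pb, const void *vertex_data, size_t len);
void pb_reset(struct param_buffer *pb);
void flush_partial_render(struct param_buffer *pb, uint32_t *fb);

/* Feed per-vertex data through the parameter buffer. Whenever it fills up,
 * render what fit so far (half a bunny), empty the buffer, and carry on; the
 * partial renders add up to the whole frame by the time we're done. */
void draw_scene(struct param_buffer *pb, uint32_t *fb,
                const void *vertex_data, size_t vertex_count, size_t stride)
{
    const unsigned char *verts = vertex_data;

    for (size_t i = 0; i < vertex_count; i++) {
        const void *v = verts + i * stride;

        if (!pb_push(pb, v, stride)) {
            flush_partial_render(pb, fb);  /* partial render of what fit */
            pb_reset(pb);
            pb_push(pb, v, stride);        /* retry into the now-empty buffer */
        }
    }

    flush_partial_render(pb, fb);          /* final flush completes the frame */
}
```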

But this render still isn't quite right. You can see artifacts on the bunny's foot. It turns out that the frame's data is split between a color buffer and a depth buffer, and the depth buffer misbehaves when partial renders are loaded back into it.
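
Conceptually, the fix is about what happens to the color and depth buffers at the start of each partial render: clear them only for the first one, and load the existing contents back in for every one after that, so pixels that were already drawn (like the bunny's foot) aren't stomped on. The enum and struct below are invented to illustrate that idea; the real fix is a bit of driver configuration, as the next paragraph notes.

```c
#include <stdbool.h>

/* What to do with a buffer's contents at the start of a render pass. */
enum load_op {
    LOAD_CLEAR,     /* start from a clear value */
    LOAD_EXISTING   /* read the previous partial render back in */
};

struct pass_config {
    enum load_op color_load;
    enum load_op depth_load;
};

/* The first partial render of a frame starts from clear buffers; every later
 * one must load what's already been drawn, for depth as well as color, or the
 * earlier partial renders get overwritten and artifacts appear. */
static struct pass_config config_for_partial_render(bool first)
{
    struct pass_config cfg;
    cfg.color_load = first ? LOAD_CLEAR : LOAD_EXISTING;
    cfg.depth_load = first ? LOAD_CLEAR : LOAD_EXISTING;
    return cfg;
}
```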

A reverse-engineered configuration from Apple's driver fixes the problem, and then you can finally render the bunny (below).

It's not just Rosenzweig's open-source graphics driver for the M1 that jumps through all these hoops to render an image: this is just how the GPU works. Its architecture probably wasn't designed with desktop-class 3D rendering in mind, but despite that, Apple has turned it into something that can rival the latest discrete GPUs, if not quite surpass them, as Apple claims. It's cool.

For a more in-depth (and technically accurate) explanation of bunny rendering, and for other explorations into the M1, be sure to check out Rosenzweig's blog and the Asahi Linux website.

Masthead credit: Walling


 
Well, I think that's more a problem with the particular rendering engine (glmark2 = OpenGL) than with the GPU itself. Blender has had full support for *Metal* on M1-powered Macs since v3.1; before that it only used the CPU. Even so, at the CPU level it was quite impressive considering the power envelope and the work done. Nobody sane will use OpenGL when Metal is supported. Everybody knows that OGL is a convoluted mess and that Metal, despite limited support because of its "Mac-ishness", is in the same ballpark as Vulkan or DX12.
 
Well, I think that's more a problem with the particular rendering engine (glmark2 = OpenGL) than with the GPU itself. Blender has had full support for *Metal* on M1-powered Macs since v3.1; before that it only used the CPU. Even so, at the CPU level it was quite impressive considering the power envelope and the work done. Nobody sane will use OpenGL when Metal is supported. Everybody knows that OGL is a convoluted mess and that Metal, despite limited support because of its "Mac-ishness", is in the same ballpark as Vulkan or DX12.
And yet another "expert" fails to figure this out? Weird. Use of deprecated software should never be the basis for any kind of conclusion, other than "on deprecated software, such and such happens."
Therefore it also sounds like the conclusion is wrong: it is not an issue with the M1, but with deprecated software. Sorry, it is a pet peeve of mine.
 
Well, I think that's more a problem with the particular rendering engine (glmark2 = OpenGL) than with the GPU itself. Blender has had full support for *Metal* on M1-powered Macs since v3.1; before that it only used the CPU. Even so, at the CPU level it was quite impressive considering the power envelope and the work done. Nobody sane will use OpenGL when Metal is supported. Everybody knows that OGL is a convoluted mess and that Metal, despite limited support because of its "Mac-ishness", is in the same ballpark as Vulkan or DX12.

I don't think you understand what they are actually doing. They aren't stuffing around trying to get OpenGL to work on Apple Silicon for the LOLs; they are trying to reverse engineer Apple's GPU drivers so they can implement support for the GPU in Asahi Linux, which supports OpenGL and Vulkan. By using a known tool like glMark that works on Linux, they can try to understand how the GPU works at the silicon level and what software calls are needed to drive it. The GPU doesn't directly understand Metal; the Apple drivers in macOS translate those Metal commands into ones the GPU actually uses. Obviously Apple doesn't document how that works at the hardware level, so the Asahi developers need to reverse engineer that level so they can write alternative Linux drivers which translate the OpenGL and Vulkan commands that Linux uses into something the Apple GPU can understand.

They aren't criticizing the GPU in Apple Silicon, but rather uncovering some of its unique elements so their drivers can drive it correctly.

 
If anything, this beast will make me switch to a Mac.
I can assure you that the M1 Ultra doesn't even surpass a 3060 in any relevant metric. We have M1-equipped MacBook Pros at work (bought exactly because we needed something for video editing and light 3D work) and they just can't keep up with a similarly priced PC (I'm talking about huge differences).

The best thing they do is leverage the advanced process node to extend battery life, which is nice for me since I do programming on it. The GPU just isn't up to snuff.
 
And yet another "expert" fails to figure this out? Weird. Use of deprecated software should never be the basis for any kind of conclusion, other than "on deprecated software, such and such happens."
Therefore it also sounds like the conclusion is wrong: it is not an issue with the M1, but with deprecated software. Sorry, it is a pet peeve of mine.

Yet another one fails to understand what they are doing. Metal is available only under macOS. These guys are enabling the GPU on Linux and are writing the driver. Metal on Mac runs on top of a driver; on Linux, the equivalents are Vulkan and OpenGL.
Linux has pretty good OpenGL drivers.
 
I had a system with the Intel GMA500 -- which they pretended was an Intel part but was really PowerVR. It was indeed a very odd GPU.
 
And yet another "expert" fails to figure this out? Weird. Use of deprecated software should never be the basis for any kind of conclusion, other than "on deprecated software, such and such happens."
Therefore it also sounds like the conclusion is wrong: it is not an issue with the M1, but with deprecated software. Sorry, it is a pet peeve of mine.
Nobody here is using deprecated software; the Mesa drivers supporting OpenGL and Vulkan are fully up to date and supported. They are not running macOS, so there is no concern over what Apple decides to support or not.

The conclusion is not wrong in any way. It is an issue with the M1 GPU (not really a bug as such: there are bits on the GPU to support continuing rendering where it left off, which wouldn't exist if it were a true bug), and the drivers must deal with it whether they are for Metal, OpenGL, Vulkan, or Direct3D.
 
If anything, this beast will make me switch to a Mac.
I bought my first Mac, a Mac mini M1, and the hardware and video capabilities are insane!

macOS, though, is very inferior to Windows or many Linux distributions, making some basic tasks complicated. I thought it was because I had never worked with macOS before, but then I searched for solutions and found out that macOS itself is the issue.

First, test a macOS device to see if it suits you; as I already have the Mac and some video apps, I'll keep it and hope that the next macOS solves those issues (I doubt it...).

A MacBook Air M2 running Windows would be fantastic 🤩
 