Apple M1 Pro Review: Is it really faster than Intel/AMD?

This thread is a good example of why Apple is successful. The majority of people, including the actual review, were fairly favorable to Apple: they've created a good product. It's expensive, sure, and it lacks both flexible configuration options and a more mature ecosystem, but if you just want to edit video and are OK doing so in Final Cut, there's currently no better option once you take performance and battery life into account.

So why is basically nobody allowed to be MILDLY CRITICAL of some aspects, like 400 bucks for a tiny RAM upgrade and similarly inflated SSD prices? Those are valid complaints: part of the reason x86 has been successful is things like M.2 slots, so you can choose whether you want capacity, performance, or a balance of the two in your drive, plus SODIMM slots, a wider array of CPU choices, and more permutations of all of the above.

For anybody else it's perfectly valid to point out "Yes, you give up unbeatable battery life, but you save money by having far more configuration choices." But nope: Apple enthusiasts just have to come and pile onto even people saying "I want one, I just can't afford one," as if their honor somehow depended on a tech product.
 
Anyone here who can shed more light on the Chromium compile results? Is there something about the silicon that's really good for compiling (if so, what)? Does the Mac have a better compiler? Maybe the optimization settings aren't directly comparable? Or is there a difference in how much code needs to be compiled (maybe macOS ships more of the needed libraries natively)?

I don't know much about any of these but that particular graph seemed different enough that I got to wondering what was behind it.
Compilation benefits from a fast memory subsystem (200 GB/s here) and large caches, so the Chromium compile result is expected.
The same thing (memory speed) is responsible for the great 7-Zip compression performance; decompression is more CPU-bound, so the M1 falls behind a bit there.
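You can see that asymmetry yourself: 7-Zip's LZMA algorithm is available in Python's standard `lzma` module. This is just an illustrative sketch (exact timings are machine-dependent), but on most systems compressing takes far longer than decompressing the same data, since compression does the heavy dictionary searching:

```python
import lzma
import time

# Repetitive data compresses well and makes the asymmetry easy to see.
data = b"chromium compile unit " * 200_000  # ~4.4 MB

t0 = time.perf_counter()
packed = lzma.compress(data)        # search-heavy, memory-hungry
t1 = time.perf_counter()
unpacked = lzma.decompress(packed)  # comparatively light decode loop
t2 = time.perf_counter()

print(f"compress:   {t1 - t0:.3f}s")
print(f"decompress: {t2 - t1:.3f}s")
print(f"ratio:      {len(packed) / len(data):.4f}")
```

On any machine the compress time should dwarf the decompress time; the gap is what makes the two benchmarks stress different parts of the chip.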
 
A laptop review with no battery testing and no unplugged performance test? If you're not going to benchmark the actual workflow for the actual intended purpose, then what's the point?
 
For those who think Apple upgrades are expensive: LPDDR5 RAM at 200 GB/s basically doesn't exist on the Windows side, and there's no 8 TB NVMe drive that can do 7.5 GB/s transfer speeds that you can currently buy for less than $2,000.
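For what it's worth, the 200 GB/s figure checks out from first principles, assuming the widely reported 256-bit LPDDR5-6400 interface on the M1 Pro (peak theoretical bandwidth = bus width in bytes × transfer rate):

```python
bus_width_bits = 256       # M1 Pro memory interface width (widely reported)
transfers_per_s = 6400e6   # LPDDR5-6400 = 6400 MT/s

bandwidth_gbs = (bus_width_bits / 8) * transfers_per_s / 1e9
print(bandwidth_gbs)       # 204.8 (GB/s, peak theoretical)
```

Apple's 200 GB/s marketing number is just this peak rounded down; real sustained bandwidth will be somewhat lower.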

It's expensive, but you get what you pay for.
 
All the reviews of these M1 Macs, and nobody ever talks about all the issues with Rosetta. My company had to pull the M1s because they don't play nicely with developer tools, not to mention that the security stacks most companies use are completely incompatible with them.

Good luck with JAMF and other apple supported products. Freeze issues, applications not working, etc.

I am aware of quite a few companies struggling with M1s in their environment, and one had to pull all of them and look for a way to buy machines outside of Apple's lineup so they don't have M1s.

The M1 chipset has been nothing short of disastrous in the professional world. I recognize most personal use people would not experience some of this but as Apple tries to push into the corporate environment this has caused massive headaches.
We’ve rolled out hundreds, using Mosyle Fuse MDM and DEP, and not had any problems at all.
Perhaps your issue is with JAMF?
 
Apple is operating like a monopoly with this pricing, which I guess is fair enough considering the vast majority of MacBook buyers are existing Mac users.

And that kinda sums it up. It's fine when Apple does it.
 
You do know that Alder Lake chips are space heaters?

Maybe you should watch i5-12600K and i7-12700K reviews instead of an i9 with all limiters disabled + OC.

The i7-12700K hits 64°C under load in TechPowerUp's review with a Noctua U14S. The 5800X hits 74°C under load with exactly the same cooler and ambient, also at TechPowerUp.

Same fan RPMs too.

Sorry to burst your bubble.
 
> Maybe you should watch i5-12600K and i7-12700K reviews instead of an i9 with all limiters disabled + OC.
>
> The i7-12700K hits 64°C under load in TechPowerUp's review with a Noctua U14S. The 5800X hits 74°C under load with exactly the same cooler and ambient, also at TechPowerUp.
>
> Same fan RPMs too.
>
> Sorry to burst your bubble.
And the i5-12600K hits 66°C in the same test. So it's a win-win.
 
> And the i5-12600K hits 66°C in the same test. So it's a win-win.
Yeah, they're generally easy to cool at stock. Just because they have a good deal of OC headroom doesn't mean you need to OC; it's a nice option to have, though.

I'm not sure the people claiming Alder Lake is a space heater have used a Ryzen 5800X. That chip is not easy to cool, especially if you turn off the limiters and enable PBO.

The 5800X runs hotter than even the 5900X because all of its cores are on a single chiplet.
 
> Yeah, they're generally easy to cool at stock. Just because they have a good deal of OC headroom doesn't mean you need to OC; it's a nice option to have, though.
>
> I'm not sure the people claiming Alder Lake is a space heater have used a Ryzen 5800X. That chip is not easy to cool, especially if you turn off the limiters and enable PBO.
>
> The 5800X runs hotter than even the 5900X because all of its cores are on a single chiplet.
Yeah, setting it to 95 W, or even 65 W eco mode, works wonders and doesn't hurt performance too badly.
 
New tech is nice, but from what I can see, few people need more than what we already have. Laptops already last most if not all day; do you really need more? Efficiency is always worth pursuing though, so good on Apple. But at this point, aren't we arguing about minutiae, like whether the best peanut butter is crunchy or smooth?

To highlight this, I'd like to point out that Apple desktops now run on a mobile chip. The Apple POV is that's how amazing the M1 is (and they are amazingly efficient); my POV is that it's evidence of how little people need more power, with the exception of games. Even games are reaching the point where they too will not benefit from more power until we go photorealistic, and then we won't need more either. The point is, you're becoming the limit now, not the chips.
 
"But if Apple wants to win Windows users over to Mac, setting pricing so high isn’t the way to do it"

Pretty sure they don't... they sell plenty as it is...

Where are all the AMD shills who went on and on about power efficiency as to why Alder Lake was so terrible?

The thing is... in a laptop, power efficiency actually MATTERS! And Apple destroys AMD and Intel there...

I don't really use Macs.... but I'd be tempted to buy one next generation - once all the software is out of "beta" and works natively without Rosetta...
I bought one of the first M1 MB Air. It's great, for what I use it for. I have a gaming laptop but the MBA is perfect for travel and it does everything but high-end gaming. When I travel I'm usually not thinking about gaming at my destination so there's enough casual gaming on the Mac to keep me occupied if needed (or I just use my phone).

Everything I use runs fine. I am mostly an Office 365 app user and don't really have any high-end use cases. Which I think is typical for a lot of people. Sure, developers, photographers and graphic artist as examples, have higher performance needs than me.
 
> New tech is nice, but from what I can see, few people need more than what we already have. Laptops already last most if not all day; do you really need more? Efficiency is always worth pursuing though, so good on Apple. But at this point, aren't we arguing about minutiae, like whether the best peanut butter is crunchy or smooth?
>
> To highlight this, I'd like to point out that Apple desktops now run on a mobile chip. The Apple POV is that's how amazing the M1 is (and they are amazingly efficient); my POV is that it's evidence of how little people need more power, with the exception of games. Even games are reaching the point where they too will not benefit from more power until we go photorealistic, and then we won't need more either. The point is, you're becoming the limit now, not the chips.
I don't think most laptops last a day under full use and on Wi-Fi; it really depends on the configuration. There is a significant difference between my MacBook Air M1 and my HP Pavilion in terms of battery life. The HP might last three but no more than four hours. My gaming laptop? Ha, if it got anywhere close to two hours playing games, I'd be shocked. The MacBook will easily run 4-8 hours on Wi-Fi while driving a second monitor, doing Teams meetings and other office work. I know because I occasionally need to use the Mac's USB power cord to charge my Logitech keyboard.

But you are correct that we don't really need all that power, because we are shifting some of our computing into the cloud. I don't need a fast rendering laptop when I can ship the job off and have something much faster do the work. I think we're at the very beginning of doing this with gaming: we have streaming services, and they are not too bad. So imagine when we have tons of compute and graphics power in a device that can run 10-12 hours before needing a charge. That would be a game changer, IMHO.

People are using mobile devices more and more to access Internet content. Longer battery life will be even more important in the future.
 
Yeah, but it's a Mac.

...and the type of person who uses Apple products tends to be a bunch of insufferable wankers, who you just want to punch after five minutes of being in the same room as them.
What a well-thought-through analysis of the article, and upvoted to the top as well. I think we can see who the wankers are; sorry, had to stoop to your level.

I use both platforms and I prefer macOS. Each has its benefits. This new range of CPUs is a game changer, not just for individual users but more broadly: if the bulk of computer users got this major improvement in energy consumption, the impact on global energy use would be substantial.

Thank you for the article; however, it's flawed, as noted above by others. Correct the flaws and reassess in a more objective, less emotive manner. Let's do all the tests in a true head-to-head software matchup, on battery only, and see who performs.
 
Apple is years ahead of Intel and AMD in terms of performance per watt, that's for sure.

Apple SoCs completely destroy Android SoCs too; nothing new here. They have been doing that for years now.

It will be fun to see if Intel can win Apple back. I doubt it; however, Alder Lake and especially Raptor Lake will be a big step in the right direction.

With M1 Pro and Max it's going to be much harder, though.

I expect M2 on 3-4nm TSMC in 2022. Just like iPhone 14 series.
M2 will be based on the A15 Bionic used in the iPhone 13 - thus it will be built on TSMC's N5P node.

A16/M3 will be based on 3nm or 4nm, depending on whether TSMC can get 3nm working in time, and that model will benefit from process shrink.
 
While true, like @Irata said, it's a bit overstated: efficient x86 laptops are not *terribly* far behind, and while their performance isn't ideal, I don't think most professionals should rely on a single device for everything, both when they need horsepower and when they need mobility. It's still a single point of failure for your workflow if your fancy MacBook gets stolen or damaged, versus having an ultra-light 15 W-class laptop on the go and a moderate desktop at the office or home for when heavier crunching is needed.

Power efficiency is one of those things where more is almost always better, but there is such a thing as "good enough." Many laptops fail at this due to Nvidia's (and AMD's) seriously laughable power-efficiency numbers, but integrated solutions do fill that "good enough" niche for most people: long days (if not *full*-day battery) and good-enough performance on the road. In a pinch, a 5800U APU can even handle 720p gaming without issues, and 1080p for lighter titles, so you're covered unless you're in the curious position of being fully homeless yet able to afford a MacBook Pro.

But hey, given the prices, maybe "laptop hobo" might actually become a real thing soon.
Yeah, it is true.

For some reason TechSpot thought it would be a good idea to compare laptops while plugged in, even though the numbers for most of these models would fall off a cliff if unplugged from the wall.

The real performance of these x86 models comes not from the CPU but from the GPU, so calling them x86 laptops is almost a misnomer; they should be called Nvidia or AMD GPU laptops with an x86 compute engine.

As for the desktop bias: before I retired, we had pretty well established that all computers fail, and because of that we had this practice called "backing up", which lets you restore a broken, lost, or stolen computer from a copy of the data taken at an earlier date. You ought to try it: your desktop environment will someday become inoperative, and if you're not keeping at least two backups (I use three, with one offsite), you're doomed to lose data. Heck, even my 61 TB Thunderbolt-connected disk array is backed up this way.
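For anyone who wants a concrete starting point, here's a minimal sketch of that practice: dated, rotated archives you can then mirror offsite. The paths are hypothetical and the retention count is arbitrary; adjust both to your setup.

```shell
# Hypothetical source and backup locations -- adjust to taste.
SRC="$HOME/projects"
DEST="$HOME/backups"
mkdir -p "$SRC" "$DEST"

# One dated archive per run, so an accidental deletion today
# doesn't clobber yesterday's good copy.
STAMP=$(date +%Y-%m-%d)
tar -czf "$DEST/projects-$STAMP.tar.gz" -C "$HOME" projects

# Keep only the three newest archives (sync at least one offsite).
ls -1t "$DEST"/projects-*.tar.gz | tail -n +4 | while read -r old; do
    rm -- "$old"
done
```

For big trees a dedicated tool (rsync, restic, Time Machine, etc.) is a better fit; the point is simply versioned copies in more than one place.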

Now me: I love a laptop I can do full-bore work on in the middle of a Starbucks, and these laptops are so strong you can work anywhere, any time.
 
Not a shocker.
x86 is compatible with so many more systems and so much more software than any RISC chip.
And no matter how fast Apple silicon gets, I'm still not buying it! I'd have to use MacOS.
M1 is a purpose-built chip for professionals, and it's damn good at that in natively supported applications.

It's no x86 killer, that's for sure.
That's a legacy "I won't learn anything new" opinion: most OS-agnostic consumers, exposed to both, would immediately choose an M1 computer over a competing Windows computer because of the system's snappiness and efficiency (long battery life).

And no, these laptops are playing in rather stratospheric space: they successfully compete against the top-tier Windows laptops spec'd up to their level, but that's not where most of the computer market lives.

Most machines are waaayyyy down lower, in the territory the M1 occupies. The lower-end M1 (and soon-to-be M2) computers are what should worry AMD and Intel, especially with efficiency mandates coming down the pike, like the <50 kWh/year mandate that has appeared in California. These models are rapidly gaining market share; shortly after introduction, the M1 Mini became the best-selling PC in Japan.

The vast majority of computers sold are not high-end Ryzens or Core i9s; they're middling, pretty crappy computers like low-end Chromebooks or ultrabooks or their desktop equivalents. The M1s easily overpower these models in pretty much all common consumer metrics.

Even enterprises are abandoning the traditional Windows client/server model because it's just too hard to keep end-user machines working properly between user-installed software, lousy Windows maintenance, and iffy CPU vulnerability mitigations. Instead they're going for web apps and thin clients tied to SMB/CIFS shares, usually based on Linux kernels: machines they can rapidly wipe and replace should something go wrong.
 
> That's a legacy "I won't learn anything new" opinion: most OS-agnostic consumers, exposed to both, would immediately choose an M1 computer over a competing Windows computer because of the system's snappiness and efficiency (long battery life).
>
> And no, these laptops are playing in rather stratospheric space: they successfully compete against the top-tier Windows laptops spec'd up to their level, but that's not where most of the computer market lives.
>
> Most machines are waaayyyy down lower, in the territory the M1 occupies. The lower-end M1 (and soon-to-be M2) computers are what should worry AMD and Intel, especially with efficiency mandates coming down the pike, like the <50 kWh/year mandate that has appeared in California. These models are rapidly gaining market share; shortly after introduction, the M1 Mini became the best-selling PC in Japan.
>
> The vast majority of computers sold are not high-end Ryzens or Core i9s; they're middling, pretty crappy computers like low-end Chromebooks or ultrabooks or their desktop equivalents. The M1s easily overpower these models in pretty much all common consumer metrics.
>
> Even enterprises are abandoning the traditional Windows client/server model because it's just too hard to keep end-user machines working properly between user-installed software, lousy Windows maintenance, and iffy CPU vulnerability mitigations. Instead they're going for web apps and thin clients tied to SMB/CIFS shares, usually based on Linux kernels: machines they can rapidly wipe and replace should something go wrong.
Except that the M1 laptops cost north of $2,000 and a Chromebook costs south of $400. Don't tell me they're in competition.
 
> In this benchmark, the M1 Pro came last of the configurations we tested

Just so you know, the photogrammetry graph says "Higher is better". You should probably fix that.
 
> Yeah, it is true.
>
> For some reason TechSpot thought it would be a good idea to compare laptops while plugged in, even though the numbers for most of these models would fall off a cliff if unplugged from the wall.
>
> The real performance of these x86 models comes not from the CPU but from the GPU, so calling them x86 laptops is almost a misnomer; they should be called Nvidia or AMD GPU laptops with an x86 compute engine.
>
> As for the desktop bias: before I retired, we had pretty well established that all computers fail, and because of that we had this practice called "backing up", which lets you restore a broken, lost, or stolen computer from a copy of the data taken at an earlier date. You ought to try it: your desktop environment will someday become inoperative, and if you're not keeping at least two backups (I use three, with one offsite), you're doomed to lose data. Heck, even my 61 TB Thunderbolt-connected disk array is backed up this way.
>
> Now me: I love a laptop I can do full-bore work on in the middle of a Starbucks, and these laptops are so strong you can work anywhere, any time.

I don't necessarily disagree, but my only comment is that you're overstating the importance of a strong GPU quite a bit: yes, for gamers it's crucial, and video editors also need a strong GPU.

But for everybody else? Either integrated graphics (Vega 3 to 8 on AMD, Iris Pro on Intel) are sufficient, or the workload is actually CPU-dependent: lots and lots of professionals depend on CPU workloads, in fact far more than on GPU-based ones.

The M1 is of course exceedingly good at CPU tasks as well, not without caveats but pretty excellent, so this isn't a direct knock against it; the point is that it's possible to thrive with a laptop that doesn't have a dedicated GPU.
 
Nice, rounded review. It just shows that the Apple fans who were raving about how wonderful the M1 was were being very selective.
It's a lot of money if you're not processing photos/videos, and the new Intel chips do that well too.
Plus the power efficiency wasn't as amazing as I was led to believe, considering it runs on a better fab with highly optimised code; that $2,500 buys a lot of electricity, or a battery pack.
Most telling was Apple claiming nearly all programs run flawlessly, yet that is not the case, as stated here.
Windows/Linux are for general-purpose computing, with a huge library of software.

As for encoding, I'm interested to see the new Intel hardware.

That $2,000 GPU you'll buy for your Windows/Linux PC in a few years will probably have much, much better encoders than today's, with competition heating up, so you'll be able to encode 100x faster and play games. Nvidia must be working on a new encoder, since there's been no update for the 3000 series. Intel will be too.

Not sure about AMD.

To be fair, the M1 will be fine for YouTubers, family videos, local city adverts, etc. Serious Apple creatives will need to spend up on the Mac Pro, and $2,500 will seem cheap to them.

I think this report also uses some selective testing to make Intel/AMD look better. If you're comparing CPUs, why introduce a discrete GPU for the gaming benchmarks? Why not use Intel's built-in graphics, just like the M1? Of course I expect an Nvidia 3080 to blow away the GPU in the M1 SoC. Duh.

When comparing CPU-only tasks, the Apple SoC performs very well against Intel and AMD. And we're at roughly 1.5 generations of Apple silicon at this point, compared to what, 10th or 11th generation Intel? I'd say Intel had better be looking over their shoulder, because Apple is hot on their heels. Also, how do we know Apple won't integrate an external GPU, Nvidia or AMD, down the road?

At the end of the day, I think Apple has breathed some new life into their laptop business. After having used a MacBook Air M1 for just shy of a year now I can say this laptop is more than adequate for everyday personal and work tasks for a large percentage of people. As you said, creatives probably need something different but that is also true of Windows creative users. I don't think many creative people are using high powered laptops in mobile applications all that often due to the poor battery performance plus the need for external screens and other peripherals.

 
> Apple is [years] MONTHS ahead of Intel and AMD in terms of performance per watt that is for sure.

FTFY
The gains here are no more significant than a normal "tick or tock" release on any silicon. Apple just got their latest and greatest out of the lab a few months before Intel's next release. We see the same seesaw in ATI/Nvidia card releases.

It's hardly revolutionary or earth-shattering, as the review shows. It's a good processor: better in some cases, and not in others.
 