Ex-Intel engineer says "abnormally bad" Skylake QA prompted Apple switch to ARM

Meh, if Apple wants to fall into obscurity again, that is their choice. A lot of PC programs got ported because Apple ran on x86, which made the job easier; now you're going to get a bunch of mobile ports that are feature-limited. Sure, Adobe and a few others will support it with full-featured programs, but it will be the PowerPC wasteland all over again for Apple owners.
That is so true. If anyone remembers Macs before x86, there was almost no software on the Mac. With x86 it was much easier for companies to port their software to macOS on the same architecture.
Now they are moving to a much weaker architecture, one that is not directly compile-compatible with x86... companies have spent years porting software to ARM. There is a reason that ARM versions of software are much weaker than their x86 counterparts.
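To make "not directly compile-compatible" concrete: a lot of performance-sensitive software uses architecture-specific SIMD intrinsics, so a port is more than a recompile. A minimal, hypothetical C sketch (the function name and code are illustrative, not from any real product) of the per-architecture branching such code needs:

```c
/* Hypothetical example: adding four floats at once, with a separate
   code path per architecture. Not taken from any real port. */
#if defined(__x86_64__)
#include <emmintrin.h>          /* SSE intrinsics, x86 only */
void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}
#elif defined(__aarch64__)
#include <arm_neon.h>           /* NEON intrinsics, ARM only */
void add4(const float *a, const float *b, float *out) {
    float32x4_t va = vld1q_f32(a);
    float32x4_t vb = vld1q_f32(b);
    vst1q_f32(out, vaddq_f32(va, vb));
}
#else
/* Portable scalar fallback for any other architecture. */
void add4(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; i++) out[i] = a[i] + b[i];
}
#endif
```

Autovectorizing compilers and portable fallbacks soften this, but every hand-tuned SSE/AVX path still has to be rewritten for NEON by hand, which is part of why ARM builds tend to lag their x86 counterparts.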
 
What Arm can come from switching, I say.

Anyway, not sure why they wouldn't have looked at AMD; it's not like Zen 2 just came out, and Zen 3 will be much better again.
 
Except ARM isn't standardized, at least not at the level that's required for PC-style hardware. What you're suggesting is an entirely embedded world where every product uses what amounts to a custom solution. That's unlikely to happen. Notice how game consoles have switched from custom solutions to mostly off-the-shelf ones?

ARM is likely to take some market share for sure, but it's unlikely to cause x86 to "die". The same reasons that make it "good enough" also work against forcing a wholesale switch away from x86, even if it becomes more efficient and faster (since x86 will also be "good enough", but with a vastly larger installed software base).

I think people who say these types of things simply don't understand the colossal undertaking involved in moving everything a company might be using to a new architecture. Heck, it's possible PC gaming is a big enough market alone to keep x86 alive in some form or another.
Exactly. The doom of x86 has been predicted forever. There has been so much R&D going into the architecture to ensure that x86 continues into the future. The gap between RISC (ARM) and CISC (x86) is blurring; ARM is becoming more CISC-like and will eventually have to grapple with the same challenges x86 has been fighting for years.
 
Where did you get 20%? Last time I looked it was at around 8%.
Read my entire post you quoted; it wasn't that long:
Except ARM isn't standardized, at least not at the level that's required for PC-style hardware. What you're suggesting is an entirely embedded world where every product uses what amounts to a custom solution. That's unlikely to happen. Notice how game consoles have switched from custom solutions to mostly off-the-shelf ones?

ARM is likely to take some market share for sure, but it's unlikely to cause x86 to "die". The same reasons that make it "good enough" also work against forcing a wholesale switch away from x86, even if it becomes more efficient and faster (since x86 will also be "good enough", but with a vastly larger installed software base).

I think people who say these types of things simply don't understand the colossal undertaking involved in moving everything a company might be using to a new architecture. Heck, it's possible PC gaming is a big enough market alone to keep x86 alive in some form or another.
Most people don't even have a standard computer in their house anymore, and if they do, it's really only used by one person for work or school. All our recreational needs can be fulfilled by embedded chips in smart TVs, phones and tablets. Office software can run on ARM no problem, and if things like Photoshop and Pro Tools switch to ARM, only enthusiasts and specific applications will still require x86.
 
Most people don't even have a standard computer in their house anymore, and if they do, it's really only used by one person for work or school.

Could you provide a source for this? I know that the plural of anecdote is not data, but I'm thinking here... my grandma has a desktop she uses about 14 hours a day (ha!), my dad has a desktop, and my stepmom has a laptop. My in-laws share a desktop. My sister-in-law has a laptop. My siblings all have at least one desktop or laptop, and most of their kids do, too. I've got a desktop and a few laptops, and my wife has a laptop. My neighbors all have a laptop per adult and some have desktops. Same with all of my friends. We all use them regularly, game together, play games over Zoom/Jackbox/Discord, etc. I'm just confused because I actually can't think of a single person I know who doesn't have a "standard computer" in their house. Almost all of them run Intel, though I and some of the people I've influenced have AMD chips, as do my tech-savvy friends. I know one or two people with Mac laptops. None of that means anything; it could be a silo. Just wondering if you have a link or source for "most people" not having a "standard computer".
 
Could you provide a source for this? I know that the plural of anecdote is not data, but I'm thinking here... my grandma has a desktop she uses about 14 hours a day (ha!), my dad has a desktop, and my stepmom has a laptop. My in-laws share a desktop. My sister-in-law has a laptop. My siblings all have at least one desktop or laptop, and most of their kids do, too. I've got a desktop and a few laptops, and my wife has a laptop. My neighbors all have a laptop per adult and some have desktops. Same with all of my friends. We all use them regularly, game together, play games over Zoom/Jackbox/Discord, etc. I'm just confused because I actually can't think of a single person I know who doesn't have a "standard computer" in their house. Almost all of them run Intel, though I and some of the people I've influenced have AMD chips, as do my tech-savvy friends. I know one or two people with Mac laptops. None of that means anything; it could be a silo. Just wondering if you have a link or source for "most people" not having a "standard computer".

Agreed. A few people I knew switched from their laptop to a tablet; after a year they had a laptop again. Phones and tablets are great for quick browsing, but beyond that you are really hamstrung. I couldn't type out an email or make memes on my phone; I need a keyboard and MS Paint for that.
 
Agreed. A few people I knew switched from their laptop to a tablet; after a year they had a laptop again. Phones and tablets are great for quick browsing, but beyond that you are really hamstrung. I couldn't type out an email or make memes on my phone; I need a keyboard and MS Paint for that.
Exactly. Since day one, I have always found that iPads are not suited for any serious work. The lack of a mouse is a great handicap. iSheep the world over heralded it as the second coming despite that. It has taken Apple many years, but they have finally realized the folly of their design and lately have added limited mouse support to iPads. How can anyone reasonably consider constantly reaching all over the screen to click something ergonomic, when using a mouse is nothing more than a flick of the wrist while resting the arm on the desk? What about a 32" monitor with a full-size mechanical keyboard? I have owned pretty much every generation of iPad and many Android tablets (mostly crap), but they will never come close to replacing my docked mobile workstation.
 
Could you provide a source for this? I know that the plural of anecdote is not data, but I'm thinking here... my grandma has a desktop she uses about 14 hours a day (ha!), my dad has a desktop, and my stepmom has a laptop. My in-laws share a desktop. My sister-in-law has a laptop. My siblings all have at least one desktop or laptop, and most of their kids do, too. I've got a desktop and a few laptops, and my wife has a laptop. My neighbors all have a laptop per adult and some have desktops. Same with all of my friends. We all use them regularly, game together, play games over Zoom/Jackbox/Discord, etc. I'm just confused because I actually can't think of a single person I know who doesn't have a "standard computer" in their house. Almost all of them run Intel, though I and some of the people I've influenced have AMD chips, as do my tech-savvy friends. I know one or two people with Mac laptops. None of that means anything; it could be a silo. Just wondering if you have a link or source for "most people" not having a "standard computer".
I mean, what you gave me is just about as anecdotal as what I would give you. Where I live, hardly anyone has a dedicated computer unless they need it for something such as work, creative purposes, or PC gaming. Otherwise they consider it a waste of money. When I have friends over, some people look at my PC and say, "wtf is that?"

I couldn't live without my PC, but for many people I know their phone IS their PC.

I'm having a difficult time finding total revenue for PC sales, but here they spent $522 billion on smartphones (not including tablets) in 2018. If you can find the amount of money spent on PCs in 2018 and it is greater than that of smartphones, then I will gladly concede. A few sources I looked at estimated PC gaming at between $132 billion and $135 billion. Granted, that's not the whole market. However, once you start googling you'll find how difficult proper statistics on this subject are to find.

Otherwise, we're arguing on anecdotal evidence. I did my part; it's your turn to do some work on this.

EDIT:
I'd like to bring up another point: I never said that smartphones and tablets could replace everyone's computers, just that many people I know don't own a PC or need one. There are many laptops, Windows 10 laptops included, that have ARM chips in them. I'm talking about ARM chips replacing x86 chips because they're already commonplace, cheaper, and more power efficient. Further, I won't be switching to ARM, because I'm a PC enthusiast and love gaming and Linux.

Don't try to change the subject to touchscreens replacing keyboards. This is about ARM computing overtaking x86 computing with Apple accelerating that transition with their own ARM chips in their products.

BTW, I posted this from my phone
 
The main reason could be that Intel has more security holes than Swiss cheese. You can make your computer safe in all other aspects, but if the CPU itself is the biggest hole, there's no way to guarantee safety. However, ARM is slower than AMD and Intel CPUs, especially in single-threaded tasks, so certain apps may become much slower in the future.
 
"world’s fastest supercomputer is now the Arm-based" which arms are used? Left ones or the right ones? I might be rude, but I think if it's laziness if you can't write an architecture name properly and articles shouldn't be written like that.

Hello, English teacher here... if it's spelled with a capital A as in "Arm", then that's not referring to a limb, now is it? Besides that, language without context is meaningless. The context always establishes the connotations and meaning of the words we choose to use. In the context of this article, NOBODY would be thinking the author is talking about arms as limbs.

Unless you have a hidden agenda in trying to make the author look bad? Like you're from a rival site... I can't see any other reason why someone would post such an asinine comment.
 
Read my entire post you quoted; it wasn't that long:

Most people don't even have a standard computer in their house anymore, and if they do, it's really only used by one person for work or school. All our recreational needs can be fulfilled by embedded chips in smart TVs, phones and tablets. Office software can run on ARM no problem, and if things like Photoshop and Pro Tools switch to ARM, only enthusiasts and specific applications will still require x86.

Where do you live where most people don't have computers?

I live in Toronto, and I assure you most people here have at least one computer (i.e., desktop/laptop) per household. With the quarantine, a lot of people went out and bought more than one per household because children need them for school during this lockdown. The Ontario government is also buying thousands more laptops to send to families that can't afford them for their children (not the majority; 90% of my students have laptops). A new laptop can be bought for $450 USD here, and the average household income in Toronto is $60k USD, so $450 USD is very affordable for a family.
 
Where do you live where most people don't have computers?

I live in Toronto, and I assure you most people here have at least one computer (i.e., desktop/laptop) per household. With the quarantine, a lot of people went out and bought more than one per household because children need them for school during this lockdown. The Ontario government is also buying thousands more laptops to send to families that can't afford them for their children (not the majority; 90% of my students have laptops). A new laptop can be bought for $450 USD here, and the average household income in Toronto is $60k USD, so $450 USD is very affordable for a family.
I live in Pittsburgh. And, actually, pretty much everyone has a computer; it just isn't a desktop or a laptop. The point that I'm driving at is that the majority of our computing is already being done on ARM. A phone, tablet, smart TV or gaming console is a computer.

And as I said before, since people don't seem to be reading my posts: PEOPLE DON'T BUY COMPUTERS ANYMORE UNLESS THEY NEED THEM FOR WORK, SCHOOL OR CREATIVE PURPOSES. The fact that you said most people in quarantine went out and bought computers because their kids were staying home from school shows that THEY DIDN'T NEED DEDICATED COMPUTERS BEFORE.

Since the majority of computing that the general public needs is already being done on ARM, most people don't need laptops or desktops. Almost half the people in my mountain bike club don't have a computer. Remember the phrase "the best camera is the one you have with you"? Computing is becoming very similar to that. I use my phone FAR MORE than I do my desktop. A friend of mine is a real estate agent; she almost exclusively uses her phone for work and does so through the company app.

I also don't understand why people keep ignoring the point that I'm trying to make. Everyone keeps shifting the conversation to "who doesn't have a computer" and "phones and tablets aren't replacements for computers". I'm not talking about replacing laptops or desktops; I'm talking about how basically everything we do today is already done on ARM, and moving toward it in the laptop and desktop space will hardly affect anyone outside of SPECIFIC USES.

This is the most frustrating thread I've ever spent time replying in.
 
I live in Pittsburgh. And, actually, pretty much everyone has a computer; it just isn't a desktop or a laptop. The point that I'm driving at is that the majority of our computing is already being done on ARM. A phone, tablet, smart TV or gaming console is a computer.

And as I said before, since people don't seem to be reading my posts: PEOPLE DON'T BUY COMPUTERS ANYMORE UNLESS THEY NEED THEM FOR WORK, SCHOOL OR CREATIVE PURPOSES. The fact that you said most people in quarantine went out and bought computers because their kids were staying home from school shows that THEY DIDN'T NEED DEDICATED COMPUTERS BEFORE.

Since the majority of computing that the general public needs is already being done on ARM, most people don't need laptops or desktops. Almost half the people in my mountain bike club don't have a computer. Remember the phrase "the best camera is the one you have with you"? Computing is becoming very similar to that. I use my phone FAR MORE than I do my desktop. A friend of mine is a real estate agent; she almost exclusively uses her phone for work and does so through the company app.

I also don't understand why people keep ignoring the point that I'm trying to make. Everyone keeps shifting the conversation to "who doesn't have a computer" and "phones and tablets aren't replacements for computers". I'm not talking about replacing laptops or desktops; I'm talking about how basically everything we do today is already done on ARM, and moving toward it in the laptop and desktop space will hardly affect anyone outside of SPECIFIC USES.

This is the most frustrating thread I've ever spent time replying in.
Ah, I see, and with that HUGE qualifier I agree. That's different from what you were saying in the post I quoted, though, and it's quite an important detail.
 
Ah, I see, and with that HUGE qualifier I agree. That's different from what you were saying in the post I quoted, though, and it's quite an important detail.
And which qualifier was that? Did you just read that one post, or the whole thread?
 
I don't get it. Ryzen has been around for three years now. If what Piednoël said is true, I'd figure Apple would have chosen to use AMD CPUs. Someone should've done their job and posed that as a question to Piednoël.

ARM computing is faster and more efficient than x86. Ryzen is x86. So whilst Ryzen is better than Intel, it's still not very good compared to ARM.

Also, Apple want to design their chips themselves and use TSMC to produce those designs. Intel have their own fabs, but AMD do not. If Apple used AMD, they would be paying AMD loads of money for chips made by the same company that already makes Apple's chips. Why add a middleman?
 
ARM computing is faster and more efficient than x86. Ryzen is x86. So whilst Ryzen is better than Intel, it's still not very good compared to ARM.

Faster? Let's see: single-thread performance (single CPU): x86 is much faster. Multi-core performance (single CPU): x86 is much faster. So where is Arm "faster"? If you're referring to a single supercomputer, it won't take long until x86 is back on the top spot.

When it comes to efficiency, comparing sub-3 GHz CPUs against 4+ GHz CPUs is pretty pointless.

It's x86 btw, not X86.
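For what it's worth, the rough relation behind that efficiency point is dynamic power P ≈ C·V²·f, and supply voltage has to rise with clock speed, so power grows much faster than linearly with frequency. A toy C calculation, using made-up voltages purely for illustration (not measured figures for any real chip), shows why a 2.5 GHz part can look far more "efficient" than a 4.5 GHz part simply because of its operating point:

```c
/* Back-of-the-envelope dynamic-power model: P ~ C * V^2 * f.
   All constants below are illustrative assumptions. */
#include <stdio.h>

int main(void) {
    double c = 1.0;                  /* normalized switched capacitance */
    double freqs[] = { 2.5, 4.5 };   /* clock speeds in GHz */
    double volts[] = { 0.80, 1.25 }; /* assumed supply voltage at each clock */

    for (int i = 0; i < 2; i++) {
        double power = c * volts[i] * volts[i] * freqs[i];
        printf("%.1f GHz @ %.2f V -> relative power %.2f, perf/W %.2f\n",
               freqs[i], volts[i], power, freqs[i] / power);
    }
    return 0;
}
```

With these (assumed) numbers the 2.5 GHz part comes out several times better in perf/W without its core design being any better, which is exactly why cross-clock efficiency comparisons mislead.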
 
ARM computing is faster and more efficient than x86. Ryzen is x86. So whilst Ryzen is better than Intel, it's still not very good compared to ARM.

Also, Apple want to design their chips themselves and use TSMC to produce those designs. Intel have their own fabs, but AMD do not. If Apple used AMD, they would be paying AMD loads of money for chips made by the same company that already makes Apple's chips. Why add a middleman?

Clock for clock ARM is slower, and desktop versions are just now getting to where x86 was a decade ago; the very best ARM chips can compete with Core 2 and Phenom. Yes, that's not terrible, and great for a toy like a Chromebook or your phone, but it's worthless for actual work. Time = money, and if x86 can do a job in 3 hours but ARM takes 5, that is unacceptable and a net loss in profit. I feel bad for studios; on new ARM Macs their rendering times will be like 2006 all over again. I'd imagine they are going to hold onto what they've got as long as they can.
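The time-is-money arithmetic is easy to sketch. The hourly machine cost below is a made-up assumption, and the 3-hour/5-hour render times are the hypothetical figures from the post:

```c
/* Toy version of the render-time cost argument above.
   The hourly cost is an assumed figure, purely for illustration. */
#include <stdio.h>

int main(void) {
    double hourly_cost = 60.0; /* assumed cost per machine-hour */
    double x86_hours   = 3.0;  /* hypothetical x86 render time */
    double arm_hours   = 5.0;  /* hypothetical ARM render time */

    printf("x86 job cost: $%.0f\n", x86_hours * hourly_cost);
    printf("ARM job cost: $%.0f\n", arm_hours * hourly_cost);
    printf("extra cost per job: $%.0f\n",
           (arm_hours - x86_hours) * hourly_cost);
    return 0;
}
```

Multiply that per-job gap across a render farm and a production schedule and the loss compounds quickly.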
 
And which qualifier was that? Did you just read that one post, or the whole thread?
Just read the first post, not the whole thread. Anyway, just qualify and be detailed in your first post to avoid repeating yourself so much. I mean, you certainly don't have to, it is just the internet after all, but you did write: "This is the most frustrating thread I've ever spent time replying in."

Cheers.
 
Okay, well OS X has closer to a 20% market share and is most often chosen by artsy types over Windows, so I bet the percentage of Adobe's creative software users on Mac compared to Windows is much higher than that.


I don't know where you got 2%

Us "artsy types" who work on video started switching to Windows years ago when they neutered Final Cut, forcing us to go to Adobe. This was accelerated by the trash can Mac Pro and the exiling of nVidia, as we weren't locked in on Apple software any more. I had some hope I might switch back with the new Mac Pro - which is beautiful - but at 3x the launch price of the previous model, and a CPU and GPU not remotely competitive with AMD/nVidia machines 1/3 the cost I can't see myself going back.

Us "artsy types" rely on a LOT of third party plugins for Adobe, most devs of which can't afford to support existing x86 customers and ARM customers at the same time. Most are cross Mac-Windows, so if it's a choice between keeping our plugins with Windows with cheaper, faster hardware, and sticking with the Mac, we'd likely go Windows (I already did).

Unfortunately for Apple/Adobe it's a chicken-and-egg problem: these devs won't support ARM till their users switch, and their users won't switch until the devs support it. Expect further increases in the use of Surface products and BYOD desktops in the video space. A LOT of big houses moved to DaVinci Resolve on Linux too.
 
Clock for clock ARM is slower, and desktop versions are just now getting to where x86 was a decade ago; the very best ARM chips can compete with Core 2 and Phenom. Yes, that's not terrible, and great for a toy like a Chromebook or your phone, but it's worthless for actual work. Time = money, and if x86 can do a job in 3 hours but ARM takes 5, that is unacceptable and a net loss in profit. I feel bad for studios; on new ARM Macs their rendering times will be like 2006 all over again. I'd imagine they are going to hold onto what they've got as long as they can.

I'm not quite sure where you got this (and this is coming from someone who is very much not a fan of the ARM switch), but it is very much wrong. High-end ARM mobile chips have been competing with desktop products from around 2014 for some time; this is very much not 2006-era stuff. Emulation is definitely going to be a hit, but if you remember that CPU performance increases are exponential, ARM is miles past 2006.

You also don't really know much about studios; most rendering is significantly GPU-accelerated, a lot more than it was in 2006 or even 2014, so the CPU is less of a factor.

ARM, x86, etc.: the architecture isn't what's important here; it's the optimisation for the task. Intel has a wide customer base; Apple doesn't. If Apple can optimise the chips for each machine to be better tailored to their customers (as they have with the iPhone) AND (this is the part I am very sceptical of) get devs to recode their software, they'll be fine.

The main issue is that AMD has a poor track record and Intel has been lost for 5 years. Apple so far hasn't lost the crown once in smartphones. It's going to be an interesting few years, but if devs don't switch quickly enough, creators will lose patience and look elsewhere. It's out of Apple's hands now: they can almost certainly make a CPU which matches what AMD/Intel have; it's all about what software they can get to run on it.
 
The main issue is that AMD has a poor track record and Intel has been lost for 5 years. Apple so far hasn't lost the crown once in smartphones. It's going to be an interesting few years, but if devs don't switch quickly enough, creators will lose patience and look elsewhere. It's out of Apple's hands now: they can almost certainly make a CPU which matches what AMD/Intel have; it's all about what software they can get to run on it.

They could. But then, AMD and Intel have both been making high-performance CPU cores for decades. Apple so far hasn't made a single one. So I wouldn't hold my breath.
 
They could. But then, AMD and Intel have both been making high-performance CPU cores for decades. Apple so far hasn't made a single one. So I wouldn't hold my breath.

Doesn't matter; they already have the architecture in place, and it's competitive in IPC, frequency and core count. It's easy to scale this up, in the same way a 3300X from AMD is just a 3950X with fewer cores.
 
Doesn't matter; they already have the architecture in place, and it's competitive in IPC, frequency and core count. It's easy to scale this up, in the same way a 3300X from AMD is just a 3950X with fewer cores.

It's barely competitive on IPC and not on frequency. Apple's current 7nm chip barely exceeds 2.5 GHz; that's very far from the 4.5 GHz range we are talking about with AMD and Intel chips. Basically, Apple needs to build a CPU with a much longer pipeline to get higher clocks, and a longer pipeline makes it much more difficult to keep IPC high.

So far Apple has not demonstrated that they can even design a CPU with a long pipeline, let alone a long-pipeline CPU with high IPC at the same time. Even Intel (NetBurst) and AMD (Bulldozer) had great difficulty reaching high clock speeds without sacrificing IPC. Additionally, a longer pipeline automatically means more power consumption, so there is a long road ahead. It's nothing like "we just take the current cores and add more of them".
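The trade-off being described can be put in back-of-the-envelope numbers with the classic model CPI ≈ CPI_base + branch_frequency × mispredict_rate × flush_penalty, where the flush penalty grows with pipeline depth. All the constants in this C sketch are illustrative assumptions, not measurements of any real core:

```c
/* Toy model of the pipeline-depth trade-off: a longer pipeline
   clocks higher but pays more cycles per branch mispredict flush.
   Every number here is an illustrative assumption. */
#include <stdio.h>

int main(void) {
    double base_cpi        = 0.25; /* ideal CPI of a wide (4-wide) core */
    double branch_freq     = 0.15; /* fraction of instructions that branch */
    double mispredict_rate = 0.05; /* mispredictions per branch */

    struct { const char *name; double ghz; int flush_cycles; } designs[] = {
        { "short pipeline, low clock", 2.5, 10 },
        { "long pipeline, high clock", 4.5, 20 },
    };

    for (int i = 0; i < 2; i++) {
        double cpi = base_cpi
                   + branch_freq * mispredict_rate * designs[i].flush_cycles;
        double ipc = 1.0 / cpi;
        /* effective throughput in billions of instructions per second */
        printf("%s: IPC %.2f, perf %.2f GIPS\n",
               designs[i].name, ipc, ipc * designs[i].ghz);
    }
    return 0;
}
```

Depending on the constants you assume, the longer pipeline can still win on raw throughput, but the model makes the IPC erosion (and the extra speculative work, hence power) visible, which is the difficulty being argued here.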
 
It's barely competitive on IPC and not on frequency. Apple's current 7nm chip barely exceeds 2.5 GHz; that's very far from the 4.5 GHz range we are talking about with AMD and Intel chips. Basically, Apple needs to build a CPU with a much longer pipeline to get higher clocks, and a longer pipeline makes it much more difficult to keep IPC high.

So far Apple has not demonstrated that they can even design a CPU with a long pipeline, let alone a long-pipeline CPU with high IPC at the same time. Even Intel (NetBurst) and AMD (Bulldozer) had great difficulty reaching high clock speeds without sacrificing IPC. Additionally, a longer pipeline automatically means more power consumption, so there is a long road ahead. It's nothing like "we just take the current cores and add more of them".

You need to go speak to cmaier on MacRumors. He's the guy who designed Opteron. Look at the pipeline for the A12; it's wider than desktop chips'. The length can be increased significantly without affecting IPC. The reason the clocks are where they are is that power consumption at 2.5 GHz is a fraction of the equivalent Intel chip's; that can easily be raised with active cooling, something we haven't seen yet on an Apple ARM chip.

They literally just need to add more cores and power. It's all already there. Again, the main issue is software.
 
You need to go speak to cmaier on MacRumors. He's the guy who designed Opteron. Look at the pipeline for the A12; it's wider than desktop chips'. The length can be increased significantly without affecting IPC. The reason the clocks are where they are is that power consumption at 2.5 GHz is a fraction of the equivalent Intel chip's; that can easily be raised with active cooling, something we haven't seen yet on an Apple ARM chip.

They literally just need to add more cores and power. It's all already there. Again, the main issue is software.

I was talking about pipeline length. It hardly makes sense to build a long pipeline when the clock target is so low. As already noted, a longer pipeline makes it harder to maintain high IPC. Power consumption is low because that chip is designed for low power, whereas AMD's and Intel's high-performance cores are not.

Basically, Apple's chips are like AMD's Cat cores or Intel's Atom. Neither would become a high-performance core just by adding active cooling, more power and more cores. For those uses AMD has Zen and Intel has Core.

As said, Apple hasn't proven ANYTHING about making high-performance cores yet. A hotter mobile chip is not the answer.
 