Intel boss Pat Gelsinger calls Arm's PC threat "insignificant"

I've seen this as a problem for a long time. I know teenagers who have no concept of what Wi-Fi is or why their devices work the way they do. On one hand, it's amazing we've reached the point where things just work without us having to think about them. However, there was a lot to be gained by having to think about things when you used them. I know people I once considered computer illiterate who know more about computers than Zoomers and Gen Alpha.

In the same way we are seeing a steep drop-off in skilled tradesmen, I think within the next 20 years we are going to see the same steep drop-off in people who are skilled in hardware design and programming.

At first being a nerd was uncool, then it became cool, then it became a career path, and now I feel like we're seeing a drop-off in people interested in going into IT. I find that really unfortunate. I was talking with my stepdad a few months ago about how cars as a hobby are going away and how tinkering with computers is becoming the new "car guy" hobby. He's 87 and said, "Maybe it's time I get a computer; it's getting so hard to find cars to work on." Keep in mind, he likes to restore trucks from the '60s-'80s as a hobby, but he just can't get them anymore.

I had no idea how to respond to him, but I find it really unfortunate that tinkering with PCs on a fundamental level is something people are becoming disinterested in. My biggest problem with that is I don't see it being replaced with anything. I guess the big thing now is "tinkering" with social media. It's a strange world we're living in.

Good post.

I like tinkering with computers. But the problem with computer technology is that it keeps changing. It's as ephemeral as electricity. Real machines are far more interesting because they have substance. However, the more a device is bound up with electricity, the more ephemeral it becomes. For example, a good pocket knife can last decades, but an equally expensive cellphone is only good for a few years.


 
I don't see what the fuss is about. A slightly faster ARM chip than an x86? Why is that a big deal? There are already light-years-faster chips (POWER10?) and they don't get advertised like this, right? So what's the fuss about? Wake me up when an ARM chip does laps around x86; I suppose that's when you could say throwing away 30 years of x86 backward compatibility is worth it, right? But isn't POWER10 worth it already today? ... Some things to think about :)
The big deal is that ARM can deliver the same performance using about a fifth of the power.
 
He should be worried (and probably is, I think this is a show of bravado.)

Intel CPUs are better on power use than they used to be, but still use quite a bit more power than similar-performance AMD models, and both of those use much more than ARM. Intel's current advantage is basically the traditional "Wintel", Windows + Intel.

But...

I used an ARM-based Chromebook several years back, with an Nvidia Tegra K1. I used the "Chrubuntu" setup to get Ubuntu onto an SD card, with Nvidia GPU drivers and even CUDA working on it. There's VERY little to say about it -- the desktop worked flawlessly; you'd go to install some package or other and there was an ARM version of it (several mainstream distros have full ARM ports, and with the Raspberry Pi being ARM, various packages are ported to ARM independent of the distros' porting efforts). Incredible battery life (22 hours under realistic use, and 12 hours running the quad cores at full tilt, like video encoding or compiling). And pretty good performance, although the current Qualcomm etc. ARMs are significantly faster.

I had x86/x86-64 emulation working 4 or 5 years ago with qemu, at about 1/4 to 1/2 native speed; apparently there are now better Intel-on-ARM solutions that are more like 1/2 to 3/4 (usually closer to 1/2).
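For anyone curious how that works on Linux, the usual route is qemu's user-mode emulation plus the kernel's binfmt_misc hooks. A minimal sketch, assuming a Debian/Ubuntu-flavored ARM distro (package names vary by distro, and the binary names and chroot path below are made up for illustration):

```shell
# Install the user-mode emulators plus binfmt registration
# (Debian/Ubuntu package names; other distros differ):
sudo apt install qemu-user-static binfmt-support

# binfmt_misc now routes x86-64 ELF binaries through qemu automatically,
# so a foreign binary can be launched as if it were native:
./some-x86_64-program    # hypothetical x86-64 binary

# Or invoke the emulator explicitly, e.g. pointing -L at an x86-64
# root filesystem so the binary finds its own loader and libraries:
qemu-x86_64 -L /srv/x86_64-root /srv/x86_64-root/bin/ls
```

The transparent binfmt route is what makes mixed-architecture setups feel seamless: package managers and scripts don't need to know some binaries are being emulated.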

From what I've heard, it's similar with Windows now -- emulation is roughly 1/2 speed (or in other words, double the power consumption). This could be a problem while there is very little ARM-native software. But from what I've gathered, if you get your project building under a recent Visual Studio version, you can end up with an ARM-native version. So unlike before, when the complaint was that so many Intel-only apps meant the machines didn't get the battery life they should, I think within months to a year you could have predominantly native apps.
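As a rough sketch of what that looks like in practice: recent Visual Studio ships ARM64 build tools as an optional component, and once a solution has an ARM64 platform added (e.g. via Configuration Manager), a native build is one MSBuild invocation. The solution name here is a placeholder:

```shell
# Build an existing solution's ARM64 configuration with MSBuild
# ("MyApp.sln" is hypothetical; the ARM64 platform must already
# exist in the solution's configuration):
msbuild MyApp.sln /p:Configuration=Release /p:Platform=ARM64
```

Projects with inline assembly or architecture-specific intrinsics need real porting work, but plain C/C++/.NET code often just recompiles.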

I've run desktop Linux over the years on DEC Alpha, MIPS (both an SGI and a DEC MIPS), HP PA-RISC, both PowerPC and IBM POWER systems, ARM, and several Intel-compatible models (Cyrix, IBM, AMD, and Intel). Running Linux on non-Intel has been relatively pain-free for years; the emulation is just icing on the cake for running Linux apps. My main interest was seeing if I could get Wine going for running games; I couldn't back then, but apparently that works perfectly fine now (both in Linux and Windows).

So, Microsoft could F' this up badly enough that these machines don't stay on the market for us Linux aficionados to purchase and enjoy. But I suspect enough ARM-native apps will come out fast enough that people will enjoy having a long-battery-life, cool-running laptop.

Edit: The other interesting system I saw for sale is an ARM workstation. It had something like 80 or 120 cores or some crazy number like that, with lots of disk connectivity, PCIe slots, DIMM slots, and everything else just like a PC workstation. The tech journalist asked, "Well, OK, that's a lot of CPU power, but ARM SoCs are not known for their great GPUs -- what are you doing for a GPU?" They pointed out that it's a desktop with PCIe slots: they put a 4090 in the base config, with options for a Quadro or whatever they call the workstation-class cards these days. Nvidia has had Linux ARM drivers for close to 10 years, so you could just slap on whatever ARM distro and install the Nvidia driver on it just like on x86.
 
I don't see what the fuss is about. A slightly faster ARM chip than an x86? Why is that a big deal? There are already light-years-faster chips (POWER10?) and they don't get advertised like this, right? So what's the fuss about? Wake me up when an ARM chip does laps around x86; I suppose that's when you could say throwing away 30 years of x86 backward compatibility is worth it, right? But isn't POWER10 worth it already today? ... Some things to think about :)
Having used Linux desktops on DEC Alpha, MIPS (both SGI and DEC), HP PA-RISC, ARM, PowerPC, and a much older IBM POWER system, and not lacking for anything on them... I have been using mostly AMD and Intel systems, but I'm in no way tied to them. I'd consider a POWER10 for sure if the performance is there, the power use is there (if it's a notebook rather than a desktop), and it can be bought by a mere mortal at a normal price.

The other CPU that's not a factor now, but that I'm curious about over the next couple of years, is RISC-V. It's mainly displacing the low end right now (stuff like wireless access points, controllers built into wireless cards, memory controllers, and a slew of other embedded uses that were using MIPS or low-end ARM). But it's a scalable spec, so vendors could make higher-performance RISC-V chips if they wished to. Not that the royalties for ARM are that bad, but they could go RISC-V royalty-free if they wished to.
 
It sounds good, but is it worth throwing away 30 years of x86 software for? Who is going to rewrite everything for ARM?
Well, as I've found, in Linux the vast majority of software is portable and already ported to ARM among other CPUs. Nobody is discussing throwing away 30 years of x86 software: both Linux and Windows on ARM have seamless, painless CPU emulation. I think it'll be a very long time before Microsoft could remove the CPU emulator (like Apple did -- emulating PowerPC on Intel for a while, then removing it eventually). But I think it would see lower use over time, so you'd get most of the benefits of the power-sipping ARM while still being able to run those Intel-compiled apps indefinitely.
 