Intel's Itanium is finally dead

Itanium has no out-of-order execution, no SMT, and no hardware speculation. That makes it immune to Spectre and Meltdown. The main reason Itanium failed is that unless the compiler makes "perfect" code, it's quite slow. Something x86-64 CPUs that "squeeze ever smaller performance gains" don't really need...
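
The "perfect code" problem is inherent to the EPIC/VLIW approach: the compiler, not the hardware, has to find instructions that can run side by side and pack them into fixed-width bundles. A toy sketch of the general idea (a hypothetical simplification, not real IA-64 semantics):

```python
# Toy list scheduler: pack operations into fixed-width "bundles"
# (the general EPIC/VLIW idea; hypothetical, not real IA-64 semantics).

BUNDLE_WIDTH = 3  # IA-64 bundles hold 3 instruction slots

def schedule(ops):
    """ops: list of (name, set_of_dependency_names). Greedily pack
    ready ops into bundles; unfilled slots are wasted issue width."""
    done, remaining, bundles = set(), list(ops), []
    while remaining:
        ready = [op for op in remaining if op[1] <= done][:BUNDLE_WIDTH]
        if not ready:
            raise ValueError("dependency cycle")
        bundles.append([name for name, _ in ready])
        done |= {name for name, _ in ready}
        remaining = [op for op in remaining if op[0] not in done]
    return bundles

# Independent work packs densely: 6 ops -> 2 full bundles.
parallel = [(f"op{i}", set()) for i in range(6)]
# A dependent chain packs one op per bundle: 6 ops -> 6 bundles.
chain = [(f"op{i}", {f"op{i-1}"} if i else set()) for i in range(6)]

print(len(schedule(parallel)))  # 2
print(len(schedule(chain)))     # 6
```

Independent work fills the bundles; a dependent chain leaves most slots empty, which is exactly the kind of stall an out-of-order x86 core hides at runtime.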

I'd like to have more information about "Intel's x86-64 license". Intel reverse-engineered x86-64 without a license, and now there is a cross-license agreement between AMD and Intel.
Well, I'm not sure about the specifics, but I know that under the cross-licensing agreement that was already in place, AMD had no choice but to allow Intel to license AMD64, and I agree with that. After all, the only reason AMD was even able to make 32-bit CPUs was that Intel was forced to allow it. I think Intel was allowed to reverse-engineer AMD64 chips without a licence but wasn't able to actually sell any before the licence was granted. It's pretty messy, but that's how I understand it went down. I'll be the first to admit that my understanding may not be complete or 100% accurate, so I'd recommend checking into it if you want to know for certain.
 
Well, I'm not sure about the specifics, but I know that under the cross-licensing agreement that was already in place, AMD had no choice but to allow Intel to license AMD64, and I agree with that. After all, the only reason AMD was even able to make 32-bit CPUs was that Intel was forced to allow it. I think Intel was allowed to reverse-engineer AMD64 chips without a licence but wasn't able to actually sell any before the licence was granted. It's pretty messy, but that's how I understand it went down. I'll be the first to admit that my understanding may not be complete or 100% accurate, so I'd recommend checking into it if you want to know for certain.
The cross-license deal came in 2009, after a long-running legal battle. Intel released its first x86-64 CPUs in 2004.

AMD was able to make x86 CPUs because IBM wanted a second supplier of CPUs, so AMD was handed an x86 license. AMD then reverse-engineered multiple Intel CPUs. Intel tried to block AMD from making the 386 "because of 32 bits", but AMD reverse-engineered that too and the courts decided it was OK. Intel did exactly the same with x86-64. As with the 386, no license was needed.

The fact that Intel 64 lacked two instructions AMD CPUs had, instructions that were removed from certain documents (and later added back), proves Intel reverse-engineered x86-64. Otherwise AMD would have handed Intel information on all the instructions. That's also the reason Intel 64 was not an exact replica of x86-64.

Halfhill said that AMD initially left out a pair of instructions from its early AMD64 documentation, then decided later to add them back in. The two instructions are somewhat innocuous; the LAHF and SAHF instructions load and store the status flags via the AH register. However, all of the other instructions listed in AMD’s published documents were later included in Intel’s chips. Halfhill said Intel engineers were unaware of the discrepancy until he contacted them.
 
It's not really the same thing though is it? One is hardware and the other is software. Software restrictions on hardware that otherwise would be fine is a slap in the face.
Look, you won't take Windows 11 because it wants a newer CPU that supports TPM. You're also in here complaining that CPUs aren't new enough, that they should be completely different architectures, and that all legacy support should be removed.

But you wouldn't buy a new CPU if Windows 11 only supported RISC-V for example.

The lack of consistency is what hurts me:
"I don't want that, it won't run on my legacy hardware"
"Everything should move to running on a brand new architecture without legacy support"

...
 
As I remember it, DEC (I worked for them at the time; great place to work) sort of started the move to 64-bit. The DEC Alpha, Tru64 UNIX, and eventually VMS with 64-bit hardware and OS pushed HP to go 64-bit with Itanium. Intel used some of the same design concepts as Alpha. There were actually 64-bit Microsoft NT Alpha machines. Eventually the PC space went 64-bit.
FYI: there was a company that had a 128-bit machine for technical computing. I can't remember the name.

DEC had a translator that converted x86 code to Alpha code. It was slow the first time, since it needed to do the translation, but after that, it would fly.

This was for the NT version, not sure if that was needed for the other OS.

More info here: https://en.wikipedia.org/wiki/FX!32
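
For the curious, FX!32's translate-once-then-cache trick can be sketched in a few lines. This is a hypothetical toy that uses Python's `compile`/`eval` as a stand-in for x86-to-Alpha translation, not FX!32's actual design:

```python
# Toy "translate once, run fast afterwards" cache: the general idea
# behind FX!32-style binary translation (hypothetical sketch, not
# FX!32's actual design).

translation_cache = {}  # source block -> translated (here: compiled) code

def run_translated(source_block):
    """First run pays the translation cost; later runs hit the cache."""
    if source_block not in translation_cache:
        # Stand-in for x86 -> Alpha translation: compile the source once.
        translation_cache[source_block] = compile(source_block, "<block>", "eval")
    return eval(translation_cache[source_block])

print(run_translated("2 + 3"))  # translated on first call, prints 5
print(run_translated("2 + 3"))  # cache hit: no retranslation, prints 5
```

The first call pays the translation cost; every later call reuses the cached result, which is why translated code "would fly" after the first run.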

HP had their own CPU, PA-RISC, and they killed it because they believed Intel's lies about Itanium.

https://en.wikipedia.org/wiki/PA-RISC

What really killed Alpha was DEC being bought by Compaq, which sold the Alpha IP to Intel before itself being bought by HP.
 
Gosh I'm amazed it took this long to finally kill it.
IMO, it's surprising it became a product at all. It was surely only ever a niche market.
Look at current PC hardware and OSes. There is built-in compatibility stretching back 50 years. Who uses anything from the XT era? Do you still have a floppy drive on W10? Perhaps some parallel printer support on W11 too, or a ZIP/MO drive?
Well, yes, if you use USB interfaces to those devices. Just like serial ports.
At least in Linux you can add some obscure binaries if you really need them (assuming one exists in the first place), keeping the kernel relatively lean. But Windows is just a mess of ancient compatibility layers which should've been disposed of years ago.
But Windows is just a mess. :laughing: I would have stopped right there in that statement. :dizzy:
Ask yourself one simple question: have you ever seen Captain Picard using a PC with Windows 11 compatibility mode on the USS Enterprise?
Careful. You will get people complaining that ST: Discovery has more advanced tech than Kirk's Enterprise. ;)
 
Look, you won't take Windows 11 because it wants a newer CPU that supports TPM. You're also in here complaining that CPUs aren't new enough, that they should be completely different architectures, and that all legacy support should be removed.

But you wouldn't buy a new CPU if Windows 11 only supported RISC-V for example.
..
IMO, like it or not, x86-64 was genius purely because it supported the existing x86 software.

IMO, dumping legacy support from a new proc would be a tough pill to swallow for the average computer user. Just think how much it would cost to buy all the software on a PC over again. Most computer users would be totally irate. Not to mention, I bet some regulatory agencies would become involved.
 
IMO, like it or not, x86-64 was genius purely because it supported the existing x86 software.

IMO, dumping legacy support from a new proc would be a tough pill to swallow for the average computer user. Just think how much it would cost to buy all the software on a PC over again. Most computer users would be totally irate. Not to mention, I bet some regulatory agencies would become involved.
Hey, you don't have to tell me! I'm not supporting removing legacy support or moving to a different architecture.

I was simply talking to those who complain so hard in every Windows 11 comment section that they "won't ever move to Windows 11 because it doesn't support my hardware!", yet are here in another comment section complaining that CPUs have legacy support and that x86 should be replaced.

Microsoft added a security layer that modern chips support, and in some of these commenters' eyes the world nearly ended. Imagine moving away from x86. It's going to be a bloodbath.
 
Hey, you don't have to tell me! I'm not supporting removing legacy support or moving to a different architecture.

I was simply talking to those who complain so hard in every Windows 11 comment section that they "won't ever move to Windows 11 because it doesn't support my hardware!", yet are here in another comment section complaining that CPUs have legacy support and that x86 should be replaced.

Microsoft added a security layer that modern chips support, and in some of these commenters' eyes the world nearly ended. Imagine moving away from x86. It's going to be a bloodbath.
Well, my comment was not directed towards you. I thought I was adding to the conversation.

Definitely, I am one of those who does not like the new W11 hardware requirements - mainly because most of my PCs do not have TPM; however, IF it gets me UHD Blu-ray playback, I'll be happy to build a new HTPC with TPM hardware.
 
I’m an OpenVMS engineer; we’ve just purchased a couple of 2P Poulson servers; HPE couldn’t find any Kittson processors at all (they appear to be that rare).

My shiny new servers have 2012 processors 😂

Fortunately for me, VSI now develops VMS, v9.2 is x86_64 and virtualised - no more Itanium dependency. I’ve no particular affinity for the arch itself but I am rather fond of OpenVMS, so after DEC Alpha/VAX CPUs bit the dust I was quite thankful for Itanium. I won’t miss it though, IA64 compatibility is an absolute nightmare in a corporate environment.
 
DEC had a translator that converted x86 code to Alpha code. It was slow the first time, since it needed to do the translation, but after that, it would fly.

This was for the NT version, not sure if that was needed for the other OS.

More info here: https://en.wikipedia.org/wiki/FX!32

HP had their own CPU, PA-RISC, and they killed it because they believed Intel's lies about Itanium.

https://en.wikipedia.org/wiki/PA-RISC

What really killed Alpha was DEC being bought by Compaq, which sold the Alpha IP to Intel before itself being bought by HP.
Owned an Alpha in the day and they were truly amazing for the time, it's a real shame the platform never got to mature.
 
Look, you won't take Windows 11 because it wants a newer CPU that supports TPM. You're also in here complaining that CPUs aren't new enough, that they should be completely different architectures, and that all legacy support should be removed.
I won't take Windows 11 because I'm sick of MS. They are making it way too easy for me to make that decision. Especially when they block my hardware from upgrading.

I certainly don't have anything against MS requiring TPM 2.0. I do however think they are phasing out hardware way too soon. They haven't even bothered phasing out 32 bit hardware yet. And here they are blocking hardware that is only 4 years old and fully capable of 64 bit.

I get why you would think that's why I'm here. But it's not. The only thing I spoke out against here was your apples-to-oranges comparison. Intel tried to phase out x86; that didn't work out, so here we are, still on x86-64. MS doesn't seem to be that smart. They are currently biting the hand that feeds them. They think they can program hardware obsolescence after only four years and no one will care. But like I said, that is not why I'm sick of MS.
 
Couple of items:

Qualcomm has proven to be as bad as Intel towards the industry, so no thanks.
As good as the M1 is, Apple needs to show something that competes with Threadripper and Epyc while also allowing a system with memory and GPU slots.
Personally, I want ARM to stay independent and to see Samsung and the other licensees move the performance bar up.

Well, it's Qualcomm or nothing, unless you can convince Samsung to have a go. Or maybe get Apple to sell their chips to third parties and stop being so insular.
 
Which is ironically part of the reason why those CPUs aren't good at certain workloads. x86 at its core is a pretty horrid CPU architecture; it needed to be replaced, not extended.
So horrid that it has outlived every single competitor it has come across, with only ARM posing any sort of "challenge", although it's been over 10 years since the promise of "ARM on the desktop" and the most we've seen is a couple of Windows tablets and the Apple M1, which was so good Apple chose to continue using Intel Xeons in their Mac Pros. Whoops.
 
The cross-license deal came in 2009, after a long-running legal battle. Intel released its first x86-64 CPUs in 2004.

AMD was able to make x86 CPUs because IBM wanted a second supplier of CPUs, so AMD was handed an x86 license. AMD then reverse-engineered multiple Intel CPUs. Intel tried to block AMD from making the 386 "because of 32 bits", but AMD reverse-engineered that too and the courts decided it was OK. Intel did exactly the same with x86-64. As with the 386, no license was needed.

The fact that Intel 64 lacked two instructions AMD CPUs had, instructions that were removed from certain documents (and later added back), proves Intel reverse-engineered x86-64. Otherwise AMD would have handed Intel information on all the instructions. That's also the reason Intel 64 was not an exact replica of x86-64.

I know why AMD had the licence to begin with, that's common knowledge. IBM was the merciless God of computers back in the 70s and what IBM wanted, IBM got. This other information is stuff I wasn't aware of so thanks for sharing it. This is pretty amazing stuff. I can't say that I'm all that surprised because I don't think that there's a limit to just how low Intel will stoop. :laughing:
 
IMO, like it or not, x86-64 was genius purely because it supported the existing x86 software.

IMO, dumping legacy support from a new proc would be a tough pill to swallow for the average computer user. Just think how much it would cost to buy all the software on a PC over again. Most computer users would be totally irate. Not to mention, I bet some regulatory agencies would become involved.
Well yeah, one of the pillars of the x86 world has always been backwards compatibility. Everyone recognised this, but Intel figured that its name alone was enough to make the entire x86 world abandon it. AMD knew that was an absurd proposition and proved Intel wrong.
 
Well yeah, one of the pillars of the x86 world has always been backwards compatibility. Everyone recognised this, but Intel figured that its name alone was enough to make the entire x86 world abandon it. AMD knew that was an absurd proposition and proved Intel wrong.
Yep. Same ol' Intel hubris (or so it seems). Having worked at a company that suffered the same malady and eventually evaporated from an employer with over 60,000 employees to one with something like 1,000 now, IMO Intel would be wise to listen to their customers' wishes WRT hardware instead of trying to shove their hardware down their customers' throats.
 
Itanium has no out-of-order execution, no SMT, and no hardware speculation. That makes it immune to Spectre and Meltdown. The main reason Itanium failed is that unless the compiler makes "perfect" code, it's quite slow. Something x86-64 CPUs that "squeeze ever smaller performance gains" don't really need...
As someone who (still) supports Itanium, later generation compilers did a MUCH better job at producing compiled code. Yes, early compilers were absolute jokes; that's normal for RISC architectures. They got better, but by the time they did "the compilers suck" and "performance is terrible" were already being repeated en masse.

It's also worth noting Itanium really wasn't designed for serial workloads in the way x86 was; it was designed first and foremost to be a massively parallel CPU architecture that could scale across multiple CPU cores without issue. You know, the type of CPU architecture everyone suddenly wants to create.
 
So horrid that it has outlived every single competitor it has come across, with only ARM posing any sort of "challenge", although it's been over 10 years since the promise of "ARM on the desktop" and the most we've seen is a couple of Windows tablets and the Apple M1, which was so good Apple chose to continue using Intel Xeons in their Mac Pros. Whoops.
x86 at its core was meant to be a transitional architecture, but once it got selected by IBM to be the base of the PC market, combined with the rise of PC compatibles (thanks, Bill Gates!), we ended up stuck with it. Hell, even Intel tried to kill the architecture off on several occasions (StrongARM, Itanium, etc.).

Such a shame IBM didn't wait a few months for the 68k to be ready for mass production.
 
As someone who (still) supports Itanium, later generation compilers did a MUCH better job at producing compiled code. Yes, early compilers were absolute jokes; that's normal for RISC architectures. They got better, but by the time they did "the compilers suck" and "performance is terrible" were already being repeated en masse.
Of course everything gets better as time goes on. Still, Itanium came four years late and even then was not able to meet its targets.

Even with better compilers, x86-64 CPUs from AMD or Intel were simply better.
It's also worth noting Itanium really wasn't designed for serial workloads in the way x86 was; it was designed first and foremost to be a massively parallel CPU architecture that could scale across multiple CPU cores without issue. You know, the type of CPU architecture everyone suddenly wants to create.
Who wants to create that? Not AMD, not Intel, not ARM, not Apple. Parallel computing seems to be concentrating on the GPU side, ASICs, or similar products.
x86 at its core was meant to be a transitional architecture, but once it got selected by IBM to be the base of the PC market, combined with the rise of PC compatibles (thanks, Bill Gates!), we ended up stuck with it. Hell, even Intel tried to kill the architecture off on several occasions (StrongARM, Itanium, etc.).

Such a shame IBM didn't wait a few months for the 68k to be ready for mass production.
Well, at least x86-compatible CPUs are still produced today. Motorola killed the 68K line with "something that was supposed to be better" (just like Itanium). Still, IBM choosing the 68K would have changed many things. Where would Intel or AMD be now? That would be a good topic for someone who wants to write a book 👍
 
I still love the irony that it was AMD that created the extensions to x86 that now power 64-bit computing in today’s world, even though AMD has to license x86 from Intel

More irony: DEC killed Alpha because Intel announced Itanium. This allowed AMD to hire the CPU workforce from DEC and create the Athlon. Why DEC cancelled Alpha before Itanium even came out is beyond me.
 
More irony: DEC killed Alpha because Intel announced Itanium. This allowed AMD to hire the CPU workforce from DEC and create the Athlon. Why DEC cancelled Alpha before Itanium even came out is beyond me.
DEC's journey ended before Itanium launched. As usual, there are multiple theories about the exact reasons, but ultimately it all comes down to one point: DEC ran out of money. With deep pockets, DEC could have continued developing Alpha even if it wasn't profitable (sound familiar? Itanium...)
 
Well, at least x86-compatible CPUs are still produced today. Motorola killed the 68K line with "something that was supposed to be better" (just like Itanium). Still, IBM choosing the 68K would have changed many things. Where would Intel or AMD be now? That would be a good topic for someone who wants to write a book 👍
PPC was fine as an architecture; Motorola was hindered by IBM's fabs not being able to keep up clocks against Intel, and was *way* behind when multiple cores became a thing. That being said, PPC is still widely used in embedded systems due to excellent power/thermal profiles for the performance given, and licensed production of PPC chips continues for that purpose. You're only now starting to see ARM encroach in some of the markets long dominated by PPC.
 
PPC was fine as an architecture; Motorola was hindered by IBM's fabs not being able to keep up clocks against Intel, and was *way* behind when multiple cores became a thing. That being said, PPC is still widely used in embedded systems due to excellent power/thermal profiles for the performance given, and licensed production of PPC chips continues for that purpose. You're only now starting to see ARM encroach in some of the markets long dominated by PPC.
I meant the Motorola 88000, which was supposed to be the successor to the 68k series. Pretty much like Itanium, it was supposed to be much better while ditching backwards compatibility. And just like the Itanic, it failed.
 