Intel discontinues IA-64 Itanium processors

It was a looooooooong death. It was a nice 64-bit architecture, but Intel stopped evolving it once x86-64 rose to prominence and proved easier for companies to integrate.
Itanium sucked performance-wise for the money invested in it. Early chips were atrocious. I don't know why they don't just take x86_64, strip out all the 32-bit stuff and backwards-compatible things like 8087 emulation, MMX, etc. Dropping backwards compatibility would free up loads of transistor space and allow better instruction-mapping decisions to be made. Instruction fetching is still one of the main bottlenecks of x86 CPUs due to the horrendously fragmented instruction map.
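The fetch-bottleneck point can be illustrated with a toy sketch (hypothetical example, not from the thread). x86 instructions are variable-length, from 1 to 15 bytes, so a decoder cannot know where instruction N+1 starts until it has at least partially decoded instruction N; the encodings below are real x86-64 examples, but the "decoder" is deliberately simplified:

```python
# (mnemonic, encoded bytes) -- well-known x86-64 encodings of varying length
instructions = [
    ("nop",            bytes([0x90])),                                   # 1 byte
    ("ret",            bytes([0xC3])),                                   # 1 byte
    ("mov eax, 1",     bytes([0xB8, 0x01, 0x00, 0x00, 0x00])),          # 5 bytes
    ("mov rax, imm64", bytes([0x48, 0xB8]) + (1).to_bytes(8, "little")), # 10 bytes
]

stream = b"".join(enc for _, enc in instructions)

# A sequential "decoder": each instruction's start offset depends on the
# length of the previous one, so boundaries cannot be found independently --
# this serial dependency is what complicates wide parallel decode on x86.
offsets, pos = [], 0
for name, enc in instructions:
    offsets.append((name, pos, len(enc)))
    pos += len(enc)

for name, start, length in offsets:
    print(f"{name:16s} starts at byte {start:2d}, length {length}")
```

A fixed-width ISA (e.g. most RISC designs) makes every start offset a trivial multiple of the instruction size, which is one reason the fragmented x86 encoding map keeps coming up in these discussions.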
Richard Nutman:

Itanium sucked performance-wise for the money invested in it. Early chips were atrocious. I don't know why they don't just take x86_64, strip out all the 32-bit stuff and backwards-compatible things like 8087 emulation, MMX, etc. Dropping backwards compatibility would free up loads of transistor space and allow better instruction-mapping decisions to be made. Instruction fetching is still one of the main bottlenecks of x86 CPUs due to the horrendously fragmented instruction map.
No real point. The main reason IA64 failed was not just that it was pricey; it lacked the main selling point of x86_64 - backward compatibility. Any architecture that aims to replace x86_64 in the mainstream desktop and server markets would have to battle a whole lot of users who want to keep using their existing software.
blacknova:

No real point. The main reason IA64 failed was not just that it was pricey; it lacked the main selling point of x86_64 - backward compatibility. Any architecture that aims to replace x86_64 in the mainstream desktop and server markets would have to battle a whole lot of users who want to keep using their existing software.
Backwards compatibility doesn't really matter for HPC and many server usage scenarios. They usually recompile their software for their specific hardware, since every 0.0001% matters. But general support and adoption are still important.
flashmozzg:

Backwards compatibility doesn't really matter for HPC and many server usage scenarios. They usually recompile their software for their specific hardware, since every 0.0001% matters. But general support and adoption are still important.
Today, sure, but when Itanium was starting out it was much more of a problem. By the time building software from source had become the norm, Itanium had already failed to gain momentum and move into the mass market.
Richard Nutman:

Itanium sucked performance-wise for the money invested in it. Early chips were atrocious. I don't know why they don't just take x86_64, strip out all the 32-bit stuff and backwards-compatible things like 8087 emulation, MMX, etc. Dropping backwards compatibility would free up loads of transistor space and allow better instruction-mapping decisions to be made. Instruction fetching is still one of the main bottlenecks of x86 CPUs due to the horrendously fragmented instruction map.
To sort of paraphrase what blacknova said, the problem is IA64 was binary incompatible. In other words, IA64 is not "x86_64 minus 32-bit compatibility".
flashmozzg:

Backwards-compat doesn't really matter for HPC/many server usage scenarios. They usually compile their software every time for their specific HW, since every 0.0001% matters. But the general support and spread is still important.
If it didn't matter as much as you suggest, the architecture would've died years ago. HP is a good customer of Intel's, but not so good that they get their own "exclusive" architecture. Meanwhile, architectures like PPC and ARM sometimes aren't so appealing in server markets partly because of binary compatibility, even though (depending on the application) they can have some pretty major advantages over x86.
blacknova:

Today, sure, but when Itanium was starting out it was much more of a problem. By the time building software from source had become the norm, Itanium had already failed to gain momentum and move into the mass market.
To my knowledge, Itanium was never intended to target the mass market to begin with....
Shabby means not good. So not so shabby would be not bad. It's kind of a shame seeing Itanium go, for like the 7th time, because having an alternative out there being actively developed expands possibilities. Everything being forever deadlocked on being x86 derivatives isn't good.
Neo Cyrus:

Shabby means not good. So not so shabby would be not bad. It's kind of a shame seeing Itanium go, for like the 7th time, because having an alternative out there being actively developed expands possibilities. Everything being forever deadlocked on being x86 derivatives isn't good.
Itanium was never an alternative. Itanium, from its inception, was not intended to be a consumer-level product. It was intended to be an HPC/server product. Even in that market it struggled to be relevant....
sykozis:

Itanium was never an alternative. Itanium, from its inception, was not intended to be a consumer-level product. It was intended to be an HPC/server product. Even in that market it struggled to be relevant....
I'm aware, but if it had been developed to the point where it was more viable, who knows what would have come from that. The truly best processors we could make, even relative to the era, will never be x86 derivatives. But it seems at this rate we can't even be sure we'll live to see the day when something designed from scratch to be superior takes over.
Actually, in the early 90s HP did envision EPIC eventually becoming a top-down architecture, but their partner (Intel) soon realized that the HPC market was the most it would ever see. Itanium was the typical overly optimistic vision of its time, when skeptics were already declaring the dominant x86 obsolete and unscalable, ready for the grave. Back in the 90s, the future belonged to PowerPC, Alpha and Itanium -- all of them now buried under billions of x86-compatible CPUs in every possible market segment, with ARM piled on top of the lid. That doesn't mean x86's cluttered ISA won't have to be dusted off at some point. There are already steps in compiler development to phase out the legacy x87 stack (together with MMX), since the x86-64 spec already deprecates it, and with AVX Intel finally began to think about smoother integration of future ISA extensions and register formats, by using prefixes.
Neo Cyrus:

I'm aware, but if it had been developed to the point where it was more viable, who knows what would have come from that.
Eh... IA64 was released in 2001. Intel had 3 years to make it a compelling alternative to x86-64, and they had well over a decade to attempt to push it into markets beyond servers.
The truly best processors we could make, even relative to the era, will never be x86 derivatives.
But it seems at this rate we can't even be sure we'll live to see the day when something designed from scratch to be superior takes over.

That's an interesting point and one I've considered myself. You're most likely right that there is a superior binary CISC architecture vs x86 (and who knows, maybe that could've been IA64). This implies a few things:
* RISC architectures are still being developed and are showing some promising results. Furthermore, GPUs are compensating for a lot of computations that CPUs would be too power-hungry, slow, or expensive to perform.
* Binary computers as we know them are nearing EOL, at least in a way that actually matters (I'm sure we'll continue to see them in home PCs for decades). There's only so much more we can do to improve existing transistors. A lot of companies are investing in things like quantum computers or alternatives to silicon-based transistors.
* With machine learning, it's possible we'll see a new architecture (maybe even x86-compatible) that can far surpass the performance of human design.
In short: there is massive potential in the future. We've been clinging to old architectures with old technologies for a really long time, and as far as I'm concerned, the only reason we haven't moved on to something better sooner is that too much closed-source software exists. As long as people aren't able to run their existing software on another platform, they won't switch, and therefore architectures like x86 dominate the CPU market, even though it's probably horribly inefficient for modern-day needs.
Neo Cyrus:

I'm aware, but if it had been developed to the point where it was more viable, who knows what would have come from that. The truly best processors we could make, even relative to the era, will never be x86 derivatives. But it seems at this rate we can't even be sure we'll live to see the day when something designed from scratch to be superior takes over.
Without support from devs on a massive scale, there is no replacing x86..... x86 survives out of necessity for backwards compatibility.