Intel has begun its product discontinuance program for its "Poulson" Itanium 9500-series processors. Intel’s customers will have to make their final orders for these CPUs this fall and the last Poulson chips will be shipped in early 2021.

Intel’s Itanium 9500-series lineup consists of four CPUs: the quad-core Itanium 9520 and 9550, and the eight-core Itanium 9540 and 9560. All of these processors were released in Q4 2012 and were supplanted by the newer "Kittson" 9700-series CPUs last year. Intel has now scheduled the entire Poulson family for discontinuation in Q1 2021, a little more than eight years after its release.

Intel’s customers are advised to place their orders for Itanium 9500-series processors by September 17, 2018. Orders will become non-cancelable on September 28, 2018, and the final Poulson chips will ship by March 5, 2021. Keep in mind that HP Enterprise (the only company that still uses Itanium) will cease selling servers based on the Itanium 9500-series on March 31, 2018, so demand for Poulson products is not expected to be high in the coming years.

Intel’s Poulson processor (pictured above on the right, image by Silicon.fr) represented one of the most significant microarchitectural and performance advancements in the history of the Itanium line: the CPU doubled the issue width to 12 instructions per cycle, added 4-way Hyper-Threading, ran at higher frequencies, and scaled up to eight cores. By contrast, Intel’s latest Itanium 9700-series processors run only slightly faster than the highest-end 9500-series chips.

The retirement of the Poulson family will leave Intel's 9700-series processors as the only Itanium parts on the market – and indeed the last Itanium processors altogether, as Intel has ceased further Itanium development. Meanwhile, only a single vendor – long-time Itanium partner HP Enterprise – is still selling Itanium-based servers. Even so, expect Itanium machines to be around for years to come; HPE’s Integrity machines are used primarily for mission- and business-critical applications, where customers are invested in the platform for the very long term.

Image Source: Silicon.fr

Source: Intel


  • jordanclock - Thursday, March 8, 2018 - link

    His name was Intel Poulson.
  • Amandtec - Friday, March 9, 2018 - link

    Ha. You beat me to it.
  • Elstar - Thursday, March 8, 2018 - link

    The 90s were quite something. Superscalar and out-of-order CPUs were starting to go mainstream, and people could reasonably disagree about whether VLIW (including IA64) was where the future was headed. I even remember the trade magazines rumoring about Apple experimenting with VLIW "TriMedia" coprocessors to counter Intel.

    The failure of IA64 is one for the history books. They really mis-predicted where the future was headed in so many ways, both technically and competitively.
  • Elstar - Friday, March 9, 2018 - link

    Oh, and let's not forget that Intel's fundamental miscalculation wasn't even IA64 specific. In that era, they bet/hoped that programmers and/or compilers could scale up to very wide pipelines (IA64) or very deep pipelines (the P4). Both approaches demoed well but were terribly impractical for most real code.

    Intel had to restart their CPU design using the older P3 to create the Pentium M and ultimately what we see/know today (plus more SIMD and multicore).
  • name99 - Friday, March 9, 2018 - link

    Uhh, not quite.
    TriMedia was a PHILIPS-designed VLIW.
    https://en.wikipedia.org/wiki/TriMedia_(mediaproce...

    Apple looked at it in a very vague way (in that you'd expect any large company to look at every new type of hardware) but never with the goal of it being a CPU, rather with it possibly being a media acceleration card (essentially a fancier replacement for the [VERY rarely used] Apple MPEG card
    https://manuals.info.apple.com/MANUALS/1000/MA1446... )

    Apple likewise looked at Cell (more generally this time, with media in mind but also as the base CPU in a machine). And in both cases it was concluded that the additional hassle you had to go through for these non-standard designs was not worth the payoff.

    (Nowadays, of course, the calculations are very different because an additional design is simply a few sq mm on a SoC, and everyone gets it; we're not talking a $1000 card that 10,000 people buy. So everyone has GPUs, everyone with a QC phone has a VLIW DSP, and soon enough everyone will have some flavor of NPU.

    But the Apple calculation for the past 10 years has been that the space where VLIW can do well --- very structured algorithms and data --- is, in the real world, better handled by dedicated silicon. So rather than a VLIW there is a baseband DSP, an H.264/5 ASIC, an ISP.

    Maybe that will change, but so far the timing never really worked out for VLIW. IF something like TriMedia could have been shipped with every Mac at a reasonable price, it might have made a great foundation for QuickTime in the 90s, back when QT was all about dozens of different codecs.)

    In the case of IA64 there was so much fail in so many ways that it's hard to choose just one issue.
    Clearly the starting point was kinda broken --- general purpose CPUs need to handle irregular data, which means the pain of a cache system and out-of-order load/stores and then instructions, which means you might as well go full OoO, and then adding superscalar is not hard. So the VLIW buys you fsckall.
    But on top of that Intel insisted on adding every piece of craziness they could invent, to ensure that no-one else could copy it. They were so obsessed with this particular goal that they forgot to ask whether anyone would WANT to copy it...
  • mode_13h - Saturday, March 10, 2018 - link

    Imagine GPUs didn't happen. Then, people would be praising Intel's foresight for going "VLIW" (hint: it's not really VLIW). Assuming, of course, they continued to refine it to the same extent as they have x86-64.

    VLIW *did* win the day - in DSPs and GPUs (for a while, at least). Just not the desktop.
  • mode_13h - Saturday, March 10, 2018 - link

    BTW, my point was they weren't wrong in predicting what would be the most demanding compute loads - they were just wrong in predicting how those would be handled. As a CPU company, they saw AI and graphics as challenges to be solved by the CPU, and picked an architecture with notable strength in those areas.
  • name99 - Saturday, March 10, 2018 - link

    I think you are being too kind. The nature and future of GPUs was already visible at the time Itanium was released, and the target market (servers, NOT desktop --- and Intel had no obvious plans to change that, and did nothing to prepare the desktop market for Itanium) was not where media or graphics mattered. (AI, forget it --- irrelevant in this time frame.)

    Itanium was supposed to be Intel's SERVER architecture, to compete against Alpha, SPARC, POWER, MIPS, etc. And it was ATROCIOUSLY designed for that particular job.
  • mode_13h - Sunday, March 11, 2018 - link

    The architecture of IA64 was probably locked-in before Nvidia even shipped the NV1. The term GPU wouldn't be coined for another 4-5 years, at least. Graphics chips in the mid-90's were very simple and hard-wired affairs.

    Where Intel was successful with chips like the i860 probably informed some of their ideas about where IA64 might find heavy compute loads. Remember, SGI was booming back then, and there was a substantial pack of also-rans, trying to make a go of the graphics minicomputer/workstation market.

    And Intel was building neural network chips as far back as 1993. Analog, but still...

    There's no doubt Intel had ambitions for IA64 to rule the desktop, as well. This is why they included an x86 emulator and pushed MS to port Windows to it.
  • name99 - Saturday, March 10, 2018 - link

    But Intel wasn't selling Itanium as a GPU or DSP, they were selling it as a CPU!
    No-one denied (back then or today) that VLIW has advantages in areas where the code and data structures are very regular -- I said that above.
    The problem is that CPUs are designed to solve problems where code and data are irregular.

    This is not an argument about whether diesel or gasoline engines would win the car market; it's an argument about whether motorbikes or trucks would win the container hauling market. One of those arguments could go either way; one of them has an obviously STUPID answer.
