We just got off the phone with Nick Knupffer of Intel, who confirmed something that has long been speculated upon: the fate of Larrabee. As of today, the first Larrabee chip’s retail release has been canceled. This means that Intel will not be releasing a Larrabee video card or a Larrabee HPC/GPGPU compute part.

The Larrabee project itself has not been canceled however, and Intel is still hard at work developing their first entirely in-house discrete GPU. The first Larrabee chip (which for lack of an official name, we’re going to be calling Larrabee Prime) will be used for the R&D of future Larrabee chips in the form of development kits for internal and external use.

The big question of course is “why?” Officially, the reason why Larrabee Prime was scrubbed was that both the hardware and the software were behind schedule. Intel has left the finer details up to speculation in true Intel fashion, but it has been widely rumored in the last few months that Larrabee Prime has not been performing as well as Intel had been expecting it to, which is consistent with the chip being behind schedule.

Bear in mind that Larrabee Prime’s launch was originally scheduled to be in the 2009-2010 timeframe, so Intel has already missed the first year of their launch window. Even with TSMC’s 40nm problems, Intel would have been launching after NVIDIA’s Fermi and AMD’s Cypress, if not after Cypress’ 2010 successor too. If the chip was underperforming, then the time element would only make things worse for Intel, as they would be setting up Larrabee Prime against successively more powerful products from NVIDIA and AMD.

The software side leaves us a bit more curious, as Intel normally has a strong track record here. Their x86 compiler technology is second to none, and as Larrabee Prime is x86-based, this should have left them in a good starting position for software development. What we’re left wondering is whether the software setback was for overall HPC/GPGPU use, or for graphics. Certainly the harder part of Larrabee Prime’s software development would have been writing graphics drivers from scratch capable of harnessing the chip as a video card, taking into consideration the need to support older APIs such as DX9 that make implicit assumptions about the layout of the hardware. Could it be that Intel couldn’t get Larrabee Prime working as a video card? That’s a question that will hang over Intel’s head right up to the day they finally launch a Larrabee video card.

Ultimately when we took our first look at Larrabee Prime’s architecture, there were 3 things that we believed could go wrong: manufacturing/yield problems, performance problems, and driver problems. Based on what Intel has said, we can’t write off any of those scenarios. Larrabee Prime is certainly suffering from something that can be classified as driver problems, and it may very well be suffering from both manufacturing and performance problems too.

To Intel’s credit, even if Larrabee Prime will never see the light of day as a retail product, it has been turning in some impressive numbers at trade shows. At SC09 last month, Intel demonstrated Larrabee Prime running the SGEMM HPC benchmark at 1 TeraFLOP, a notable accomplishment as the sustained performance of any GPU is usually a fraction of its theoretical performance. 1TF is close to the theoretical performance of NVIDIA’s GT200 and AMD’s RV770 chips, so Larrabee Prime was no slouch. But then again its competition would not be GT200 and RV770; it would be Fermi and Cypress.
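For readers wondering where an SGEMM TeraFLOP figure comes from: the convention is to count 2·M·N·K floating point operations for an M×K by K×N single-precision matrix multiply (one multiply and one add per term), then divide by wall-clock time. A minimal sketch of that arithmetic, using NumPy as a stand-in for a real benchmark harness (the matrix size here is illustrative, not what Intel ran):

```python
import time
import numpy as np

# SGEMM: single-precision C = A @ B. The conventional FLOP count for an
# M x K by K x N matrix multiply is 2*M*N*K (one multiply + one add per term).
M = N = K = 2048
A = np.random.rand(M, K).astype(np.float32)
B = np.random.rand(K, N).astype(np.float32)

start = time.perf_counter()
C = A @ B
elapsed = time.perf_counter() - start

flops = 2.0 * M * N * K
print(f"SGEMM achieved {flops / elapsed / 1e9:.1f} GFLOPS")
```

At 1 TeraFLOPS (1000 GFLOPS by this metric), Larrabee Prime's SC09 demo was sustaining roughly the theoretical peak of the previous GPU generation.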

This brings us to the future of Larrabee. Larrabee Prime may be canceled, but the Larrabee project is not. As Intel puts it, Larrabee is a “complex multi-year project” and development will be continuing. Intel still wants a piece of the HPC/GPGPU pie (lest NVIDIA and AMD get it all to themselves) and they still want into the video card space, given the collision between those markets. For Intel, their plans have just been delayed.


The Larrabee architecture lives on

For the immediate future, as we mentioned earlier, Larrabee Prime is still going to be used by Intel for R&D purposes as a software development platform. This is a very good use of the hardware (however troubled it may be), as it allows Intel to bootstrap the software side of Larrabee so that developers can get started programming for real hardware while Intel works on the next iteration of the chip. Much like how NVIDIA and AMD sample their video cards to game developers months ahead of launch, we expect that Larrabee Prime SDKs will be limited to Intel’s closest software partners. If that’s the case, don’t expect to see much if anything leak about Larrabee Prime once chips start leaving Intel’s hands, or to see extensive software development initially; widespread Larrabee software development won’t start until Intel ships the next iteration of Larrabee.

We should know more about the Larrabee situation next year, as Intel is already planning on an announcement at some point in 2010. Our best guess is that Intel will announce the next Larrabee chip at that time, with a product release in 2011 or 2012. Much of this will depend on what the hardware problem was and what process node Intel wants to use. If Intel just needs the ability to pack more cores onto a Larrabee chip, then 2011 is a reasonable target; if there’s a more fundamental issue, then 2012 is more likely. This lines up with the process nodes for those years: a 2011 launch would hit the second year of their 32nm process, while a 2012 launch would let them ship one of the first products on the 22nm process.

For that matter, since the Larrabee project was not killed, it’s a safe assumption that any future Larrabee chips will be based on the same architectural design. The vibe from Intel is that the problem is Larrabee Prime, not the Larrabee architecture itself. The idea of an x86 many-core GPU is still alive and well.


On-Chip GMA-based GPUs: Still On Schedule For 2010

Finally, there’s the matter of Intel’s competition. For AMD and NVIDIA, this is just about the best possible announcement they could hope for. On the video card front it means they won’t be facing any new competitors through 2010 and most of 2011. That doesn’t mean that Intel isn’t going to be a challenge for them – Intel is still launching Clarkdale and Arrandale with on-chip GPUs next year – but at least they won’t be facing that competition at the high end too. For NVIDIA in particular, this means that Fermi has a clear shot at the HPC/GPGPU space without competition from Intel, which is exactly the kind of break NVIDIA needed since Fermi is running late.

71 Comments

  • ProDigit - Sunday, December 06, 2009 - link

    Intel Cancels Larrabee Retail Products,
    REASON:
    http://news.cnet.com/2300-1001_3-10001951.html?tag...

    ???
  • Cerb - Wednesday, December 09, 2009 - link

    That's totally unrelated. That is a test chip for people to mess with, and figure out where the hardware itself should go, and how software needs to be written for future many-core CPUs.

    The kind of fine-grained complicated multithreading of ages past is simply not easy enough to code to be worth using (hence single-threaded apps everywhere that could be pervasively multithreaded), and can be a hindrance to scaling as much as a help. Beyond maybe a dozen decently powerful cores, there get to be many unknowns, and many possible bottlenecks. On top of that, now that we're scaling out by more cores, there's no reason to stop doing so, even if disruptive technology can get us going faster, again--accepting faster speeds as a matter of course has been an invisible crutch.

    This will give research folks the ability to actually mess with code on such a computer, so as to be more ready to make the future CPUs work well. Also, such CPUs, with superior, but still weak, cores, may end up being good server CPUs, in the future.

    Larrabee "Prime" is a failure due to Intel doing what Intel does best: pointlessly burning away money, and creating hype. They should have quietly gotten a small group of engineers to make a solid core design (the Isreali guys would be a good bet, following history), then once it was working well internally, gotten other groups involved. Once the 2nd or 3rd generation was good internally, polish it up, release it for niche markets, and move out from there.
    Reply
  • Zingam - Sunday, December 06, 2009 - link

    I will tell you why! x86 is a dead end and that has been known for decades now. It is only the billions of Intel that keep it afloat yet.
    Maybe that would make Intel realize that they have to move on and bring something new, something groundbreaking that is ready for the future!
  • tygrus - Sunday, December 06, 2009 - link

    [quote]
    "Maybe that would make Intel realize that they have to move on and bring something new, something groundbreaking that is ready for the future"[/quote]

    They did years ago and created a new "groundbreaking" 64bit ISA which has become a "dead end" and barely afloat .. the Titanic disaster .. Itanium
  • snarfbot - Sunday, December 06, 2009 - link

    kinda like how the power architecture is dead and has been for decades now and its only the billions of ibm that keep it afloat?
  • ProDigit - Saturday, December 05, 2009 - link

    It has been foreseen by many engineers that the x86 architecture is not really suited for graphics.
    There are several reasons, but one of the most striking for me, is that the x86 architecture is not really power efficient.
    And with chips that are overclocked (turbo) to a certain thermal value instead of to a fixed value, the x86 architecture Larabee cards used for graphics would be outperformed by the structure or architecture used by ARM.
    Graphic cards of today are based on some sort of deviant of ARM architectures.

    Still, Larabee's technology might be very interesting in servers for cloud computing, and home PC's, where the motherboard holds place for 4, 6, 8, or 16 cores, and where one can upgrade by just buying CPU cores.

    I see a future in Larabee CPU's, not GPU's, unless they go off the x86 structure which they have been holding for years, and which now has become slightly outdated!
  • PsiAmp - Saturday, December 05, 2009 - link

    "Prime running the SGEMM HPC benchmark at 1 TeraFLOP"

    It has to be FLOPS. 'S' on the end is not representing plural, but 'S' - second.

    FLoating point Operations Per Second
  • spathotan - Saturday, December 05, 2009 - link

    Wow...what an INCREDIBLE waste of time and hype.
  • MadBoris - Saturday, December 05, 2009 - link

    Vaporware hits again...How's it feel to be hoodwinked?
    Can you admit you were hoodwinked?

    I cannot believe how many tech reporters jumped on Intels hype train.

    Amazing that no actual product arrived, and yet all these financial analysts affected stock prices for the last year on vaporware.

    You have to be stupider than stupid to think Intel can execute in just a few years what GPU companies like NVIDIA and AMD have been eating sleeping, drinking for 24/7/365.

    Hype wins again, and yet a few of us had enough common sense to know the claims of dethroning discrete GPU's was impossible on any first several gen products.

    I'm actually rather disappointed with the tech sites that bought into the Intel hype, it's not the first time. It showed peoples lack of common sense and experience in the field, and how easily they are still swayed by PR.

    High end 3D gaming Graphics is kind of becoming a dying business on PC unfortunately due to console ports. As someone who upgraded GPU's almost every cycle beginning from monster 3d, I now no longer can justify upgrading until the next console comes out. GPU business is ripe for change and will require it but Intel has alot yet to learn.
  • justonce - Saturday, December 05, 2009 - link

    Somewhere they are laughing at Intel's failure!
