Larrabee is Dead, Long Live Larrabee

Intel just announced that the first incarnation of Larrabee won't be a consumer graphics card. In other words, next year you're not going to be able to purchase a Larrabee GPU and run games on it.

You're also not going to be able to buy a Larrabee card and run your HPC workloads on it either.

Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter.


The Larrabee Update at Fall IDF 2009

Intel hasn't said much about why Larrabee was canceled other than that the chip was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop - something AMD managed in 2008 with the Radeon HD 4870. (Update: the comparison isn't exact, but the point stands: Larrabee is outgunned by today's GPU offerings.)

With the Radeon HD 5870 already at 2.7 TFLOPS peak, chances are that Larrabee wasn't going to be remotely competitive, even if it came out today. We all knew this; no one was expecting Intel to compete at the high end. Intel's representatives have been quietly talking about the uselessness of > $200 GPUs for much of the past two years, indicating exactly where Intel views the market for Larrabee's first incarnation.

Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not.

I've got a general rule of thumb for Intel products. Around 4 - 6 months before an Intel CPU officially ships, Intel's partners will have it in hand and running at near-final speeds. Since Larrabee hasn't left Intel's hands, chances are that it's more than 6 months away at this point.

By then Intel wouldn't have been able to release Larrabee at any price point other than free. It'd be slower at games than sub $100 GPUs from AMD and NVIDIA, and there's no way that the first drivers wouldn't have some incompatibility issues. To make matters worse, Intel's 45nm process would stop looking so advanced by mid 2010. Thus the only option was to forgo making a profit on the first chips altogether rather than pull an NV30 or R600.

So where do we go from here? AMD and NVIDIA will continue to compete in the GPU space as they always have. If anything this announcement supports NVIDIA's claim that making these things is, ahem, difficult - even if you're the world's leading x86 CPU maker.

Do I believe the 48-core research announcement had anything to do with Larrabee's cancellation? Not really. The project came out of a different team within Intel. Intel Labs has worked on bits and pieces of technologies that will ultimately be used inside Larrabee, but the GPU team is quite different. Either way, the canceled Larrabee was a 32-core part.

A publicly available Larrabee graphics card at 32nm isn't guaranteed, either. Intel says they'll talk about the first Larrabee GPU sometime in 2010, which means we're looking at 2011 at the earliest. Given the timeframe I'd say that a 32nm Larrabee is likely but again, there are no guarantees.

It's not a huge financial loss to Intel. Intel still made tons of money while Larrabee's development was underway, and its 45nm fabs are old news and already paid off. Intel wasn't going to make much money on Larrabee had it brought the chip to market - definitely not enough to recoup the R&D investment - and, as I just mentioned, using Larrabee sales to pay off the fabs isn't necessary either. Financially it's not a problem, yet. If Larrabee never makes it to market, or fails to eventually be competitive, it becomes a bigger problem. If heterogeneous multicore is the future of desktop and mobile CPUs, Larrabee needs to succeed; otherwise Intel's future will be in jeopardy. It's far too early to tell if that's worth worrying about.

One reader asked how this will impact Haswell. I don't believe it will; from what I can tell, Haswell doesn't use Larrabee.

Intel has a different vision of the road to the CPU/GPU union. AMD's Fusion strategy combines CPU and GPU compute starting in 2011. Intel will have a single die with a CPU and GPU on it, but the GPU isn't expected to be used for much compute at that point. Intel's roadmap has the CPU and AVX units being used for the majority of vectorized floating point throughout 2011 and beyond.


Intel's vision for the future of x86 CPUs announced in 2005, surprisingly accurate

It's not until you get into the 2013 - 2015 range that Larrabee even comes into play. The Larrabee that makes it into those designs will look nothing like the retail chip that just got canceled.

Intel's announcement wasn't too surprising or devastating; it just makes things a bit less interesting.

75 Comments


  • AnandThenMan - Monday, December 07, 2009 - link

    quote:

    Who has more GPU market share?

    Yes, that's why Intel spent a billion+ on Larrabee, for no reason at all, because they already ship the highest #s of GPUs. Intel loves to just toss money to the wind, it doesn't matter at all.





  • kkwst2 - Wednesday, December 09, 2009 - link

    What? Talk about a straw man argument. The GP said nothing to imply that Larrabee didn't matter, but was just responding to the ludicrous assertion by the GGP that Intel didn't do anything else well/successfully.
  • EnzoFX - Monday, December 07, 2009 - link

    Why so much shit towards Intel? Their IGPs are not for you, that's fine, but there's no reason to call it a bad product out of rage. The IGP itself just works for MOST computer users. If this weren't the case EVERYONE would have a discrete solution, and Intel wouldn't have such a huge market share in graphics; no one would allow it, not the OEMs, or MS.

    They do work, just not for you, not for gaming. Sometimes people forget that they are not the majority - the majority have always been fine with Intel's IGP, or at least get along fine with it. I'd say anyone reading AnandTech does not fall into that category.
  • - Monday, December 07, 2009 - link

    Intel's announcement wasn't too surprising or devastating, it just makes things a bit less interesting.

    I am a long time reader of AnandTech (from its inception) and am not surprised by the above statement coming from this site. But I beg to differ; Intel has proven in recent news that they are indeed the top nominee for the 'evil empire' award - perhaps for as long as 10 years. If all worldly claims ring true, Intel is at best an inhibitor of technology (except their own). What the world would be like without Intel's presence is always interesting - it would spark competition and innovation, once again, out of two-car garages (figuratively) where PCs began.

    asH
  • stmok - Monday, December 07, 2009 - link

    ...Call it Larrabee 1.0 ;)

    Gen 1 = Prototype.
    Gen 2 = Learn from Gen 1.
    Gen 3 = Start again based on what you've learned from Gen 1 and 2.
    Gen 4 = Refinement of Gen 3.
    etc...

    I think I'll stick to ATI or Nvidia solutions for a good while yet.
  • Mike1111 - Monday, December 07, 2009 - link

    I also asked about Haswell in the comments of the last Larrabee cancellation article, because it was my understanding that some future version of Larrabee was planned to be integrated into the Haswell architecture as a GPU/AVX2 hybrid.

    Anand himself mentioned something like this in his IDF 2009 Larrabee Demo article: "Nehalem was architected to be very modular, Jasper Forest is just another example of that. Long term you can expect Nehalem cores (or a similar derivative) to work alongside Larrabee cores, which is what many believe Haswell will end up looking like."

    At least, I was one of many ;-)
  • haukionkannel - Monday, December 07, 2009 - link

    Intel doesn't need anything faster than their current IGPs...
    Most consumers are happy even with that. But yeah, for PC gamers this is not such good news.
  • MamiyaOtaru - Monday, December 07, 2009 - link

    because that's by far the most important benchmark for SSDs amirite?
  • MrDiSante - Monday, December 07, 2009 - link

    And random reads/writes that blow anything else out of the water. I recommend you actually read an article or two on this fine site about SSDs.
  • Devo2007 - Monday, December 07, 2009 - link

    ..never mind the fact they've had to pull firmware releases TWICE in a row to fix major data loss issues. Yeah, Intel's got a great track record with SSD technology right now. :p
