How Many Cores in a Larrabee?

Initial estimates put Larrabee somewhere in the 16 to 32-core range. We figured 32 cores would be the sweet spot (not least because Intel's charts showed diminishing returns beyond 32 cores), but that 24 cores would be more likely for an initial product. Intel, however, shared some data that made us question all of that.

Remember the design experiment? Intel was able to fit a 10-core Larrabee into the space of a Core 2 Duo die. Given the specs of the Core 2 Duo Intel used (4MB L2 cache), it appears to have been a 65nm Conroe/Merom-based Core 2 Duo with a 143 mm^2 die.

At 143 mm^2 for 10 Larrabee-like cores, let's double that: now we're at 286 mm^2 (still smaller than GT200 and about the size of AMD's RV770) with 20 cores. Double it once more and we've got 40 cores on a 572 mm^2 die, virtually the same size as NVIDIA's GT200, but on a 65nm process.
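
As a sanity check on that doubling math, here's a minimal Python sketch of the arithmetic (our own illustration, not anything Intel published), scaling the 10-core, 143 mm^2 data point by simple replication:

    # Napkin math: scale the 65nm design experiment by replication.
    # Data point: 10 Larrabee-like cores in a 143 mm^2 Core 2 Duo die.
    base_cores = 10
    base_area_mm2 = 143.0

    for doublings in (1, 2):
        cores = base_cores * 2 ** doublings
        area = base_area_mm2 * 2 ** doublings
        print(f"{cores} cores -> ~{area:.0f} mm^2 at 65nm")

    # 20 cores -> ~286 mm^2 at 65nm  (about RV770's size)
    # 40 cores -> ~572 mm^2 at 65nm  (about GT200's size)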

The move to 45nm could scale as well as 50%, but chances are we'll see something closer to 60 - 70% of the die size simply by moving to 45nm (which is the node that Larrabee will be built on). Our 40-core Larrabee is now at ~370mm^2 on 45nm. If Intel wanted to push for a NVIDIA-like die size we could easily see a 64-core Larrabee at launch for the high end, with 24 or 32-core versions aiming at the mainstream. Update: One thing we did not consider here is power limitations. So while Intel may be able to produce a 64-core Larrabee with a GT200-like die-size, such a chip may exceed physical power limitations. It's far more likely that we'll see something in the 16 - 32 core range at 45nm due to power constraints rather than die size constraints.
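
Extending the same napkin math with a hedged 65nm-to-45nm shrink: the sketch below assumes a 0.65 area scaling factor (the middle of the 60 - 70% range above) and perfectly linear per-core area, which ignores uncore, I/O, and the power ceiling mentioned in the update:

    # Hypothetical shrink factor: 0.5 would be ideal area scaling
    # from 65nm to 45nm; 0.6 - 0.7 is the realistic range above.
    shrink = 0.65
    per_core_mm2_65nm = 143.0 / 10  # implied by the design experiment

    for cores in (16, 24, 32, 40, 64):
        area_45nm = cores * per_core_mm2_65nm * shrink
        print(f"{cores} cores -> ~{area_45nm:.0f} mm^2 at 45nm")

    # 40 cores -> ~372 mm^2, matching the ~370 mm^2 estimate above;
    # 64 cores -> ~595 mm^2, GT200 die-size territory.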

This is all pure speculation, but it's a discussion worth having publicly.

Comments

  • phaxmohdem - Monday, August 4, 2008 - link

    Can your mom play Crysis? *burn*
  • JonnyDough - Monday, August 4, 2008 - link

    I suppose she could, but I don't think she would want to. Why do you care, anyway? Do you have some sort of weird fetish with moms playing video games, or are you just looking for another woman to relate to?

    Ooooh, burn!
  • Griswold - Monday, August 4, 2008 - link

    He is looking for the one playing his mom, I think.
  • bigboxes - Monday, August 4, 2008 - link

    Yup. He worded it incorrectly. It should have read, "but can it play your mom?" :p
  • Tilmitt - Monday, August 4, 2008 - link

    I'm really disappointed that Intel isn't building a regular GPU. I doubt that bolting a load of unoptimised x86 cores together is going to be able to perform anywhere near as well as a GPU built from the ground up to accelerate graphics, given equal die sizes.
  • JKflipflop98 - Monday, August 4, 2008 - link

    WTF? Did you read the article?
  • Zoomer - Sunday, August 10, 2008 - link

    He has a point. More programmable == more transistors. Can't escape that fact.

    Given an equal number of transistors running the same program, a more programmable solution will always be crushed by fixed-function processors.
  • JonnyDough - Monday, August 4, 2008 - link

    I was wondering that too. This is obviously a push towards a smaller Centrino-type package. Imagine a powerful CPU that can push graphics too. At some point this will save a lot of battery life in a notebook, along with space. It may not be able to play games, but I'm pretty sure it will someday make for some great basic laptops that can play video. Not all college kids and overseas Marines want to play video games. Some just want to watch clips of their family back home.
  • rudolphna - Monday, August 4, 2008 - link

    As interesting and cool as this sounds, this is even more bad news for AMD, which was finally making up lost ground. Granted, it's still probably two years away, and hopefully by then AMD will be back to its old (Athlon64-era) self. They are finally getting products that can actually compete. Another challenger, especially from their biggest rival, Intel, cannot be good for them.
  • bigboxes - Monday, August 4, 2008 - link

    What are you talking about? It's been nothing but good news for AMD lately. Sure, let Intel sink a lot of $$ into graphics. Sounds like a win for AMD (in a roundabout way). It's like AMD investing in a graphics maker (ATI) instead of concentrating on what makes them great. Most of the Intel supporters were all over AMD for making that decision. Turn it around and watch Intel invest heavily in graphics and it's a grand slam. I guess it's all about perspective. :)
