Final Words

Well, we've known it was coming for quite a while. We knew it would be a many-core CPU architecture well suited to graphics. And yet, with as much information as we were given, when we sat down to look at what we had we felt like we still didn't know anything concrete about Larrabee. Piles of data and information, insight into how a software renderer would fit on top of the underlying architecture... it has left us with the feeling that all this is a really cool idea with great potential, but we just don't have any idea what it will do, or how well it will do it, when it finally hits.

Of course, this is the first time any real detail has been given, and any hint of product is at least 12 to 18+ months off. We can't expect Intel to give everything away right off the bat. We are very happy to have the detail we do, and can't wait to get more.

While we are very interested in the architecture from the technophile point of view we can't help but have, technology for technology's sake (no matter how cool the theory behind it might be) amounts to nothing without real-world application and benefit. For Larrabee, it will all come down to performance and price.

AMD has shown that you don't need to be on top to compete. As long as performance somewhere in the middle of the pack can be delivered, appropriate and aggressive pricing can go quite a long way. For the consumer it is always a cost/benefit analysis, and there are quite a number of computers with $100 - $300 graphics cards under the hood. If compatibility is there, if performance is there, and if Intel is able to price it right, the first round of Larrabee hardware doesn't need to be groundbreaking.

Getting a good foothold and sticking it out for the long haul should be Intel's goal. Compatibility (especially given the track record of Intel's integrated graphics) is likely more important than pure performance. Getting product out into the market is necessary before developers will even start to take a chance on pushing the hardware itself. And this is where Larrabee could really shine.

Opening the door to fully programmable rendering and making it attractive enough for developers to start pushing the envelope will be a long process. The current game development arena is all about return on investment, and except for a few brave souls we will likely see game and engine developers stick with DirectX 10 for quite some time, even after DX11 comes along. Those who venture into the realm of pure software renderers written for a highly data-parallel CPU will be the exception rather than the norm.
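
As a rough illustration of the concept (not Intel's actual pipeline; every name below is our own invention), here is a minimal sketch of what the inner loop of a tile-parallel software renderer might look like on a many-core CPU of this sort:

```cpp
// Hypothetical sketch of a tile-parallel software renderer loop.
// All names are invented for illustration; this is not Intel's renderer.
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kTile = 64;  // small tiles keep each core's working set cache-resident

struct Framebuffer {
    int width, height;
    std::vector<uint32_t> pixels;
};

// Placeholder for the real work: binning triangles, shading pixels, etc.
void ShadeTile(Framebuffer& fb, int x0, int y0, int x1, int y1) {
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x)
            fb.pixels[y * fb.width + x] = 0xFF000000u;  // clear to opaque black
}

// Each core takes whole tiles, so no two threads touch the same pixels.
void RenderFrame(Framebuffer& fb, int numCores) {
    const int tilesX = (fb.width + kTile - 1) / kTile;
    const int tilesY = (fb.height + kTile - 1) / kTile;
    const int tiles = tilesX * tilesY;
    std::vector<std::thread> workers;
    for (int c = 0; c < numCores; ++c) {
        workers.emplace_back([&fb, c, tiles, tilesX, numCores] {
            for (int t = c; t < tiles; t += numCores) {  // static round-robin split
                const int tx = (t % tilesX) * kTile;
                const int ty = (t / tilesX) * kTile;
                ShadeTile(fb, tx, ty,
                          std::min(tx + kTile, fb.width),
                          std::min(ty + kTile, fb.height));
            }
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    Framebuffer fb{1920, 1080, std::vector<uint32_t>(1920 * 1080)};
    RenderFrame(fb, 4);  // four threads here; Larrabee would have many more cores
}
```

A real renderer would bin geometry into tiles, run wide SIMD shading inside ShadeTile, and balance load dynamically, but the point stands: on Larrabee the entire pipeline is ordinary code running on CPU cores, which is exactly what makes it both flexible and hard to predict.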

Comments

  • DerekWilson - Monday, August 4, 2008

    this is a pretty good observation ...

    but no matter how much potential it has, performance in games is going to be the thing that actually makes or breaks it. it's of no use to anyone if no one buys it. and no one is going to buy it because of potential -- it's all about whether or not they can deliver on game performance.
  • Griswold - Monday, August 4, 2008

    Well, it seems you don't get it either.
  • helms - Monday, August 4, 2008

    I decided to check out the development of this game I heard about ages ago that seemed pretty unique, not just the game but also its engine. Going to the website, it seems Intel acquired them at the end of February.

    http://www.projectoffset.com/news.php
    http://www.projectoffset.com/technology.php

    I wonder how significant this is.
  • iwodo - Monday, August 4, 2008

    I forgot to ask: how will the software renderer work out on the Mac? Since all the DirectX code is run through a software renderer, doesn't that fundamentally mean most current Windows-based games could run on the Mac with little work?
  • MamiyaOtaru - Monday, August 4, 2008

    Not really. Larrabee will be translating DirectX calls to its software renderer. But unless Microsoft ports the DirectX API to OS X, there will be nothing for Larrabee to translate. (A sketch of this dependency appears after the comments below.)
  • Aethelwolf - Monday, August 4, 2008

    I wonder if game devs could write their games in DirectX, have the software renderer convert that into Larrabee's ISA on the Windows platform, and capture the binary somehow. Distribute the DirectX version on Windows and the software ISA version for the Mac. No need for two separate code paths.
  • iwodo - Monday, August 4, 2008

    Can anyone point out which of the assumptions Anand makes are false? That would be great, because what he is saying simply sounds too good to be true.

    One point worth mentioning: the 4MB of cache takes up nearly 50% of the die size. So if Intel could rely more on bandwidth and save on cache, they could fit in a few more cores.

    And am I the only one who thinks 2010 is too far away for an introduction? Summer 2009 seems like a much better time. Then they would have another 6 - 8 months before moving to 32nm with higher clock speeds.

    And as for game developers, with the cash Intel has, $10 million for every high-profile studio like Blizzard and $50 million to EA to optimize for Intel would only cost them $100 million in pocket money.
  • ZootyGray - Monday, August 4, 2008

    I was thinking of all the P90s I threw away - could have made a CPU sandwich, with a lil peanut software butter, and had this Tower of Babel thing sticking out the side of the case with a fan on top, called Lazarus, or something - such an opportunity to utilize all that old tech - such imagery.

    griswold u r funny :)
  • Griswold - Monday, August 4, 2008

    You definitely are confused. Time for a nap.
  • paydirt - Monday, August 4, 2008

    STFU Griswold. It's not helpful for you to grade every comment. Grade the article if you like... AnandTech, is it possible to add an ignore-user function for the comments?
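
For readers following the DirectX thread above, here is a minimal sketch of the dependency MamiyaOtaru describes. All of the names are invented for illustration; none of this is a real Direct3D or Intel interface. The point is that the game binds to the OS-provided API, and only the driver behind that API knows about the software renderer.

```cpp
#include <cstdio>

// All names here are hypothetical illustrations, not real interfaces.
struct DrawCall { int vertexCount; };

// The portable piece: a software pipeline running on Larrabee's cores.
void LarrabeeSoftwareRasterize(const DrawCall& d) {
    std::printf("rasterizing %d vertices in software\n", d.vertexCount);
}

// The Windows-only piece: the D3D entry point the game actually calls.
// The Larrabee driver would implement it by forwarding to the renderer.
void D3D_DrawPrimitives(const DrawCall& d) {
    LarrabeeSoftwareRasterize(d);  // API call -> software renderer
}

int main() {
    // A game issues D3D calls. On OS X there is no D3D runtime exporting
    // D3D_DrawPrimitives, so the call has nothing to bind to, even though
    // LarrabeeSoftwareRasterize itself would be perfectly portable.
    D3D_DrawPrimitives({3});
}
```

In other words, the portability bottleneck is the API runtime on each operating system, not the software renderer underneath it.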
