Final Words

Well, we've known it was coming for quite a while, and we knew it would be a many-core CPU architecture well suited to graphics. Yet for all the information we were given, when we sat down to look at what we had, we felt like we still didn't really know Larrabee. Piles of data, insight into how a software renderer would fit on top of the underlying architecture... it all leaves us with the feeling that this is a really cool idea with great potential, but we have no idea how well it will actually do when it finally arrives.

Of course, this is the first time any real detail has been given, and any hint of a product is still at least 12 to 18 months off. We can't expect Intel to give everything away right off the bat. We are very happy to have the detail we do, and can't wait to get more.

While we are very interested in the architecture from the sort of technophile point of view that we can't help but have, technology for technology's sake (no matter how cool the theory behind it might be) amounts to nothing without real-world application and benefit. For Larrabee, it will all come down to performance and price.

AMD has shown that you don't need to be on top to compete. As long as a part delivers performance somewhere in the middle of the pack, appropriate and aggressive pricing can go quite a long way. For the consumer it is always a cost/benefit analysis, and there are quite a number of computers with $100-$300 graphics cards under the hood. If the compatibility is there, if the performance is there, and if Intel is able to price it right, the first round of Larrabee hardware doesn't need to be groundbreaking.

Getting a good foothold and sticking it out for the long haul should be Intel's goal. Compatibility (especially given the track record of Intel's integrated graphics) is likely more important than pure performance. Getting product into the market is necessary before developers will even take a chance on pushing the hardware itself. And this is where Larrabee could really shine.

Opening the door to fully programmable rendering and making it attractive enough for developers to start pushing the envelope will be a long process. The current game development arena is all about return on investment, and except for a few brave souls we will likely see game and engine developers stick to DirectX 10 for quite some time, even after DX11 comes along. Those who venture into the realm of pure software renderers written for a highly data-parallel CPU will be the exception rather than the norm.
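
To make the idea concrete, here is a minimal sketch (our own illustration, not Intel's code or API) of the kind of independent per-pixel loop a pure software renderer hands to a wide vector unit; shade_span and its parameters are hypothetical:

```cpp
// A minimal sketch (our illustration, not Intel's code) of the data-parallel
// heart of a software renderer: each pixel is shaded independently, so the
// loop maps naturally onto a wide vector unit like Larrabee's.
#include <cstdint>
#include <cstdio>

// Hypothetical flat shader: scale a base color by per-pixel coverage.
void shade_span(const float* coverage, uint32_t base, uint32_t* out, int n)
{
    const float r = float((base >> 16) & 0xFF);
    const float g = float((base >>  8) & 0xFF);
    const float b = float(base & 0xFF);
    for (int i = 0; i < n; ++i) {       // independent iterations: vectorizable
        out[i] = (uint32_t(r * coverage[i]) << 16)
               | (uint32_t(g * coverage[i]) <<  8)
               |  uint32_t(b * coverage[i]);
    }
}

int main()
{
    float cov[4] = { 0.0f, 0.5f, 0.75f, 1.0f };
    uint32_t px[4];
    shade_span(cov, 0xFF8040, px, 4);
    for (uint32_t p : px) std::printf("%06X\n", p);
    return 0;
}
```

Because every iteration is independent, a compiler or a hand-written 16-wide kernel can shade 16 pixels per instruction, which is exactly the shape of work Larrabee's vector units are built for.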

Comments

  • Griswold - Monday, August 4, 2008 - link

    You seem to be confused. Time for a nap.
  • MDme - Monday, August 4, 2008 - link

    but AMD will have Cinema 2.0. did you see that demo? by 2010, AMD will have the RV990 or whatever...and Nvidia will have GT400?
  • phaxmohdem - Monday, August 4, 2008 - link

    Considering how long it took nVidia to release a single GPU significantly faster than G80, I'd be shocked if we see GT300 by 2009/2010. However, a GTX 295GT X2 ULTRA OC is not out of the question ;)
  • shuffle2 - Monday, August 4, 2008 - link

    mm², how hard is that to write? >.>
  • 1prophet - Monday, August 4, 2008 - link

    They need to hit one out of the park with the drivers (software) as well.
  • jltate - Tuesday, August 5, 2008 - link

    I've got a bunch of comments, so I'll just list them all here.

    SSE doesn't have fused multiply-add operations. Larrabee does -- thus that 10-core processor could perform a peak of 320 floating point operations per cycle (it's mentioned in the SIGGRAPH paper; the arithmetic is sketched in the note after this comment).

    Larrabee's programming model is variable width -- the hardware can and likely will be augmented in the future to perform more than just 16 operations in parallel.

    The ring bus between cores was stated to be for each group of 16. Intel stated that for more than 16 cores they'd use "multiple short-linked rings".

    Also, the diagram only shows one memory controller on one side with fixed function logic on the other, not two memory controllers as you showed on page 5 of your article. However, Intel stated in the paper that the configuration and number of processors, fixed function blocks and I/O controllers would be implementation dependent. So in effect it could very well have a half-dozen 64-bit interfaces like G80.

    My forecast? This thing will rock. I for one simply cannot wait.
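
    [Ed: jltate's 320 figure checks out. A quick back-of-the-envelope sketch, not a benchmark, assuming the paper's 16-wide vector unit and counting a fused multiply-add as two operations; the 10-core part is the commenter's hypothetical:]

    ```cpp
    // Peak FLOPs per cycle = cores x vector lanes x 2, where the factor of 2
    // comes from a fused multiply-add counting as a multiply plus an add.
    #include <cstdio>

    int main()
    {
        const int cores   = 10; // commenter's hypothetical 10-core part
        const int lanes   = 16; // 16-wide vector unit per core (SIGGRAPH paper)
        const int fma_ops = 2;  // one FMA = two floating point operations

        std::printf("peak FLOPs/cycle: %d\n", cores * lanes * fma_ops); // 320
        return 0;
    }
    ```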
  • Laura Wilson - Monday, August 4, 2008 - link

    that's the truth

    they say they know this. it sounds like they know this ... we'll see what happens :-)
  • gigahertz20 - Monday, August 4, 2008 - link

    I'm going to predict Larrabee will provide a huge boost of performance over Intel's current crappy integrated graphics solutions, but will not be able to compete with AMD/ATI's and Nvidia's high end GPUs when it (Larrabee) finally launches. If Intel can deliver a monster that can push 100+ FPS in Crysis and doesn't cost so much that it breaks the bank like the current Nvidia GTX 280, then they will have a real winner! When it finally launches though, who knows what AMD/ATI and Nvidia will have out to compete against it. Wonder if Intel is just trying to push out a mainstream chip or go high end as well... guess I need to read the rest of the article :)
  • JEDIYoda - Tuesday, August 5, 2008 - link

    dreaming again huh??? you people who want top notch performance without having to pay for it....rofl..hahaha
  • FITCamaro - Monday, August 4, 2008 - link

    This isn't meant to compete with their IGPs. At least not initially.
