Final Words

Well, we've known it was coming for quite a while. We knew it would be a many-core CPU architecture well suited to graphics. Yet even with as much information as we were given, when we sat down to look at what we had, we felt like we still didn't really know Larrabee. Piles of data and information, insight into how a software renderer would fit on top of the underlying architecture... it has left us with the feeling that this is a really cool idea with great potential, but we still have no idea how well it will actually perform when it finally hits.

Of course, this is the first time any real detail has been given, and any hint of a product is at least 12 to 18+ months off. We can't expect Intel to give everything away right off the bat. We are very happy to have the detail we do have, and can't wait to get more.

While we are very interested in the architecture from the technophile point of view that we can't help but have, technology for technology's sake (no matter how cool the theory behind it might be) amounts to nothing without real-world application and benefit. For Larrabee, it will all come down to performance and price.

AMD has shown that you don't need to be on top to compete. As long as performance lands somewhere in the middle of the pack, appropriate and aggressive pricing can go quite a long way. For the consumer it is always a cost/benefit analysis, and there are quite a number of computers out there with $100 - $300 graphics cards under the hood. If the compatibility is there, if the performance is there, and if Intel is able to price it right, the first round of Larrabee hardware doesn't need to be groundbreaking.

Getting a good foothold and sticking it out for the long haul should be Intel's goal. Compatibility (especially given the track record of Intel's integrated graphics) is likely more important than pure performance. Getting product out into the market is necessary before developers will even start to take a chance on pushing the hardware itself. And this is where Larrabee could really shine.

Opening the door to fully programmable rendering and making it attractive enough for developers to start pushing the envelope will be a long process. The current game development arena is all about return on investment, and except for a few brave souls we will likely see game and engine developers stick with DirectX 10 for quite some time, even after DX11 comes along. Those who venture into the realm of pure software renderers written for a highly data-parallel CPU will be the exception rather than the norm.

Comments

  • phaxmohdem - Monday, August 4, 2008 - link

    Can your mom play Crysis? *burn*
  • JonnyDough - Monday, August 4, 2008 - link

    I suppose she could but I don't think she would want to. Why do you care anyway? Have some sort of weird fetish with moms playing video games or are you just looking for another woman to relate to?

    Ooooh, burn!
  • Griswold - Monday, August 4, 2008 - link

    He is looking for the one playing his mom, I think.
  • bigboxes - Monday, August 4, 2008 - link

    Yup. He worded it incorrectly. It should have read, "but can it play your mom?" :p
  • Tilmitt - Monday, August 4, 2008 - link

    I'm really disappointed that Intel isn't building a regular GPU. I doubt that bolting a load of unoptimised x86 cores together is going to be able to perform anywhere near as well as a GPU built from the ground up to accelerate graphics, given equal die sizes.
  • JKflipflop98 - Monday, August 4, 2008 - link

    WTF? Did you read the article?
  • Zoomer - Sunday, August 10, 2008 - link

    He had a point. More programmable == more transistors. Can't escape from that fact.

    Given an equal number of transistors, running the same program, a more programmable solution will always be crushed by a fixed-function processor.
  • JonnyDough - Monday, August 4, 2008 - link

    I was wondering that too. This is obviously a push towards a smaller Centrino type package. Imagine a powerful CPU that can push graphics too. At some point this will save a lot of battery juice in a notebook computer, along with space. It may not be able to play games, but I'm pretty sure it will make for some great basic laptops someday that can run video. Not all college kids and overseas marines want to play video games. Some just want to watch clips of their family back home.
  • rudolphna - Monday, August 4, 2008 - link

    As interesting and cool as this sounds, this is even more bad news for AMD, who was finally making up for lost ground. Granted, it's still probably 2 years away, and hopefully AMD will be back to its old self (Athlon64 era) by then. They are finally getting products that can actually compete. Another challenger, especially from its biggest rival, Intel, cannot be good for them.
  • bigboxes - Monday, August 4, 2008 - link

    What are you talking about? It's been nothing but good news for AMD lately. Sure, let Intel sink a lot of $$ into graphics. Sounds like a win for AMD (in a roundabout way). It's like AMD investing into a graphics maker (ATI) instead of concentrating on what makes them great. Most of the Intel supporters were all over AMD for making that decision. Turn this around and watch Intel invest heavily into graphics and it's a grand slam. I guess it's all about perspective. :)
