Things That Could Go Wrong

I had to write this section because, as strongly as Intel has been executing these past couple of years, we must keep in mind that in the GPU market Intel isn't just the underdog; it's going up against the undefeated. NVIDIA is the company that walked into 3dfx's house and walked away with its IP, the company that could be out-engineered and outperformed by ATI for an entire year and still emerge dominant. This is Intel's competition: the most Intel-like of all the manufacturers in the business, and a highly efficient one at that.

Intel may benefit from the use of its advanced manufacturing fabs in making Larrabee, but it is also burdened by them. NVIDIA has been building GPUs, some quite large, without ever investing a dime in its own manufacturing facility. There's much that could go wrong with Larrabee; the short list follows:

Manufacturing, Design and Yield

Before we get to any of the GPU-specific concerns about Larrabee, there are the basics that apply to making any chip: the design could be flawed, it might not reach the right clock speeds or deliver the right performance, and it may not yield well enough. Larrabee has a good chance of being Intel's largest die produced in desktop-like volumes, and while Intel is good at manufacturing, we can't rule these out as concerns.

Performance

As interesting as Larrabee sounds, it's not going to arrive for at least another year. NVIDIA should have even higher-performing parts out by then, making GT200 look feeble by comparison. If Intel can't deliver a real advantage over the best from NVIDIA and AMD, Larrabee won't get very far; it will be little more than a neat architecture.

Drivers and Developer Relations

Intel's driver team is hardly the company's strong point today. On the integrated graphics side we continue to see tons of issues; even while testing the new G45 platform we're still bumping into driver-related problems, and we're hearing, even from within Intel, that the IGP driver team leaves much to be desired. Remember that NVIDIA as a company is made up mostly of software engineers; drivers are paramount to making a GPU successful, and Intel hasn't proven itself here.

I asked Intel who was working on the Larrabee drivers; thankfully, the current driver team remains hard at work on the existing IGP platforms and not on Larrabee. Intel has a number of its own software engineers working on Larrabee's drivers, as well as a large team that came over from 3DLabs. It's too early to say whether this is a good thing, and we have no idea of Intel's capabilities from a regression-testing standpoint, but great architecture or not, drivers can easily decide the winner in the GPU race.

Developer relations are also very important. Remember the NVIDIA/Assassin's Creed/DirectX 10.1 fiasco? NVIDIA's co-marketing campaign with nearly all of the top developers is an incredibly strong force. While Intel has the clout to talk to game developers, we're bound to see the clash of two impossibly strong forces here.

Comments

  • phaxmohdem - Monday, August 4, 2008 - link

    Can your mom play Crysis? *burn*
  • JonnyDough - Monday, August 4, 2008 - link

    I suppose she could but I don't think she would want to. Why do you care anyway? Have some sort of weird fetish with moms playing video games or are you just looking for another woman to relate to?

    Ooooh, burn!
  • Griswold - Monday, August 4, 2008 - link

    He is looking for the one playing his mom, I think.
  • bigboxes - Monday, August 4, 2008 - link

    Yup. He worded it incorrectly. It should have read, "but can it play your mom?" :p
  • Tilmitt - Monday, August 4, 2008 - link

    I'm really disappointed that Intel isn't building a regular GPU. I doubt that bolting a load of unoptimised x86 cores together is going to be able to perform anywhere near as well as a GPU built from the ground up to accelerate graphics, given equal die sizes.
  • JKflipflop98 - Monday, August 4, 2008 - link

    WTF? Did you read the article?
  • Zoomer - Sunday, August 10, 2008 - link

    He had a point. More programmable == more transistors. Can't escape from that fact.

    Given an equal number of transistors, running the same program, a more programmable solution will always be crushed by fixed-function processors.
  • JonnyDough - Monday, August 4, 2008 - link

    I was wondering that too. This is obviously a push towards a smaller Centrino-type package. Imagine a powerful CPU that can push graphics too. At some point this will save a lot of battery juice in a notebook computer, along with space. It may not be able to play games, but I'm pretty sure it will make for some great basic laptops someday that can run video. Not all college kids and overseas Marines want to play video games. Some just want to watch clips of their family back home.
  • rudolphna - Monday, August 4, 2008 - link

    As interesting and cool as this sounds, this is even more bad news for AMD, who was finally making up lost ground. Granted, it's still probably two years away, and hopefully AMD will be back to its old self (Athlon64 era) by then. They are finally getting products that can actually compete. Another challenger, especially from their biggest rival, Intel, cannot be good for them.
  • bigboxes - Monday, August 4, 2008 - link

    What are you talking about? It's been nothing but good news for AMD lately. Sure, let Intel sink a lot of $$ into graphics. Sounds like a win for AMD (in a roundabout way). It's like AMD investing in a graphics maker (ATI) instead of concentrating on what makes them great. Most of the Intel supporters were all over AMD for making that decision. Turn this around and watch Intel invest heavily into graphics and it's a grand slam. I guess it's all about perspective. :)
