The Larrabee Question

Let's talk about Larrabee.

NVIDIA pointed out that Larrabee x86 isn't binary compatible with other Intel x86 processors (since it doesn't support any of the SSEs) - so there's no advantage there.

Honestly, x86 today is a burden for Larrabee rather than a boon; it is not a desirable ISA from anything other than a compatibility standpoint. The difference between G2xx and Larrabee is in the programming model, not the ISA. The threading model is what developers' complaints about G2xx are really about.

NVIDIA says that it simply takes a new approach to development - a focus on data in and data out, rather than conventional top-to-bottom function coding. The issue is that programmers don't like to change the way they work.
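To make the contrast concrete, here is a minimal sketch of the two styles in Python. This is purely illustrative: `apply_kernel` is a hypothetical stand-in for a parallel launch, not a real CUDA or NVIDIA API.

```python
def brighten_sequential(pixels, amount):
    # Conventional top-to-bottom coding: one loop walks the data in order,
    # and the programmer owns the iteration.
    out = []
    for p in pixels:
        out.append(min(p + amount, 255))
    return out

def brighten_kernel(p, amount):
    # Data-in/data-out style: describe what happens to ONE element;
    # the runtime is then free to run this over every element in parallel.
    return min(p + amount, 255)

def apply_kernel(kernel, pixels, amount):
    # Hypothetical stand-in for a parallel launch (think: one GPU thread
    # per pixel). Here it is just a comprehension, but nothing in the
    # kernel depends on the order elements are processed.
    return [kernel(p, amount) for p in pixels]
```

The point of the second style is that the kernel has no loop and no ordering assumptions, which is what lets the hardware decide how to spread the work across thousands of threads.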

The real question is: when Larrabee ships, will its threaded programming model be significantly easier than G2xx's? At this point it's simply too early to tell. Intel thinks it will be, and many of the developers I've spoken to agree, but NVIDIA keeps arguing that Larrabee's programming model will be just as different as CUDA's, and that NVIDIA has the inherent advantage here because of the experience it has gained building GPUs over the past 15 years.

Would NVIDIA Integrate a CPU?

David Kirk summarized, quite well, his thoughts on whether NVIDIA would ever pursue putting a CPU on die next to one of its GPUs.

Kirk's view is that at the low end there's a place for a single-chip CPU/GPU; he views integration (rightly so) as a low-cost play. "None of our customers ask us for less performance, why would we ever take away part of our GPU and put a CPU in it?"

NVIDIA currently competes at the low end of the GPU market with its sub-$75 GPUs and IGP chipsets. An integrated CPU/GPU stands a real chance of eating into NVIDIA's highest-volume market, and it doesn't look like NVIDIA can compete there - at least in x86 desktops/notebooks. Why would you pay more for an NVIDIA chipset with integrated graphics if you already get integrated graphics with every single CPU you buy?

We've got a future where AMD/Intel ship these hybrid CPU/GPUs on the low end, GPUs like the RV770 and Larrabee at the high end, and NVIDIA is already being pushed out on the chipset side (neither Intel nor AMD wants to be the #2 manufacturer of chipsets for their own CPUs). In the worst case scenario, if NVIDIA gets squeezed by everything I just mentioned over the next few years, what is NVIDIA's strategy going forward? Jen-Hsun actually highlighted one possible direction...


18 Comments


  • Gary Key - Wednesday, August 27, 2008 - link

End of next week or Monday 9/8 for the 790GX plus 780a comparison/update. We retested with the 8.8 drivers this week, and they changed the scope and tone of the story we had almost completed. The G45 piece will be up right before it, just to show where Intel is at this point - which, honestly, is not far, considering the driver and repeater problems.
  • Theunis - Thursday, August 28, 2008 - link

    Don't forget 790GX and G45 on Linux tests! Man I'm getting worried about Linux being left in the dark when it comes to hardware decoding for H.264 :(
  • tayhimself - Tuesday, August 26, 2008 - link

    Flat out stating that it wasn't too interesting. Nvidia are in a difficult position and playing their cards very close to their chest.
  • DigitalFreak - Tuesday, August 26, 2008 - link

    The only reasons Nvidia is being forced out of the chipset market are:

    1) They're being assholes when it comes to SLI compatibility with non Nvidia chipsets. Neither Intel nor AMD need Nvidia chipsets anymore. Both have well designed products to cover their entire markets.

    2) Their chipset products are buggy as hell. When's the last time Nvidia released a chipset that didn't cause some type of data corruption? Nforce4? Nforce2?
  • DigitalFreak - Tuesday, August 26, 2008 - link

    As far as Lucid goes, do you really think Intel would be dumping boatloads of cash into this outfit if they didn't think the technology held promise? It's not going to cure world hunger, but it sounds like the Nvidia PR machine is spinning up 'cause they're getting worried.
  • Griswold - Wednesday, August 27, 2008 - link

    Right. That's why Intel dumped billions and billions on the HUGELY successful Itanium, which was intended to eventually replace x86 in the consumer space as well. And it did! No, wait - it didn't...
  • JarredWalton - Tuesday, August 26, 2008 - link

    $50M isn't really "boatloads" to Intel - I think that's the value I heard in one of the reports? R&D is expensive, and if Hydra/Lucid ends up going nowhere Intel won't worry too much - they'll probably still get some patents and other interesting info from the whole process.
