The Awesome Potential of Fully Programmable Graphics

Certainly we can't judge the applicability and impact Larrabee will have until we see how it handles real-world applications. But we absolutely cannot write off a giant like Intel when they throw their chips into the pot. Some of the current graphics hardware establishment have tried to suggest to us that Intel is out of touch with the current development community, and that the only reason some developers are excited about Larrabee's extensive low-level programmability is nostalgia for the old days of graphics programming, when it was all about the software renderer.

I don't think anyone is under the illusion that DirectX and OpenGL performance is irrelevant for Larrabee. If Intel fails to deliver equivalent or greater price/performance in games and applications that use these APIs, then no matter how well the hardware could serve a custom software engine, it will fail. But the potential to customize every part of the rendering pipeline, and the ability to support a software renderer with the same level of performance as if the hardware had been customized for it, adds a level of value for the development community that will absolutely blow away anything NVIDIA or AMD can currently offer (or will be able to offer for the foreseeable future).
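
As a purely illustrative aside, here is a minimal sketch, in plain C++ with made-up types and names (not Larrabee's ISA, and not DirectX or OpenGL), of what "the pipeline is just code" means in practice: the rasterization loop and the pixel shading stage are ordinary functions, and a developer is free to replace either one wholesale instead of working around fixed-function hardware.

// Hypothetical illustration of a software rendering stage -- every name
// here is invented for this sketch; nothing is drawn from Larrabee itself.
#include <algorithm>
#include <cstdint>
#include <functional>
#include <vector>

struct Vec2  { float x, y; };
struct Color { std::uint8_t r, g, b; };

// The "pixel shader" is just a function object: swap it out and the
// entire shading stage changes, because there is no fixed hardware stage.
using PixelShader = std::function<Color(float u, float v, float w)>;

struct Framebuffer {
    int width, height;
    std::vector<Color> pixels;
    Framebuffer(int w, int h) : width(w), height(h), pixels(w * h) {}
};

// Signed area spanned by (b-a) and (c-a); used for both the inside test
// and the barycentric weights.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// A fully software rasterization stage: walk the triangle's bounding box,
// test each pixel against the three edges, and call whatever shader the
// developer supplied.
void rasterizeTriangle(Framebuffer& fb, Vec2 v0, Vec2 v1, Vec2 v2,
                       const PixelShader& shade) {
    float area = edge(v0, v1, v2);
    if (area == 0.0f) return;  // degenerate triangle

    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(fb.width - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(fb.height - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = minY; y <= maxY; ++y) {
        for (int x = minX; x <= maxX; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            float w0 = edge(v1, v2, p);
            float w1 = edge(v2, v0, p);
            float w2 = edge(v0, v1, p);
            // Accept pixels on the same side of all three edges,
            // regardless of triangle winding.
            if ((w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                (w0 <= 0 && w1 <= 0 && w2 <= 0)) {
                fb.pixels[y * fb.width + x] =
                    shade(w0 / area, w1 / area, w2 / area);
            }
        }
    }
}

int main() {
    Framebuffer fb(256, 256);
    // A "shader" that tints by barycentric weights -- replace with anything.
    PixelShader gradient = [](float u, float v, float w) {
        return Color{(std::uint8_t)(u * 255),
                     (std::uint8_t)(v * 255),
                     (std::uint8_t)(w * 255)};
    };
    rasterizeTriangle(fb, {20, 20}, {230, 60}, {120, 220}, gradient);
    return 0;
}

The point is not this toy rasterizer, but that on a fully programmable part every stage, from triangle setup to shading to whatever a developer dreams up in their place, is simply code the developer controls.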

Re-opening the door for Tim Sweeney, John Carmack, Michael Abrash, and other pioneers and visionaries in the field of 3D graphics to once again take a piece of hardware that offers the kind of data-parallel speed heretofore limited to the GPU and literally do anything they want with it is something to be excited about. Being constrained far less by the physical design of the hardware, and once again limited only by the performance of any given segment of code, could help speed up the transition of techniques from SIGGRAPH to games. Larrabee could help create a new wellspring of research, experimentation, and techniques for real-time graphics, the likes of which have not been seen since the mid-to-late 1990s.

We have absolutely been seeing the current graphics hardware giants move toward more flexibility and programmability. But if Intel is able to effectively leap-frog their slow, DX-version-by-DX-version trudge toward true general purpose programmability, we will see the end of an era in which games are feature-limited by hardware. No longer will we need new hardware to handle a new DX version with new techniques and effects: we would only need a driver update to add support for the new API. The only obstacle to running games built on future APIs will be performance, and the only reason to upgrade will be speed. It will be a different world, altogether different from anything we've known or experienced before, yet incredibly similar to the roots from which the industry was born.

It is an exciting time to be in the field of computer graphics.

Comments

  • DerekWilson - Monday, August 4, 2008 - link

    this is a pretty good observation ...

    but no matter how much potential it has, performance in games is going to be the thing that actually makes or breaks it. it's of no use to anyone if no one buys it. and no one is going to buy it because of potential -- it's all about whether or not they can deliver on game performance.
  • Griswold - Monday, August 4, 2008 - link

    Well, it seems you don't get it either.
  • helms - Monday, August 4, 2008 - link

    I decided to check out the development of this game I heard about ages ago that seemed pretty unique, not only the game but also the game engine behind it. Going to the website, it seems Intel acquired them at the end of February.

    http://www.projectoffset.com/news.php
    http://www.projectoffset.com/technology.php

    I wonder how significant this is.
  • iwodo - Monday, August 4, 2008 - link

    I forgot to ask, how will the software renderer work out on the Mac? Since all DirectX code is run through a software renderer, doesn't that fundamentally mean most current Windows-based games could run on the Mac with little work?
  • MamiyaOtaru - Monday, August 4, 2008 - link

    Not really. Larrabee will be translating DirectX to its software renderer. But unless Microsoft ports the DirectX API to OSX, there will be nothing for Larrabee to translate.
  • Aethelwolf - Monday, August 4, 2008 - link

    I wonder if game devs can write their games in DirectX, then have the software renderer convert it into Larrabee's ISA on the Windows platform, capturing the binary somehow. Distribute the DirectX version on Windows and the software ISA version for the Mac. No need for two separate code paths.
  • iwodo - Monday, August 4, 2008 - link

    If anyone could just point out which of the assumptions Anand makes are false, that would be great, because what he is saying is simply too good to be true.

    One point to mention: the 4MB cache takes up nearly 50% of the die size. So if Intel could rely more on bandwidth and save on cache, they could put in a few more cores.

    And am I the only one who thinks 2010 is too far away for the introduction? Summer 2009 seems like a much better time. Then they would have another 6 - 8 months before they move on to 32nm with higher clock speeds.

    And as for the game developers, with the cash Intel has: 10 million for every high-profile studio like Blizzard, 50 million to EA to optimize for Intel. It would only cost them 100 million in pocket money.
  • ZootyGray - Monday, August 4, 2008 - link

    I was thinking of all the p90's I threw away - could have made a cpu sandwich, with a lil peanut software butter, and had this tower of babel thing sticking out the side of the case with a fan on top, called lazarus, or something - such an opportunity to utilize all that old tek - such imagery.

    griswold u r funny :)
  • Griswold - Monday, August 4, 2008 - link

    You definitely are confused. Time for a nap.
  • paydirt - Monday, August 4, 2008 - link

    STFU Griswald. It's not helpful for you to grade every comment. Grade the article if you like... Anandtech, is it possible to add an ignore user function for the comments?
