A Tribute to Michael Abrash: The ISA

Some people idolize athletes. Others gravitate towards entertainers. While Derek is a hockey fan and a musician who loves watching movies, his real passion led him in a different direction. And he's going to lapse into the first person singular for a minute to tell you a little more about that.

At the time I was a high school student who needed a good project outside the curriculum to teach to our C++ programming class (this was another one of the excellent projects Jo Adams set her students upon). My good friend Tom Macleod and I had just learned enough calculus and advanced geometry to be dangerous: we decided to write a 3D graphics engine in order to learn and teach graphics programming to the class.

To support this endeavor, I spent a bit of cash (well, my parents' cash anyway) on some graphics and game programming books, and the one that really stood out (the one that set the course of my life) was Michael Abrash's Graphics Programming Black Book Special Edition. This giant tome contained a wealth of collected wisdom on the art and science of code optimization and graphics programming, as well as some great details about the development of Quake.

Not only was his book an incredible source of information and inspiration for me personally, but if there's ever been an x86 assembly guru and graphics programming god who could take the design of an instruction set architecture for Larrabee to a whole other dimension, it's Michael Abrash. And our information indicates that he has done just that.

This isn't to say that others on the Larrabee team don't deserve a spotlight; it's just exciting to see the guy who got me hooked on computer graphics programming (which led to my interest in hardware) show up on such an impressive graphics hardware design team.

For those who haven't idolized Abrash, his Wikipedia entry helps explain his luminary status in the game industry:

"Michael Abrash is a highly regarded technical writer, and one of the top optimization and 80x86 assembly language programmers, a reputation cemented by his 1990 book Zen of Assembly Language Volume 1: Knowledge. Before getting into technical writing, Abrash was a game programmer, having written his first commercial game in 1982. After working at Microsoft on graphics and assembly code for Windows NT 3.1, he returned to the game industry in the mid-1990s to work on Quake for id Software. Some of the technology behind Quake is documented in Abrash's Graphics Programming Black Book. After Quake was released, Abrash returned to Microsoft to work on natural language research, then moved to the Xbox team, until 2001. In 2002, Abrash went to work for RAD Game Tools, where he co-wrote the advanced Pixomatic software renderer, which emulates the functionality of a DirectX 7-level graphics card and is used as the software renderer in such games as Unreal Tournament 2004."

Intel brought Abrash on as a consultant to help define the Larrabee instruction set. For the longest time, extensions to x86 (e.g., SSE4) were designed by Intel engineers at the request of the software community. With every iteration of SSE, the game industry was happier but never truly satisfied with the extensions Intel introduced. When Intel set out to define the extensions to x86 that would be used in Larrabee, it sought out visionaries within the game industry to help define the spec rather than building the hardware and defining the ISA internally. One thing we've consistently heard from game developers is that Larrabee's ISA makes more sense than any other approach they have seen from ATI or NVIDIA. Larrabee's ISA was designed in part by the game industry, for that very industry.

Interestingly enough, while reluctant to go into details about the Larrabee ISA itself, Intel did tell us that fewer than 5% of the instructions are graphics specific. What Intel found is that overly specialized instructions often don't do much good: they can be hard for compilers to use effectively and awkward to hand-optimize with. A good selection of generally applicable, powerful instructions is a better way to go.
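Intel wouldn't show us the instructions themselves, but the principle is easy to illustrate with the general-purpose vector instructions x86 already has. The sketch below (ours, not Intel's; the function name and data layout are invented for illustration) computes four dot products at once using nothing but generic SSE multiplies and adds. Dot products underpin lighting, transformation, and clipping, yet no graphics-specific opcode is needed:

```c
#include <xmmintrin.h>  /* SSE intrinsics: generic vector mul/add */

/* Four 3-component dot products at once, structure-of-arrays style:
 * each __m128 holds one component (x, y, or z) of four different
 * vectors. Only general-purpose multiplies and adds are used. */
static __m128 dot3_x4(__m128 ax, __m128 ay, __m128 az,
                      __m128 bx, __m128 by, __m128 bz)
{
    __m128 r = _mm_mul_ps(ax, bx);              /* ax*bx   */
    r = _mm_add_ps(r, _mm_mul_ps(ay, by));      /* + ay*by */
    r = _mm_add_ps(r, _mm_mul_ps(az, bz));      /* + az*bz */
    return r;
}
```

Widen the same idea from four lanes to Larrabee's vectors and it's easy to see how a small set of powerful, general instructions could cover most of a graphics pipeline.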

One of the advantages of developing the compiler in parallel with the ISA itself is that Intel can test and adapt both as needed to find the right balance. Since the vast majority of developers will rely on compilers to generate high-performance code, making sure the ISA is a good fit for compilers is essential. At the same time, given the renewed interest in software graphics engines that Larrabee is stirring up among the Old Guard of real-time 3D computer graphics, having icons like Michael Abrash on the team will help ensure that the ISA is not only compiler friendly but also attractive to those who wish to achieve Zen through assembly optimization.
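As a concrete (and hypothetical; the function is ours) illustration of what "compiler friendly" means in practice, the scalar loop below has no dependences between iterations, exactly the shape a vectorizing compiler can map onto wide vector instructions automatically, while a hand-optimizer can still rewrite the hot cases in intrinsics or assembly when the compiler falls short:

```c
/* A plain scalar loop: one multiply and one add per element, with no
 * dependences between iterations. A vectorizing compiler can turn
 * this into wide vector code on its own; an assembly programmer can
 * take over only where the compiler's output isn't good enough. */
void scale_add(float *out, const float *a, const float *b,
               float s, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = a[i] * s + b[i];
}
```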

Which brings us to an interesting point.

Comments

  • iop3u2 - Monday, August 4, 2008

    First of all, it's called D3D, not DirectX.

    Secondly, you seem to imply that Direct3D/OpenGL will cease to exist at some point if Larrabee succeeds. I think you don't quite get what they are. They are APIs. Larrabee won't make programming API-less. Are you serious, Anand, or what?
  • The Preacher - Tuesday, August 5, 2008

    It could make programming D3D/OpenGL-less for programs/PCs that exploit Larrabee. And if the share of such programs/PCs increases, the share of competing solutions logically decreases and might eventually vanish (although not anytime soon).
  • iop3u2 - Tuesday, August 5, 2008

    Just because you can, for example, write a C program without the C library doesn't mean people will follow that road. It's all about what programmers choose to do.

    Also, even if they do vanish, there will still be a need for an API. So either there will be a new API or they won't vanish. Either way, it makes no difference to the fact that Larrabee will always need API implementations.
  • ZootyGray - Tuesday, August 5, 2008

    right - and I will put hotels on boardwalk and park place :)

    I used to own an 815 chipset - it was like version 14 or whatever so it didn't suk as bad as some of the earlier ones - but it did blow up - I think pixelated FarCry and Doom3 really killed it. But oh sure, the software fixes and bubblegum patches made it good, for a while. I really do think I am going to wait for this just so I can watch the lineups of returns - or read the funny forum posts of sheep seeking help - baaaahaha :) The best part is that it doesn't exist - delay, postpone - kinda like the 64bit chip also. Maybe later, maybe. But the ads invade the living room.
    Make sure you keep yer getouttajailfree card - receipt.
    Ummm let's see: I think I will buy this one!

    Reality is that 4870x2 is on deck. Not 'rumour and sigh'. I just know there will be a 16page article on that - not!
  • Pok3R - Monday, August 4, 2008

    Larrabee means good news for consumers, and definitely bad news for nvidia. Maybe the worst in decades...with AMD and Ati having enough human resources now to face it, and Nvidia having nothing but bad policies and falling stocks despite good $elling numbers...

    The future, today, is definitely Intel vs AMD/Ati.
  • initialised - Monday, August 4, 2008

    A miniature render farm (you know, like the ones they use to make films like Hulk and WALL-E) on a chip. Let's hope AMD and nVidia can keep up.
  • ZootyGray - Monday, August 4, 2008

    Really? Guess again. There is NOT anything to keep up to.

    I do not accept that the grafx loser in the industry is going to simply become numero uno overnight.

    You really think that nvidia and ati have been sleeping for decades?

    Supporting the destruction of ntel's only competitors leaves us at the mercy of a group that's already been busted for monop and antitrst.

    Well written article? Of course, but I think it's like you are all fished in on many fronts. Nothing is really known except spin. This is beachfront property in the desert.

    There's nothing to watch except what we usually watch - released hardware benchmarks.

    I tell you AMD is going to be the cpu of choice in a few months when the truth about the bias in the benchies is revealed. And try - try real hard - to imagine ati+amd creating the ultimate cpu+gpu powerhouse. ntel needs this hype because I am not the only one with vision here. they are rich and scared, for now.

    but such talk seems to be frowned upon - so let's all cheer for the best grafx manufacturer - ntel = kkaakk! sorry to offend, so many of you just might be lost in the paid mob. so just watch and you will see for yourself- no need to believe me. I really know almost nothing - but I am free to see for myself. sorry to offend - I just can't cosign bs. but that's just me and a very few other posters here who have also been criticized. watch and see for yourself. watch...
  • Mr Roboto - Monday, August 4, 2008

    I'd have to agree with the skeptics here. While the article is well written and informative (what AnandTech articles aren't?), it's purely speculation that Intel can get all of the variables right. How does a company that hasn't made a competitive GPU since the days of the 486 suddenly jump to Nvidia and ATI levels of GPU performance on its first try, never mind surpassing them? It's absolutely absurd to think that these chips are going to replace GPUs in terms of performance. I believe Larrabee will kick the shit out of Intel's own IGP, but then again that's not much of a feat.

    Again I have to agree with previous posters that Intel just isn't that innovative. Even as I speak there are many lawsuits pending against Intel, most of them having to do with accusations of stolen IP used to design the Core2Duo. Antitrust suits aside, it's clear that Intel is similar to MS in that they just bully, bribe, or outright steal to get ahead and then pay whatever fines are levied, because in the end the fines are never large enough to make breaking the law not worthwhile for Intel or MS.

    The 65nm Core2Duo is amazing. The 45nm E8400 I just bought is even more so. However, the more I think about Intel's past failures, as well as how they operate as a company, the more far-fetched this whole thing becomes.

    IMO they should have tried to compete in the dedicated GPU market before trying something like this. From a purely marketing standpoint, Intel and graphics just don't go together. To come into a new field in which they are unproven (I would bet Intel executives believe that building IGPs has somehow given them experience) and make outrageous claims, such as that the GPU is dead and Intel will now be the leader, is absurd.
  • JarredWalton - Tuesday, August 5, 2008

    I think a lot of you are missing the point that we fully understand this is all on paper and what remains to be seen is how it actually pans out in practice. Without the necessary drivers to run DirectX and OpenGL at high performance, this will fail. How many times was that mentioned? At least two or three.

    Now, the other thing to consider is that a modern Core 2 core is far more complex to design than any of the GPU cores out there. You have all sorts of general functions that need to be handled. A modern GPU consists of a relatively simple core that you then repeat 4, 8, 16, 32, etc. times. Intel is doing exactly that with Larrabee. They went back to a simple x86 core and tacked on some serious vector processing power. Sounds a lot like NVIDIA's SP or ATI's SPU, really.

    Fundamentally, they have what is necessary to make this work, and all that remains is to see if they can pull off the software side. That's a big IF, but then Intel is a big company. We have reached the point where GPUs and CPUs are merging - CUDA and GPGPU aim to do just that in some ways - so for Intel to start at the CPU side and move towards a GPU is no less valid an approach than NVIDIA/ATI starting at GPUs and moving towards general purpose CPUs.
  • Midwayman - Monday, August 4, 2008

    I'm not interested in the graphics so much. It may or may not compete with the top-end nvidia chips if released on time. What's more interesting is whether this can easily be used as a general-purpose CPU for non-graphics work. Imagine getting a benefit out of your GPU 100% of the time, not just when you're gaming. I know it's possible to use modern GPUs this way if you code specifically for them, but with its x86 architecture, Larrabee might be able to do it without having apps specifically coded for it.
