Whether we've got a low-end or high-end system, we all expect the realtime 3D revolution to continue until we achieve near parity with reality. The push forward is backed by many factors, including raw hardware performance and brilliant advances in techniques for better approximating what we see. But there's another side to the equation beyond hardware and developers: the graphics API.

Unlike CPUs, GPUs do not share a common instruction set upon which tools and software can be built. To get the power of the hardware out to the public, we need a common interface that works no matter what GPU is underneath. It's left to the graphics hardware designer to take the commands issued through this application programming interface (API) and translate them into something their chip can use. Because it's the developer's single point of contact with the hardware, the graphics API is incredibly important: it defines how much flexibility programmers have in using the hardware and shapes the world of high-performance realtime 3D graphics.

Some of the key work done through the graphics API is taking descriptions of 3D objects in a 3D world, sending those objects and other resources to the hardware, and then telling the hardware what to do with them. This work follows a step-by-step process we generally call a pipeline. Graphics API pipelines are divided into stages, each performing a different kind of work. Here's the general structure of a 3D graphics pipeline:

First, vertex data (information about the positions of the corners of shapes) is taken in and processed. Those shapes can then be further manipulated and re-processed if needed. After this, the 3D shapes are projected onto the 2D screen and broken down into fragments corresponding to pixels (this step is called rasterization). Each of these fragments is then processed by looking up texture information, applying lighting techniques, and so on. When the pixels are finished processing, they are output and displayed on the screen. And that's the mile-high overview of how 3D graphics work.
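To make those stages concrete, here is a minimal sketch of how a single draw call walks the pipeline using the Direct3D 10 API (the current generation as of this writing). The DrawMesh helper and all resource names are hypothetical; creating the device, buffers, and shaders is assumed to have happened at startup.

```cpp
#include <d3d10.h>

// Hypothetical helper: pushes one mesh through each pipeline stage.
// All resources are assumed to have been created during startup.
void DrawMesh(ID3D10Device* device,
              ID3D10InputLayout* layout,
              ID3D10Buffer* vertexBuffer,
              UINT stride, UINT vertexCount,
              ID3D10VertexShader* vs,
              ID3D10PixelShader* ps,
              ID3D10ShaderResourceView* texture)
{
    UINT offset = 0;

    // 1. Input assembler: feed in the vertex data (shape corner positions).
    device->IASetInputLayout(layout);
    device->IASetVertexBuffers(0, 1, &vertexBuffer, &stride, &offset);
    device->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    // 2. Vertex stage: transform each corner; the geometry stage could
    //    further manipulate whole shapes, but is left unused here.
    device->VSSetShader(vs);
    device->GSSetShader(NULL);

    // 3. Pixel stage: after the rasterizer projects triangles into 2D
    //    fragments, this shader colors each one using texture lookups,
    //    lighting math, and so on.
    device->PSSetShader(ps);
    device->PSSetShaderResources(0, 1, &texture);

    // 4. Output: shaded pixels land in the render target for display.
    device->Draw(vertexCount, 0);
}
```

Note that rasterization is not something the application calls directly; it happens automatically between the vertex/geometry stages and the pixel stage.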

For the past dozen years (it seems longer, doesn't it?), we've seen makers of 3D graphics hardware accelerate two very prominent APIs: OpenGL and DirectX.

We recently touched on advancements tangential to OpenGL in our OpenCL article, but today our focus is DirectX. Microsoft's DirectX graphics API is much more heavily used in game engines than OpenGL, in large part because DirectX tends to move much more quickly and sets the bar for hardware in terms of feature set and flexibility. That always makes upcoming versions of DirectX exciting to talk about: they define the future capabilities of hardware and expose improved tools to developers. Upcoming DirectX versions are glimpses into our graphical future. Currently we have a lot of DirectX 9 and DirectX 10 games available and in development, but DirectX 11 looms on the horizon.

As usual, Microsoft will be trying to time the release of its next DirectX revision with the release of compatible graphics hardware. As with DirectX 10 and Vista, DirectX 11 will be released alongside Windows 7. With the Windows 7 Beta already under way, we expect the OS to be done some time this year.

Microsoft has been rather aggressive with Windows 7 scheduling in light of the rejection of Vista, so it appears they are stepping up to the plate to get everything out sooner rather than later. There was a little more than four years between the releases of DirectX 9 and DirectX 10. Having hit the streets with Vista in January 2007, DirectX 10 has just turned two, and we are already anticipating its replacement in the very near future. As we will learn, this speedy transition should be very good for DirectX 11 adoption, as DirectX 10 hasn't even become pervasive yet: many games are still DirectX 9 only.

But let's take a closer look at what we are talking about before we go any further.

Introducing DirectX 11: The Pipeline and Features

  • epyon96 - Sunday, February 1, 2009 - link

    That's very insightful. Can you go into more detail?

    I am confused, because there appeared to be significant differences between DX 9.0c and DX 9.0b, since NVIDIA made it sound like the difference was like the one between DX 8.1/2/3 and DX 8.4, which did seem very significant if memory serves me right.

    The difference between 8.4 and 9 seemed minimal in the quality of the final output.
  • GourdFreeMan - Monday, February 2, 2009 - link

    The guidelines I spoke of were mentioned on the MSDN Forums circa 2003 regarding how changes to Direct3D would affect DirectX versioning, but seem to have been abandoned in favor of the bimonthly SDK updates following the DX 9.0c release. Bimonthly updates led to faster bug fixes, which in prior versions of DirectX sometimes required a letter update.

    If you are interested in the exact technical changes between DirectX versions, I suggest downloading the old SDK versions prior to the move to bimonthly updates and looking at the Changes section of the documentation.

    Regarding the move from DirectX 8 to DirectX 9, Shader Model 2.0 was introduced, making way for games such as Far Cry (admittedly Far Cry was a DX 9.0b game, but the changes from 9.0 to 9.0b mainly involved SM 2.0a and SM 2.0b, which for Far Cry meant enhanced performance on ATi and nVIDIA cards). Far Cry would later be patched to support DX 9.0c and SM 3.0, adding features like HDR, but I would argue that the unpatched game still looked considerably better than DX8 titles.

    (Incidentally, there is no DirectX 8.3 or 8.4; there were 8.1a and 8.1b in the progression instead.)
  • epyon96 - Saturday, January 31, 2009 - link

    I wish the article had more background on what you just hypothesized (obviously with some substantiated facts) instead of the unnecessary Vista bashing. It would satisfy an actual curiosity.

    I remember that's one of the reasons why the in-depth analysis of the development cycle of the RV770 was so well liked.
  • gamerk2 - Saturday, January 31, 2009 - link

    The issue with DX11 is this: you need to supply a DX10 codepath for those who won't update their graphics cards (you can't release a game no one has hardware for), but you would also need a DX9 codepath for XP.

    Why would anyone release a game with three separate graphics codepaths? It's for that reason I see slow uptake of DX11 as long as XP holds 15-20% market share.
  • ltcommanderdata - Saturday, January 31, 2009 - link

    If I remember those OS market share reports correctly, as of the end of last year Windows XP had about 65% market share, Vista had about 20% after 2 years, and the Mac was nearing 10%. Even if Windows 7 is a roaring success, XP just has too much built-up market share to disappear overnight, so XP and DX9 compatibility will be required for at least another 2 years. The other thing that works against Windows 7 is that even if it isn't released until next year, its introduction looks to be right in the middle of this economic recession, since things probably won't really pick up until late 2010 or 2011. When the economy does pick up again, there will be huge demand as companies finally switch from XP, which would be 10 years old by then, but the first year of Windows 7 sales will probably be slow.
  • bobvodka - Saturday, January 31, 2009 - link

    Well, to be fair, you don't have to have a DX10 path and a DX11 path as such. A few important features work on DX10 cards anyway, such as the multi-threaded rendering stuff, so you need a DX11 and a DX9 path at most; you just have to do some feature detection to find out if you are on a DX11, DX10 or DX10.1 card.

    Still a slight pain, but not as much as 3 real code paths.
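
    A rough sketch of that feature detection, based on the feature-level mechanism in the Direct3D 11 preview SDK (exact names could change before the final release):

    ```cpp
    #include <d3d11.h>

    // Ask the runtime for the best feature level the installed GPU
    // supports, then branch per-feature instead of writing whole
    // separate rendering codepaths.
    D3D_FEATURE_LEVEL CreateDeviceForBestLevel(ID3D11Device** outDevice,
                                               ID3D11DeviceContext** outContext)
    {
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,   // true DX11 card
            D3D_FEATURE_LEVEL_10_1,   // DX10.1 card
            D3D_FEATURE_LEVEL_10_0,   // DX10 card
        };
        D3D_FEATURE_LEVEL got = (D3D_FEATURE_LEVEL)0;

        HRESULT hr = D3D11CreateDevice(
            NULL,                        // default adapter
            D3D_DRIVER_TYPE_HARDWARE,
            NULL, 0,                     // no software rasterizer, no flags
            wanted, sizeof(wanted) / sizeof(wanted[0]),
            D3D11_SDK_VERSION,
            outDevice, &got, outContext);

        if (SUCCEEDED(hr) && got < D3D_FEATURE_LEVEL_11_0) {
            // DX10-class hardware: same API, but skip DX11-only
            // features such as hardware tessellation.
        }
        return got;
    }
    ```

    That gives one Direct3D 11 codepath with per-feature checks, plus the separate DX9 path for XP.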
  • DarkMadMax - Saturday, January 31, 2009 - link

    And the main reason is consoles. There are practically no PC exclusives anymore among large-budget titles (i.e., the ones that concentrate on graphics), so all games target Xbox 360 hardware (if they don't, they are PS3 exclusives). So until a new generation of consoles appears, there will be no progress in graphics. Period.

  • haukionkannel - Saturday, January 31, 2009 - link

    To me, this article mostly talks about the new features of DX11 and how some fundamental features can also benefit DX10 and DX10.1 hardware...
    It seems the Vista part was only there to explain why there aren't any real DX10 games yet, even though the features are there. I didn't read it as Vista hate the way many people here seem to.
    All in all, it was a very good article about how DX11 can allow those promises that DX10 made to flourish better this time.
  • scruffypup - Saturday, January 31, 2009 - link

    My point is that though this article was supposed to be about DirectX 11, Derek's bias and opinion about Vista overshadowed the subject of the article...

    This article shows poor writing at its finest. After all, doesn't Writing 101 teach one to make the article about the subject you are writing about and not something else?

    Again I say, Derek does a disservice to AnandTech with this bias. If you want to put in your bias towards an unrelated subject, at least clearly show the links (relevancy) to your intended subject material and how you come to a conclusion supporting that claim, rather than just spouting off needlessly... for that is what you have done essentially, as it held no relevance to the subject material the way you wrote the article.
  • chizow - Saturday, January 31, 2009 - link

    Article summary:

    1) DX11 offers nothing new over DX10; as quoted in the article, it's just a strict superset that builds on and adds features to DX10 capability.
    2) Vista and DX10 sucked because no one wanted to use them.

    Derek, like many others I disagree with your assessment of Vista's importance in the overall OS hierarchy, here's just a quick list:

    1) First OS to bring 64-bit support to the mainstream.
    2) First OS to offer multi-threaded driver improvements. Look at Release 180 and the Catalyst 8.12 hotfix, where multi-threaded drivers are all the rage.
    3) First OS to offer DX10 support. We're finally seeing some of the performance benefits we were promised with DX10, with multi-threaded drivers and improved AA from reading the multi-sample depth buffer.
    4) Much better OS stability compared to XP. It wasn't always the case, but contrary to your article, most of the problems were fixed by the various video hotfixes in July/August (Ryan Smith can probably confirm or deny this).

    I think Win7 just emphasizes how good Vista is, and how many light-years ahead both are compared to XP. You could say Win7 is like Mojave SE, not Vista SE, as you can clearly see all the Vista haters who are running Win7 glowing about all the features and stability they've missed out on for at least a year (since Vista SP1).
