DirectX 10

Visual changes aside, there are numerous changes under the hood of Vista, and for much of our audience, DirectX 10 will be the biggest of them. DirectX 10 has occupied an odd place recently in what amounts to computer mythology, as it has been in development for several years now while Microsoft has extended DirectX 9 to accommodate new technologies. Previously, Microsoft had done a pretty good job of providing yearly updates to DirectX.

Unlike previous iterations of DirectX, version 10 will be launched in a different manner due to all the changes to the operating system needed to support it. Of greatest importance, DirectX 10 will be a Vista-only feature; Microsoft will not be backporting it to XP. DirectX 10 not only includes support for new hardware features, but relies on some significant changes Microsoft is making to how Windows treats GPUs and interfaces with them, requiring the clean break. This may pose a problem for users who want to upgrade their hardware without upgrading their OS. It is likely that driver support will allow for DX9 compatibility, while new feature support could easily be added through OpenGL extensions, but the exact steps ATI and NVIDIA will take to keep everyone happy will have to unfold over time.

There seems to be some misunderstanding in the community that DX9 hardware will not run with DirectX 10 installed, or with games designed using DirectX 10. It has been a while, but this transition (under Vista) will be no different to the end user than the transitions to DirectX 8 and 9, when users with older DirectX 7 hardware could still install and play most DX8/9 games, only without the pixel and vertex shader effects. New games which use DirectX 10 under Vista will, when running on older DX9 hardware, gracefully fall back to the appropriate level of support. We've only recently begun to see games that refuse to run on DX8-level hardware, and it isn't likely we will see DX10-only games for several more years. Upgrading to Vista and DX10 won't absolutely require a hardware upgrade; the benefit comes in the advanced features made possible.

While we'll have more on the new hardware features supported by DirectX 10 later this year, we can talk a bit about what we know now. DirectX 10 will be bringing support for a new type of shader, the geometry shader, which allows for the modification of triangles in the middle of rendering at certain stages. Microsoft will also be implementing some technology from the Xbox 360, enabling the practical use of unified shaders like we've seen on ATI's Xenos GPU for the 360. Although DirectX 10 compliance does not require unified hardware shaders, the driver interface will be unified. This should make things easier for software developers, while at the same time allowing hardware designers to approach things in the manner they see best. Pixel and vertex shading will also be receiving some upgrades under the Shader Model 4.0 banner.
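To make the geometry shader's role more concrete, here is a purely conceptual sketch in Python. Real geometry shaders are written in HLSL and run on the GPU, and the subdivision rule below is our own illustrative choice; the point is only that, unlike a vertex shader, this stage consumes whole primitives and can emit new ones mid-pipeline:

```python
# Conceptual sketch of a geometry-shader-like stage (illustrative only;
# real geometry shaders are HLSL programs running on the GPU).

def midpoint(a, b):
    """Average two points component by component."""
    return tuple((x + y) / 2 for x, y in zip(a, b))

def subdivide(tri):
    """Split one triangle into four, akin to a GS doing simple tessellation."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def geometry_stage(triangles):
    # A vertex shader sees one vertex at a time and cannot create geometry;
    # this stage consumes whole primitives and may emit additional ones.
    out = []
    for tri in triangles:
        out.extend(subdivide(tri))
    return out

tris = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
print(len(geometry_stage(tris)))  # one triangle in, four triangles out
```

The ability to amplify or discard primitives like this is what distinguishes the geometry shader from the per-vertex and per-pixel stages that preceded it.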


DirectX 10 will also implement some significant optimizations to the API itself, as the continuous layering of DirectX versions on top of one another, along with CPU-intensive pre-rendering techniques such as z-culling and hidden surface removal, has resulted in a fairly large amount of overhead being placed on the CPU. Using their developer tools, ATI estimates that the total CPU time spent working directly on graphics rendering (including overhead) can approach 40% in some situations, which has left some games CPU limited solely due to this overhead. With these API changes, DX10 should remove a good deal of the overhead; while a significant amount of CPU time will still be required for rendering (20% in ATI's estimate), the roughly 20% savings can ultimately be used to render more frames, or more complex ones. Unfortunately, these API changes work in tandem with hardware changes that support them, so these benefits will only be available on DirectX 10 class hardware.
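The arithmetic behind those percentages is worth spelling out. In this sketch the 40% and 20% figures are ATI's estimates from the article, while the per-frame game workload is a hypothetical number of our own choosing, used only to show how shrinking the overhead raises the CPU-limited frame rate ceiling:

```python
# Back-of-the-envelope model of API/driver overhead on a CPU-limited game.
# The 40%/20% overhead shares come from the article; game_work_ms is assumed.

def max_fps(game_work_ms, overhead_fraction):
    """CPU-limited frame rate when overhead_fraction of all CPU time
    goes to graphics API/driver overhead, so total CPU time per frame
    is game_work_ms / (1 - overhead_fraction)."""
    total_ms = game_work_ms / (1.0 - overhead_fraction)
    return 1000.0 / total_ms

game_work_ms = 12.0  # hypothetical non-graphics CPU work per frame

dx9_fps = max_fps(game_work_ms, 0.40)   # ~40% overhead under DX9
dx10_fps = max_fps(game_work_ms, 0.20)  # ~20% overhead under DX10

print(f"CPU-limited ceiling, DX9:  {dx9_fps:.0f} fps")   # 50 fps
print(f"CPU-limited ceiling, DX10: {dx10_fps:.0f} fps")  # 67 fps
```

In other words, halving the overhead share doesn't just save idle cycles; in a CPU-limited title it translates directly into a higher attainable frame rate.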

The bigger story at the moment with DirectX 10, however, is how it also forms the basis of Microsoft's changes to how Windows treats and interfaces with GPUs. Under current GPU designs and the operating system's treatment of them, GPUs are single-access devices; one application is effectively given sovereign control of the GPU's 3D capabilities at any given moment. To change which application is utilizing these resources, a very expensive context switch must take place, swapping out the resources of the first application for those of the second. This can be clearly seen today when Alt+Tabbing out of a resource intensive game, where it may take several seconds to go in and out, and it is also part of the reason that some games simply don't allow you to Alt+Tab. Windowed rendering avoids some of this problem, but it incurs a heavy performance hit in some situations and is otherwise a less than ideal solution.

With the use of full 3D acceleration on the desktop now with Aero, the penalties for these context switches become even more severe, which has driven Microsoft to redesign DirectX and how it interfaces with the GPU. The result is a new group of interface standards, which Microsoft is calling the Windows Display Driver Model (WDDM), replacing the XP Display Driver Model.

The primary change with the first iteration of the WDDM, which is what will ship with the release version of Vista, is that Microsoft is starting a multi-year plan to influence hardware design so that Windows can stop treating the GPU as a single-tasking device; as GPUs inevitably evolve toward CPU-like designs, the GPU will become a true multi-tasking device. WDDM 1.0 is as a result largely a clean break from the XP DDM: it is based on what current SM2.0+ GPUs can do, with the majority of the change being what the operating system can do to attempt multitasking and task scheduling with modern hardware. For the most part, the changes brought by WDDM 1.0 will go unnoticed by users, but they lay the groundwork for WDDM 2.0.

While Microsoft hasn't completely finalized WDDM 2.0 yet, we do know at this point that it will require a new generation of hardware, again likely the forthcoming DirectX 10 class hardware, built from the ground up to multitask and handle true task scheduling. The most immediate benefit will be that context switches become much cheaper, so applications using APIs that work with WDDM 2.0 will be able to switch in and out in much less time. The secondary benefit is that when multiple running applications want the full 3D features of the GPU, such as Aero alongside an application like Google Earth, their performance will improve due to the faster context switches; at the moment, switching costs mean that even with a perfectly split load, neither application gets anywhere near 50% of the GPU's time, and both fall short of their potential performance in a multitasking environment. Even further in the future will be WDDM 2.1, which will implement "immediate" context switching. A final benefit is that the operating system should now be able to make much better use of graphics memory, so it is conceivable that even lower-end GPUs with large amounts of memory will have a place in the world.
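Why a perfectly split load still leaves each application well short of 50% can be shown with a toy model. The time-slice and switch-cost numbers below are assumptions for illustration, not measurements of any real driver:

```python
# Toy model of two 3D applications time-sharing one GPU: every context
# switch burns GPU time swapping one app's state out for the other's.
# All numbers here are assumed for illustration.

def useful_share(slice_ms, switch_cost_ms):
    """Fraction of each scheduling cycle spent doing real work."""
    return slice_ms / (slice_ms + switch_cost_ms)

slice_ms = 10.0  # each app runs 10 ms before the GPU switches away

expensive = useful_share(slice_ms, 5.0)  # costly switch (today's model)
cheap = useful_share(slice_ms, 0.5)      # cheap switch (the WDDM 2.0 goal)

# With a perfect 50/50 split, each app's effective share of GPU time is:
print(f"per-app share, expensive switches: {expensive / 2:.1%}")  # 33.3%
print(f"per-app share, cheap switches:     {cheap / 2:.1%}")      # 47.6%
```

The gap between those two figures is the multitasking headroom WDDM 2.0 is meant to reclaim; "immediate" switching under WDDM 2.1 would push the per-app share closer still to the ideal 50%.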

In the meantime, Microsoft's development of WDDM comes at a cost: NVIDIA and ATI are currently busy building and optimizing their drivers for WDDM 1.0. Along with Vista itself still being a beta operating system, these beta display drivers are in a very early state, resulting in, as we will see, very poor gaming performance at the moment.


  • stash - Friday, June 16, 2006 - link

    Sleep is more efficient in the long run. Shutting down and doing a cold boot every day uses a lot more electricity than sleep. When the machine is in sleep, it uses a fraction of a single watt. Yes, this is obviously more than zero (completely off), but when you cold boot a system, it uses many times more power.

    As a side benefit, you get back to where you left off almost instantly because sleep combines standby with hibernation.
  • Griswold - Friday, June 16, 2006 - link

    Oh so wrong. Why would a cold boot use more power? Because the HDDs spin up? Going from sleep to full on does the same. Because the OS has to be loaded from the HDD? Sleep mode also writes to disk. And that's actually it. This is a computer, not an engine that uses more fuel at startup than when it runs.
  • stash - Friday, June 16, 2006 - link

    When you can resume from sleep in a few seconds compared to 45-60 seconds from a cold boot, then yes, a cold boot uses much more power.
  • johnsonx - Friday, June 16, 2006 - link

    Stash, your logic is faulty. Please give up.
  • stash - Friday, June 16, 2006 - link

    Why should I give up? How is my logic faulty?
  • smitty3268 - Friday, June 16, 2006 - link

    <I>Is Expose the same as the new compiz and XGL?</I>

    No, that is more like the Aero interface or OSX's Quartz Extreme. Expose lets you hit a button and then automatically scales and moves every window so that you can see them all and pick out which program you want to use. Think of it as a replacement for ALT-TAB. There is a plugin in compiz that does the same thing.
  • Locutus465 - Friday, June 16, 2006 - link

    Not sure what your issues with 3D were; I only skimmed the article so I'm not sure which video card you used... But it's possible that if you're using ATI you experienced problems due to their drivers. I've seen many more ATI issues in the MS groups than NVIDIA. My 7800GT has no problem with 1600x1200 (full 3D acceleration, no apparent crashing). My only concern with Vista 64 is drivers... As of right now there's no driver available for the Promise Ultra100 TX2 controller card, which is a huge issue for me as I have my secondary drive on it (used to store installers and as my page file drive in XP). I hope MS manages to convince vendors to support 64b as well as 32b is supported. When I do upgrade to Vista, it will be to 64b.
  • JarredWalton - Friday, June 16, 2006 - link

    Page 10: using 6800 Ultra card.

    The problem lies with the drivers (64-bit versions are still being worked on), with the OS (also still being worked on), and with the increased resource requirements of 64-bit mode. Compatibility with various hardware is already worse under Vista, and 64-bit mode is worse still. Can they fix it before shipping? Hopefully, and one way or another we're going 64-bit in the future.

    It could be that other test systems would be more or less stable, but with a preview of Vista Beta 2 that's really too much extra work. The article was already over 12000 words, so trying it out on five other platforms would make this monolithic task even more daunting. The bottom line is that Vista is still interesting, but it's definitely not ready for release. There's a good reason it has been delayed until 2007, just like the XP x64 delays in the past.
  • DerekWilson - Friday, June 16, 2006 - link

    We have tested Vista with both ATI and NVIDIA drivers and see similar issues between the two. While the numbers were gathered under NVIDIA hardware, we are confident that the same patterns would emerge with ATI at this point in time.
  • Locutus465 - Friday, June 16, 2006 - link

    Well, weird... I've had my share of beta issues, but thus far Glass + 3D acceleration hasn't been one of them. I have noticed that installing QuickTime 7 on Vista (at least in my case) renders Vista Ultimate 64 unbootable.
