Drilling Down: DX11 And The Multi-Threaded Game Engine

In spite of the fact that multi-threaded programming has been around for decades, mainstream programmers didn't start focusing on parallelism until multi-core CPUs came along. Much general purpose code is straightforward as a single thread; extracting performance via parallel programming can be difficult and isn't always obvious. Even with talented programmers, Amdahl's Law is a bitch: the speedup from parallelization is limited by the fraction of the code that must run sequentially.
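To put a number on Amdahl's Law, here is a quick back-of-the-envelope helper (the function name is ours, not from any library):

```cpp
#include <cmath>

// Amdahl's Law: if a fraction p of the work can be parallelized across
// n cores, the best possible overall speedup is 1 / ((1 - p) + p / n).
double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}
```

Even with 90% of the code parallelized, 16 cores only buy you 1 / (0.1 + 0.9/16) = 6.4x, nowhere near 16x; the sequential 10% dominates.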

Currently, in game development, rendering is one of those "necessarily" sequential tasks. DirectX 10 isn't set up to handle multiple threads all throwing commands at the GPU. That doesn't mean renderers can't be parallelized, but it does limit the speedup, because costly synchronization techniques or management threads need to be implemented to make sure nothing steps out of line. All this limits the benefit of parallelization and discourages programmers from trying too hard; after all, it's a better idea to put your effort into areas where performance can be improved more significantly. (John Carmack put it really well once, but I can't remember the quote... and I'm doing too much benchmarking to go look for it now. :-P)

No matter what anyone does, some stuff in the renderer will need to be sequential. Programs, textures, and resources must be loaded up; geometry happens before pixel processing; draw calls intended to be executed while a certain state is active must have that state set first and not changed until completion. Even in such a massively parallel machine, order must be maintained for many things. But order doesn't always matter.

By making more things thread-safe through an extended device interface with multiple contexts, and by making much of the synchronization overhead the responsibility of the API and/or graphics driver, Microsoft has enabled game developers to more easily thread not only their rendering code but their game code as well. These features will also work on DX10 hardware running on a system with DX11, though some missing hardware optimizations will reduce the performance benefit. But the fundamental ability to write code differently will go a long way toward getting programmers more used to, and better at, parallelization. Let's take a look at the tools DX11 provides to accomplish this.

First up is free-threaded asynchronous resource loading. That's a bit of a mouthful, but this feature gives developers the ability to upload programs, textures, state objects, and other resources in a thread-safe way and, if desired, concurrently with the rendering process. This doesn't mean all of this data gets pushed to the GPU in parallel with rendering: the driver still manages what gets sent to the GPU and when, based on priority. But it does mean the developer no longer has to think about synchronizing or manually prioritizing resource loading; multiple threads can start loading whatever resources they need, whenever they need them. The fact that this can happen concurrently with rendering could improve performance for games that stream in data for massive open worlds, in addition to opening up multi-threading opportunities.
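As a rough sketch of the pattern, here is what free-threaded loading looks like in plain C++. Everything here (the `Resource` type, `LoadedQueue`, `load_all`) is invented for illustration; real DX11 code would instead call methods such as `ID3D11Device::CreateTexture2D` directly from the worker threads.

```cpp
#include <mutex>
#include <string>
#include <thread>
#include <vector>

// Several worker threads "create" resources concurrently; no single
// loading thread or manual prioritization is needed, only a small
// thread-safe container for the finished results.
struct Resource { std::string name; };

class LoadedQueue {
public:
    void push(Resource r) {
        std::lock_guard<std::mutex> lock(m_);
        loaded_.push_back(std::move(r));
    }
    std::size_t size() {
        std::lock_guard<std::mutex> lock(m_);
        return loaded_.size();
    }
private:
    std::mutex m_;
    std::vector<Resource> loaded_;
};

// Kick off one loader thread per asset and wait for them all; in a real
// engine the render loop would keep running instead of joining here.
std::size_t load_all(const std::vector<std::string>& assets) {
    LoadedQueue loaded;
    std::vector<std::thread> workers;
    for (const auto& a : assets)
        workers.emplace_back([&loaded, a] { loaded.push(Resource{a}); });
    for (auto& w : workers) w.join();
    return loaded.size();
}
```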

To enable this and other threading, the D3D device interface is now split into three separate interfaces: the Device, the Immediate Context, and the Deferred Context. Resource creation is done through the Device. The Immediate Context is the interface for setting device state, issuing draw calls, and performing queries; there can be only one Device and one Immediate Context. The Deferred Context is another interface for state and draw calls, but many can exist in one program, and they can serve as per-thread interfaces (an individual Deferred Context is itself thread-unsafe, though). Deferred Contexts and the free-threaded resource creation through the Device are where DX11 gets its multi-threaded benefit.
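To make the three-way split concrete, here is a plain C++ mock, not real D3D11 code: the class names echo the real interfaces (`ID3D11Device` plus the immediate and deferred flavors of `ID3D11DeviceContext`), but the bodies are invented stand-ins.

```cpp
#include <string>
#include <vector>

struct DisplayList { std::vector<std::string> calls; };

// Deferred context: records state/draw calls into a display list.
// One per worker thread, since an individual deferred context is
// not thread-safe.
class DeferredContext {
public:
    void draw(const std::string& mesh) { list_.calls.push_back("draw " + mesh); }
    DisplayList finish() { return std::move(list_); }   // like FinishCommandList
private:
    DisplayList list_;
};

// Immediate context: the one interface that actually executes work.
class ImmediateContext {
public:
    void execute(const DisplayList& dl) {               // like ExecuteCommandList
        for (const auto& c : dl.calls) submitted_.push_back(c);
    }
    std::size_t submitted() const { return submitted_.size(); }
private:
    std::vector<std::string> submitted_;
};

// Device: resource creation only; in DX11 this part is free-threaded.
class Device {
public:
    int create_texture() { return next_id_++; }         // like CreateTexture2D
    DeferredContext create_deferred_context() { return DeferredContext{}; }
private:
    int next_id_ = 0;
};
```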

Multiple threads submit state and draw calls to their own Deferred Contexts, each of which compiles a display list that is eventually executed by the Immediate Context. Games will still need a render thread, and this thread will use the Immediate Context to execute state and draw calls and to consume the display lists generated by the Deferred Contexts. In this way, the ultimate destination of all state and draw calls is the Immediate Context, but fine-grained synchronization is handled by the API and the display driver, so parallel threads can better contribute to the rendering process. Deferred Contexts do have some limitations: they cannot query the device, and they can't download or read back anything from the GPU. They can, however, consume the display lists generated by other Deferred Contexts.
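The record-in-parallel, execute-in-order flow above can be sketched with plain C++ threads; the names here (`record_scene_chunk`, `render_frame`) are invented for illustration, with the real D3D11 equivalents noted in comments.

```cpp
#include <string>
#include <thread>
#include <vector>

using DisplayList = std::vector<std::string>;

// Each worker thread records "state" and "draw" calls into its own
// private list, standing in for a per-thread deferred context.
DisplayList record_scene_chunk(int chunk) {
    DisplayList dl;
    dl.push_back("set state for chunk " + std::to_string(chunk));
    dl.push_back("draw chunk " + std::to_string(chunk));
    return dl;  // like ID3D11DeviceContext::FinishCommandList
}

std::vector<std::string> render_frame(int num_chunks) {
    std::vector<DisplayList> lists(num_chunks);
    std::vector<std::thread> workers;

    // Parallel recording: one worker (and one list) per scene chunk.
    for (int i = 0; i < num_chunks; ++i)
        workers.emplace_back([&lists, i] { lists[i] = record_scene_chunk(i); });
    for (auto& w : workers) w.join();

    // The render thread consumes the lists in a deterministic order,
    // like ExecuteCommandList on the one immediate context. State set
    // inside each list stays paired with the draws that depend on it.
    std::vector<std::string> executed;
    for (const auto& dl : lists)
        executed.insert(executed.end(), dl.begin(), dl.end());
    return executed;
}
```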

The end result of all this is that the future will be more parallel-friendly. As two- and four-core CPUs become more and more popular, and 8- and 16-(logical-)core CPUs are on the horizon, we need all the help we can get when trying to extract performance from parallelism. This is a good move for DirectX, and we hope it will help push game engines to more fully utilize more than two or even four cores when the time comes.

109 Comments

  • Havor - Saturday, January 31, 2009 - link

DX10 is Vista-only, whereas XP and 98SE/Win2k shared a common DX.

And yeah, XP was Win2K, but it had better gaming support than W2K, so there was no reason not to use XP over W2K, and only very old games gave some problems with XP over 98SE.

Whereas Vista had almost no real-life benefits over XP; it was just a big resource hog with a steep learning curve.

If people had had a choice between XP and Vista, I think Vista's numbers would have been 75% lower.

Other than DX10 and x64 there is no reason for me to go to Vista, so I will wait for Win7 and upgrade to i7/i5; till then my X2 6000+ and XP SP3 will do.
  • michal1980 - Friday, January 30, 2009 - link

I agree with most of your logic, and it makes sense. It didn't feel right with the rest of the article though.


    I will however disagree with it being compared to Win ME.

Win ME was just junk: unstable, worthless, and never improving. And while Vista had teething issues, a lot of that was due to a huge shift in the actual OS.

I haven't tried 7 yet, but from all I read, it seems like Vista re-tuned. However, I doubt Win7 could have ever gotten to where it is without Vista. Vista was a trial by fire, and in most cases it made it. A huge problem early on was that the hardware specs to run it were set too low, and it was hell for people on cheap low-end hardware.

My experience has been overall very positive, especially when I moved to 4GB of RAM. Driver problems now are minimal (x64), and no matter how stable XP ever was, IMHO and in my experience, Vista has been leaps and bounds more stable. I can't recall having an OS crash/lock-up that required a reboot. If not for hardware changes/updates, my Vista box would never reboot.
  • Havor - Saturday, January 31, 2009 - link

The comparison to ME holds up a bit, though Vista is not as bad as ME was.

However, at my computer club there was a 100% return to 98SE or Win2k, and most users (70%) here that ran a dual-boot Vista/XP machine returned to XP after trying Vista for a while (me included).

And there has never been any love for the OS, especially compared to XP.

The bigger problem for devs is that DX10 is Vista-only, and there are way too many XP machines out there to develop DX10-only games, so from a dev point of view Vista was/still is the next ME.
  • just4U - Wednesday, February 4, 2009 - link

WinME is what Vista will eventually be compared to no matter what. I had fewer problems with WinME than I did with 98 or 95, but it got a bad rap; nowadays people just all say it was crap. I'm sure 5-7+ years from now they will say the exact same thing about Vista. (shrug)
  • marsbound2024 - Friday, January 30, 2009 - link

I agree with what you have said. Microsoft had years to develop Vista, where previously its OS releases came within two years of each other. It should have been optimized at the very least, and we have seen that it is resource-intensive; one might even say "bloated." While Vista does work, and I rather like working with Vista 64-bit, I think it should have been more than what it is for having such a lengthy development timeframe. While many of us are superbly informed about computer operating systems and application support, the average consumer seems to really dislike Vista. From UAC, to horrible startup times on occasion (usually due to services such as CyberLink), to the network connectivity problems that seem to be related to IPv6, to the fact that it used to bring budget systems to their knees (even when they were manufactured with the purpose of running Vista), most people have a bad taste in their mouths from running Vista. Vista had a lot of promise on paper, but in execution it simply generated more headaches than it should have. Windows 7 should hopefully get back to what an operating system should be: a streamlined GUI with robust, yet optimally programmed features that range from security to file management on NTFS.
  • srp49ers - Friday, January 30, 2009 - link

The Vista comments seemed out of place considering the tone of the rest of the article.
  • Cuhulainn - Friday, January 30, 2009 - link

    Agreed. Even if you think it sucks, give the reasons related to the article. I would like to know what issues there are between Vista/DirectX, as I am a current user. Rather than being told that what I am running sucks, tell me what is wrong with it, or what is right with 7 that is an improvement over Vista.

    That being said, I have little to no knowledge of these things and still found this to be an interesting read. Much appreciated.
  • Staples - Saturday, January 31, 2009 - link

And the fact is, a year ago, most people who said that Vista sucks were ones who, unsurprisingly, had never actually used it. When someone says Vista sucks, I always think there is a high probability that they are someone who has never used it (and is therefore stupid for saying something like that). Anyway, I have been using Vista since it came out, and except for the fixes which made it use fewer resources, there has never been anything wrong with the OS, despite what the legion of "I read it on the internet so it must be true" people would have you believe. It seems like the majority of geeks thought it sucked without ever having used it, which is just idiotic. When I realized this, it was sad to know how gullible people really are.
  • ssj4Gogeta - Friday, January 30, 2009 - link

It would be great if MS released DX11 for XP. I multi-boot XP, Vista, and Ubuntu. I think I'll replace Vista with Win 7 rather than XP.

    By the way, why haven't we heard anything about Larrabee? Intel said that the first samples would be ready by the end of 2008. It seems to me like it will be a revolutionary step in graphics computing.
  • Ryan Smith - Friday, January 30, 2009 - link

    It won't happen, it can't happen. DX10 goes hand-in-hand with a massive rearchitecting of GPU threading and memory management, which is why we transitioned from the XPDM to WDDM for Vista. You can't backport that kind of stuff, it's a fundamental change in how the OS addresses the GPU and allocates work & resources for it.
