Going Deeper: The DX11 Compute Shader and OpenCL/OpenGL

Many developers are excited about the added flexibility of the Compute Shader (also referred to as the CS). This addition to the pipeline steps further away from a render-centric API and enables more general purpose algorithms. We see added flexibility in both the type of operations that can be performed on data and the type of data that can be operated on.

The other pipeline stages impose limitations designed to speed up execution, but those same limitations get in the way of general purpose code. Although we can shoehorn general purpose algorithms into a pixel shader program, we don't have the freedom to use data structures like trees, sharing data between pixels (and thus threads) is difficult and costly, and we have to go through the motions of drawing triangles and mapping our solutions onto them.

Enter DirectX11 and the CS. Developers have the option to pass data structures over to the Compute Shader and run more general purpose algorithms on them. The Compute Shader, like the other fully programmable stages of the DX10 and DX11 pipeline, will share a single set of physical resources (shader processors).

This hardware will need to be a little more flexible than it currently is. When running CS code, it will have to support:

- random reads and writes and irregular arrays (rather than simple streams or fixed size 2D arrays)
- multiple outputs
- direct invocation of individual threads or groups of threads as per the programmer's needs
- 32KB of shared register space and thread group management
- atomic instructions
- synchronization constructs
- the ability to perform unordered I/O operations
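
To make this a bit more concrete, here is a rough sketch of what a minimal DX11 compute shader might look like in HLSL. The buffer name, thread group size, and the averaging operation are illustrative assumptions, not taken from any shipping code; the sketch just touches several of the items above: an unordered access resource, a programmer-chosen thread group size, groupshared memory, and a synchronization barrier.

```hlsl
// A minimal compute shader sketch (cs_5_0); all names and the operation are illustrative.
RWStructuredBuffer<float> gData : register(u0);   // unordered access: random reads and writes

groupshared float sharedCache[256];               // per-group shared memory (within the 32KB limit)

[numthreads(256, 1, 1)]                           // thread group size chosen by the programmer
void CSMain(uint3 dtid : SV_DispatchThreadID,
            uint3 gtid : SV_GroupThreadID)
{
    // Each thread loads one element into shared memory...
    sharedCache[gtid.x] = gData[dtid.x];

    // ...then waits for the rest of its group before reading a neighbor's value.
    GroupMemoryBarrierWithGroupSync();

    uint neighbor = (gtid.x + 1) % 256;
    gData[dtid.x] = 0.5f * (sharedCache[gtid.x] + sharedCache[neighbor]);
}
```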

At the same time, the CS loses some features. Because each thread is no longer treated as a pixel, the association with geometry is lost (unless it is specifically passed in as part of a data structure). This means that, although CS programs can still use texture samplers, trilinear LOD calculations are no longer automatic (the LOD must be specified explicitly). Additionally, depth culling, anti-aliasing, alpha blending, and other operations that have no meaning for generic data cannot be performed inside a CS program.
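
As a quick sketch of the LOD point (the resource names, coordinates, and mip level here are assumptions for illustration), a compute shader thread has no screen-space derivatives to drive automatic mip selection, so it has to sample through SampleLevel with an explicit LOD:

```hlsl
// Illustrative sketch: sampling a texture from a compute shader requires an explicit LOD.
Texture2D<float4>          gTexture : register(t0);
SamplerState               gSampler : register(s0);
RWStructuredBuffer<float4> gOutput  : register(u0);

[numthreads(64, 1, 1)]
void SampleCS(uint3 dtid : SV_DispatchThreadID)
{
    float2 uv  = float2(dtid.x / 1024.0f, 0.5f);   // hypothetical coordinates
    float  lod = 2.0f;                             // the mip level must be chosen by the programmer

    // Sample() would need derivatives for automatic LOD selection, which a CS thread
    // does not have, so SampleLevel() with an explicit mip level is used instead.
    gOutput[dtid.x] = gTexture.SampleLevel(gSampler, uv, lod);
}
```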

The types of new applications opened up by the CS are practically endless, but the most immediate interest will come from game developers looking to augment their graphics engines with fancy techniques not possible in the Pixel Shader. Some of these applications include A-buffer techniques to allow very high quality anti-aliasing and order-independent transparency, more advanced deferred shading techniques, advanced post processing effects and convolution, FFTs (fast Fourier transforms) for frequency domain operations, and summed area tables.
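
As one sketched example from that list, a post processing convolution maps nicely onto the CS because a thread group can cache a strip of the image in shared memory and reuse it for every filter tap instead of re-fetching the texture repeatedly. The horizontal box blur below is only an illustration; the resource names, group width, and filter radius are assumptions.

```hlsl
// Sketch of a horizontal box blur as a compute shader post-process.
// Resource names, group width, and radius are illustrative assumptions (cs_5_0).
Texture2D<float4>   gSource : register(t0);
RWTexture2D<float4> gResult : register(u0);

#define GROUP_WIDTH 256
#define RADIUS      4
groupshared float4 lineCache[GROUP_WIDTH + 2 * RADIUS];

[numthreads(GROUP_WIDTH, 1, 1)]
void BlurCS(uint3 dtid : SV_DispatchThreadID, uint3 gtid : SV_GroupThreadID)
{
    // Each thread loads its own pixel; the first RADIUS threads also load the left
    // and right aprons. Edge handling is simplified: in D3D11, out-of-bounds reads
    // on a typed resource return zero.
    lineCache[gtid.x + RADIUS] = gSource[dtid.xy];
    if (gtid.x < RADIUS)
    {
        lineCache[gtid.x] = gSource[uint2(max((int)dtid.x - RADIUS, 0), dtid.y)];
        lineCache[gtid.x + GROUP_WIDTH + RADIUS] = gSource[uint2(dtid.x + GROUP_WIDTH, dtid.y)];
    }
    GroupMemoryBarrierWithGroupSync();

    // Average 2*RADIUS+1 taps out of shared memory instead of re-fetching the texture.
    int center = (int)gtid.x + RADIUS;
    float4 sum = 0;
    for (int i = -RADIUS; i <= RADIUS; ++i)
        sum += lineCache[center + i];

    gResult[dtid.xy] = sum / (2.0f * RADIUS + 1.0f);
}
```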

Beyond the rendering-specific applications, game developers may wish to do things like IK (inverse kinematics), physics, AI, and other traditionally CPU-bound tasks on the GPU. Performing these calculations in the CS keeps the data on the GPU, where it is more quickly available for use in rendering, and some algorithms may be much faster on the GPU as well. It might even be an option to run things like AI or physics on both the GPU and the CPU, if algorithms can be found that always yield the same result on both types of processors (which would essentially substitute compute power for bandwidth).

Even though the code will run on the same hardware, PS and CS code will perform very differently depending on the algorithms being implemented. One of the interesting things to look at is the exposure and histogram data often used in HDR rendering. Calculating this data in the PS requires several passes and tricks to take all the pixels and either bin them or average them. Even though sharing data between threads has a cost, it can still be much faster than running many passes, and this makes the CS an ideal stage for such algorithms.
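
To sketch how that might look (the bin count, luminance weights, and resource names below are assumptions, not any particular engine's implementation): each thread group bins its own pixels into a groupshared histogram with atomic adds, then merges the result into a global buffer, replacing the many passes a pixel shader approach needs.

```hlsl
// Sketch of a single-pass luminance histogram in a compute shader (cs_5_0).
// The bin count, luminance weights, and resource names are illustrative assumptions.
Texture2D<float4>        gHDRInput  : register(t0);
RWStructuredBuffer<uint> gHistogram : register(u0);   // NUM_BINS global bins, cleared by the app

#define NUM_BINS 64
groupshared uint localBins[NUM_BINS];

[numthreads(16, 16, 1)]
void HistogramCS(uint3 dtid : SV_DispatchThreadID, uint gi : SV_GroupIndex)
{
    // Clear the group-local histogram once per group.
    if (gi < NUM_BINS)
        localBins[gi] = 0;
    GroupMemoryBarrierWithGroupSync();

    // Bin this thread's pixel by luminance using a shared-memory atomic.
    // (Clamping HDR luminance to [0,1] is a simplification for the sketch.)
    float3 color = gHDRInput[dtid.xy].rgb;
    float  luma  = dot(color, float3(0.299f, 0.587f, 0.114f));
    uint   bin   = min((uint)(saturate(luma) * NUM_BINS), (uint)(NUM_BINS - 1));
    InterlockedAdd(localBins[bin], 1);
    GroupMemoryBarrierWithGroupSync();

    // Merge the group's bins into the global histogram with one atomic per bin.
    if (gi < NUM_BINS)
        InterlockedAdd(gHistogram[gi], localBins[gi]);
}
```

The application would then read back or further reduce the global histogram to drive its exposure adjustment, all without the many full-screen passes the PS approach requires.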

A while back we took a look at OpenCL, and we know that OpenCL will be able to share data structures with OpenGL. We haven't yet gotten a developer's take on comparing OpenCL and the DX11 CS, but at first blush it seems that the possibilities opened up for game developers and graphics processing with DX11 and the Compute Shader will also be possible with OpenGL plus OpenCL. Although the CS can be used as a general purpose, hardware accelerated GPU computing interface, OpenCL is targeted more squarely at that arena, and its independence from Microsoft and DirectX will likely mean wider adoption as a GPU compute language for general purpose tasks.

The use of OpenGL has declined significantly in the game developer community over the last five years. While OpenCL may enable DX11-like applications to be written in combination with OpenGL, it is more likely that this combination will be the domain of workstation applications like CAD/CAM and simulations that require visualization. While I'm a fan of OpenGL myself, I don't see the flexibility of OpenCL as a significant boon to its adoption in game engines.

Comments

  • ssj4Gogeta - Saturday, January 31, 2009 - link

    "DX11 offers nothing new over DX10, as quoted in the article its just a strict superset that builds on and adds features to DX10 capability."

    aren't you contradicting yourself? :)
  • chizow - Saturday, January 31, 2009 - link

    Oh right, that should read nothing new with regards to hardware requirements. They could've just as easily added the features and called it DX10a or DX10.2 etc....
  • FesterSilently - Saturday, January 31, 2009 - link

    Hrm...all I really pulled from this article was:

    - "...the rejection of Vista" pg. 1
    - "...no one knew how much Vista would really suck" pg. 2
    - "...slow adoption of Vista" pg. 3
    - "...ends up being a more expensive Vista in a shiny package" pg. 3
    - "...because of Vista's failure" pg. 7
    - "...as Vista still sucks" pg. 8
    - "...better upgrade option for XP users than Vista" pg. 8

    Oh, yeah! And:

    - "...DX 11 looks to rawk" (my quote)

    Well.

    I'm glad we cleared all that up. Now where's that XP disk...?

    :/
  • ssj4Gogeta - Saturday, January 31, 2009 - link

    Sorry for posting this again, but Derek, have we had any more news on Larrabee? Weren't the first samples supposed to be ready by the end of 2008?

    I also read somewhere that Intel bought Project Offset to use their technology in the launch title for Larrabee.
  • scruffypup - Saturday, January 31, 2009 - link

    Interesting that there is still the bashing on Vista...

    Some say it "sucks"

    Answer this:
    Does Vista do everything XP does? YES
    Does XP do everything Vista does? NO

    So how can you say Vista sucks in comparison to XP? The driver issue? That has happened on most releases of Microsoft operating systems and is not the fault of the operating system. The fact that old software does not always work on it? That again is not the operating system's fault... the software was written for a certain operating system...

    Security? I think we all know that Vista is inherently more secure.
    Performance? Does a new software package (OS, driver, game) always mean better performance? Most often NO!!! GAMES especially... they do more... but are bigger resource hogs... you can say the same about most drivers...

    I feel that Derek's article was unprofessional and filled with a bias which will lead me to steer clear of his future articles... and ESPECIALLY any opinions he wants to chime in with... sorry to see the AnandTech site have such "craptacular" articles that "suck"!!!
  • MightyDrunken - Wednesday, February 4, 2009 - link

    To love or hate Vista - either way is an opinion. For me there is no correct answer regarding Vista. If the article writer is not allowed an opinion which disagrees with some of its readers, then AnandTech articles will be worthless.
    I use Vista daily and my impression is it sucks, sorry.
    On a two-year-old Dell it's slow, very slow (2GB RAM, dual core Duo). All drivers are up to date. My slower Windows XP machine was much faster.
    The only improvements with Vista I notice are the breadcrumb trail in Explorer and search on the start menu.
    Those improvements are not worth 13+ gigs of files and a fairly recent computer. Someone will pipe up and say, "Oh, but hard drives are cheap", but what if I want to back up my install to DVD, a memory stick...?
    Vista is pure bloat. Let's hope the Windows 7 hype is not as misleading as Vista's hype before release.
  • epyon96 - Saturday, January 31, 2009 - link

    Suffice it to say the article does not have the flair of Anand Shimpi, but it was educational. The Vista comment was unnecessary and seemed out of place.

    You kept emphasizing how DX11 is a superset of DX10. I am wondering why Microsoft didn't just name it DX10.2 or something of that nature to indicate its superset nature. What is the fundamental difference between a 1 and a 0.1 or 0.2 advancement in DirectX technologies?
  • bigsnyder - Saturday, January 31, 2009 - link

    I think many of you are trying to interpret what you want to hear from his Vista comments. Bottom line, it is his article; he can say what he wants. I would say that there are far more people agreeing with his comments than are posting here. There is no denying the fact that Vista did not live up to its hype at launch. Sure, XP had teething problems as well, but the difference here is that XP did offer a significant reason to upgrade over its predecessors, Win98/ME (Win2K was a different market segment). Outside of DX10, what does Vista offer that I should be compelled to upgrade for? Vista does not offer that same compelling reason. The current state of Vista is almost irrelevant (I'm sorry, but even with the improvements, Vista still does not paint a rosy picture). The damage is already done. Why do you think MS is accelerating Windows 7 development? Derek, thank you for your honest perspective.
  • Intelman07 - Saturday, January 31, 2009 - link

    Urgh Vista bashing from Anandtech...

    Vista simply does not suck.
  • bobvodka - Saturday, January 31, 2009 - link

    OK, let's cover a few things in one post:

    1) Vista "sucks".
    I find this claim interesting today; 99% of those I know who have used Vista have seen it as a large improvement over XP, myself included, and those who haven't generally have low spec or unsupported hardware. I've used Win3, 3.11, 95, 98, 98SE, ME, 2K, XP, XPx64 and now Vista, and out of all of them Vista has been the smoothest OS I've had from day one (this was March 2007, when I accidentally killed my XPx64 install by not paying attention), with the only troubles being 3rd party drivers (such as Creative's inability to write drivers which work first time out and NV apparently forgetting how to write them for around 9 months in 2007).

    So, Vista is far from sucking; what Vista suffers from is being bashed left, right and center, even before it was released, by 'tech sites' who bought into the whole 'Vista sucks' thing and continued the myth. I can only assume this is because you get better readership from saying something of MS's sucks rather than 'hey, it isn't perfect BUT...' type thing. Hey, that's journalism all over I guess.

    2) Vista's development time
    This was always going to be a problem for MS. XP was built upon Win2K, indeed they share the same driver model, which was built upon NT and the 9x kernel (in places), so it had a very long development history behind it. Vista had a whole new design thrown at it: new driver model, improved security model, etc.; this stuff doesn't happen quickly or cheaply. The fact that it had such a major overhaul and worked so well out of the gate is nothing short of impressive.

    The problem, however, is that many of these changes are 'under the hood'. All the end user sees is a new shiny interface, and they wonder 'why did this take so long?'. Now, I guess MS could have tried to explain this to the ordinary person, much like they did to technical people, however I suspect this would have been a waste of time because all average Joe User cares about is whether it will run his stuff.

    (Side note: this is something MS really doesn't get enough praise for, the mind-bending amount of work they put in to maintain backwards compatibility between their OS revisions. Take a program written for Win95 and chances are it'll work just fine in Vista. THAT'S impressive.)

    3) DX10 and the performance quest
    This is another one of those things where people needed more information than they were given to understand what's going on. The simple truth is, yes, DX10 allows you to write programs which use the GPU better and reduce CPU overhead (this reduction was in fact a major part of the performance they were talking about, however everyone assumed that when they said 'performance' they meant 'frames per second'); however, this would require writing DX10 code, not naively porting DX9c code across and hoping everything works out. The problem is this costs time and money, and with the two major consoles being DX9-level hardware (more or less), anything which needs to be cross-platform isn't going to have 'shiny DX10 renderer' high on its to-do list. (Side note: the PS3 doesn't use OpenGL; it has an OpenGL ES library, but anyone with any sense codes to Sony's own graphics library instead.)

    Of course, once these DX10 renderers are done they add more things to the scene as well, be it particles or a general increase in the level of detail. So suddenly you are getting more things on screen for around the same cost in many cases.

    At the end of the day, however, the DX10 API IS a better API than DX9c and OpenGL; OpenGL did have the chance to 'catch up', but with the dropping of Longs Peak and the release of OpenGL 3.0 they threw that away. (Personal note: I'd used OpenGL since 1999, however that dropping of the ball made me move away from it.)

    4) DX11 on XP.
    Not going to happen.
    Cost and development time don't make it worthwhile; unless of course everyone was prepared to pay $150+ for an upgrade, because it makes no financial sense to even consider doing this for free, and at that cost, well, you might as well get Vista or Windows 7.

    5) DX11 and Multi-threading
    I was at XNA Gamefest 2008 in London and I'm 99% sure that the multithreaded stuff DOESN'T require a driver update. Granted, you'll get better performance with one, but the runtime itself can deal with it.

    (As for who I am: I work as a programmer for a UK based games company. I wrote the chapter on GLSL for More OpenGL Game Programming and I've been coding now for over 15 years on various pieces of hardware. Just in case you felt I was some newbie :))
