Tessellation: Because The GS Isn't Fast Enough

Microsoft and AMD tend to get the most excited about tessellation whenever the topic of DX11 comes up. AMD jumped on the tessellation bandwagon long ago, and perhaps it does make sense for consoles like the Xbox 360. Adding fixed function hardware to quickly and efficiently handle a task that reduces memory footprint has major advantages in the living room. We still aren't sold on the need for a tessellator on the desktop, but who's to argue with progress?

Or is it really progressive? The tessellator itself is fixed function rather than programmable. Sure, the input to and output of the tessellator can be manipulated a bit through the Hull Shader and Domain Shader, but the heart of the beast is just not that flexible. The Geometry Shader is the programmable block in the pipeline that is capable of tessellation as well as much more, but it just doesn't have the power to do tessellation on any useful scale. So while most everything has been moving towards programmability in the rendering pipe, we have sort of a step backward here. But why?
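To make the division of labor concrete, here is a minimal sketch of the programmable stages that bracket the fixed-function unit. All struct and function names here are illustrative, not taken from any shipping code: the hull shader's patch-constant function is where the application hands tessellation factors to the tessellator.

```hlsl
// Illustrative sketch: the Hull Shader runs per control point, while its
// patch-constant function feeds factors to the fixed-function tessellator.
struct ControlPoint
{
    float3 pos : POSITION;
    float2 uv  : TEXCOORD0;
};

struct PatchConstants
{
    float edges[3] : SV_TessFactor;        // one factor per triangle edge
    float inside   : SV_InsideTessFactor;  // interior subdivision
};

PatchConstants ConstantsHS(InputPatch<ControlPoint, 3> patch)
{
    PatchConstants pc;
    // A constant factor of 8 asks the tessellator to split each edge
    // into 8 segments; the HS could just as well compute this per patch.
    pc.edges[0] = pc.edges[1] = pc.edges[2] = 8.0f;
    pc.inside = 8.0f;
    return pc;
}

[domain("tri")]
[partitioning("fractional_odd")]
[outputtopology("triangle_cw")]
[outputcontrolpoints(3)]
[patchconstantfunc("ConstantsHS")]
ControlPoint MainHS(InputPatch<ControlPoint, 3> patch,
                    uint i : SV_OutputControlPointID)
{
    return patch[i];  // pass control points through unchanged
}
```

Note that the shader only chooses the factors and the partitioning scheme; the actual subdivision work happens in the fixed-function block between these two programmable stages.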

The argument between fixed function and programmable hardware is always one of performance versus flexibility and usefulness. In the beginning, fixed function hardware was necessary to get the desired performance. As time went on, it became clear that continuing to pile fixed function hardware into graphics chips just wasn't feasible: the transistors put into specialized hardware simply go unused if developers don't program to take advantage of it. This drove a shift toward architectures built around an expanding pool of compute resources that could be shared among many different tasks. In the general case, anyway. But that doesn't mean that fixed function hardware doesn't have its place.

We do still have the problem that all the transistors put into the tessellator are worthless unless developers take advantage of the hardware. But the reason it makes sense is that the ROI (return on investment: what you get for what you put in) on those transistors is huge if developers do use them: it's much easier to get huge tessellation performance out of a fixed function tessellator than to pour enough resources into the Geometry Shader to match that performance programmatically. This doesn't mean we'll see a renaissance of fixed function blocks in our graphics hardware; just that significantly advanced features going forward may still require sacrificing programmability in favor of early adoption of a feature. The majority of tasks will continue to be enabled in a flexible, programmable way, and in the future we may see more flexibility introduced into the tessellator until it becomes fully programmable as well (or ends up merged into some future version of the Geometry Shader).

Now don't let this technical assessment of fixed function tessellation make you think we aren't interested in reaping the benefits of the tessellator. Currently, artists need to create different versions of their objects for different LODs (Levels of Detail -- reducing or increasing complexity as the object moves farther from or nearer to the viewer), and geometry simulation through texturing at each LOD needs to be done by pixel shaders. This requires extra work from both artists and programmers and costs a good bit in terms of performance. There are also some effects that can only be done with more geometry.
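The dynamic LOD idea boils down to a small piece of per-patch math. As a hedged sketch (the camera variable, distance range, and factor range are our assumptions, not anything from a real engine), the hull shader's constant function could scale subdivision with distance instead of artists authoring separate meshes:

```hlsl
// Illustrative sketch: a distance-based tessellation factor computed
// per patch, replacing hand-authored LOD meshes. cameraPos, the 100
// unit falloff, and the 1..64 range are assumptions for illustration.
float ComputeTessFactor(float3 patchCenter, float3 cameraPos)
{
    float dist = distance(patchCenter, cameraPos);
    // Nearby patches get up to 64 subdivisions (the DX11 maximum),
    // fading to 1 (no tessellation) by 100 units away.
    return lerp(64.0f, 1.0f, saturate(dist / 100.0f));
}
```

A function like this would be called from the patch-constant function to fill in `SV_TessFactor`, giving continuous LOD with no artist intervention per level.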

Tessellation is a great way to get that geometry in there for more detail, shadowing, and smooth edges. High geometry also allows really cool displacement mapping effects. Currently, much geometry is simulated through textures and techniques like bump mapping or parallax occlusion mapping. Even with high geometry, we will want to have large normal maps for our lighting algorithms to use, but we won't need to do so much work to make things like cracks, bumps, ridges, and small detail geometry appear to be there when it isn't, because we can just tessellate and displace in a single pass through the pipeline. This is fast, efficient, and can produce very detailed effects while freeing up pixel shader resources for other uses. With tessellation, artists can create one subdivision surface that gets a dynamic LOD free of charge; a simple hull shader and a displacement map applied in the domain shader will save a lot of work, increase quality, and improve performance quite a bit.
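The "displace in the domain shader" step described above can be sketched as follows. This is a hedged illustration, not anyone's shipping code: the texture, sampler, and constant buffer names are ours, and the patch layout mirrors the hull shader convention from earlier in this article.

```hlsl
// Illustrative sketch: a domain shader that positions each
// tessellator-generated vertex and pushes it along the interpolated
// normal by a height sampled from a displacement map.
Texture2D    gDisplacementMap : register(t0);
SamplerState gLinearSampler   : register(s0);

cbuffer PerFrame
{
    float4x4 gViewProj;
    float    gDisplaceScale;
};

struct ControlPoint
{
    float3 pos : POSITION;
    float3 nrm : NORMAL;
    float2 uv  : TEXCOORD0;
};

struct PatchConstants
{
    float edges[3] : SV_TessFactor;
    float inside   : SV_InsideTessFactor;
};

struct DSOutput
{
    float4 pos : SV_Position;
};

[domain("tri")]
DSOutput MainDS(PatchConstants pc,
                float3 bary : SV_DomainLocation,
                const OutputPatch<ControlPoint, 3> patch)
{
    // Interpolate position, normal, and UV with barycentric weights.
    float3 p  = bary.x * patch[0].pos + bary.y * patch[1].pos + bary.z * patch[2].pos;
    float3 n  = normalize(bary.x * patch[0].nrm + bary.y * patch[1].nrm + bary.z * patch[2].nrm);
    float2 uv = bary.x * patch[0].uv  + bary.y * patch[1].uv  + bary.z * patch[2].uv;

    // Displace along the normal: the single-pass "tessellate and
    // displace" the article describes. SampleLevel is required here
    // because the domain shader has no automatic mip selection.
    float h = gDisplacementMap.SampleLevel(gLinearSampler, uv, 0).r;
    p += n * (h * gDisplaceScale);

    DSOutput o;
    o.pos = mul(float4(p, 1.0f), gViewProj);
    return o;
}
```

Because the displacement happens before rasterization, the added detail is real geometry that casts shadows and has proper silhouettes, rather than a pixel shader illusion.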

If developers adopt tessellation, we could see cool things, and with the move to DX11 class hardware both NVIDIA and AMD will be making parts with tessellation capability. But we may not see developers just start using tessellation (or the compute shader for that matter) right away. Because DirectX 11 will run on down-level hardware, and because at its release there will already be a huge number of cards on the market capable of running a subset of DX11 (which brings with it a better, more refined programming language in the new version of HLSL along with seamless parallelization optimizations), we will very likely see the first DX11 games implement only features that can run completely on DX10 hardware.

Of course, at that point developers can be fully confident of exploiting all the aspects of DX10 hardware, which they still aren't completely taking advantage of. Many people still want and need a DX9 path because of Vista's failure, which means DX10 code tends to be more or less an enhanced DX9 path rather than something fundamentally different. So when DirectX 11 finally debuts, we will start to see what developers could really do with DX10.

Certainly there will be developers experimenting with tessellation, but at first this will probably just be simple amplification to smooth out the jagged edges around curved surfaces. It will take time for the real advanced tessellation techniques everyone is excited about to come to fruition.

Comments

  • ssj4Gogeta - Saturday, January 31, 2009 - link

    "DX11 offers nothing new over DX10, as quoted in the article its just a strict superset that builds on and adds features to DX10 capability."

    aren't you contradicting yourself? :)
  • chizow - Saturday, January 31, 2009 - link

    Oh right, that should read nothing new with regards to hardware requirements. They could've just as easily added the features and called it DX10a or DX10.2 etc....
  • FesterSilently - Saturday, January 31, 2009 - link

    Hrm...all I really pulled from this article was:

    - "...the rejection of Vista" pg. 1
    - "...no one knew how much Vista would really suck" pg. 2
    - "...slow adoption of Vista" pg. 3
    - "...ends up being a more expensive Vista in a shiny package" pg. 3
    - "...because of Vista's failure" pg. 7
    - "...as Vista still sucks" pg. 8
    - "...better upgrade option for XP users than Vista" pg. 8

    Oh, yeah! And:

    - "...DX 11 looks to rawk" (my quote)

    Well.

    I'm glad we cleared all that up. Now where's that XP disk...?

    :/
  • ssj4Gogeta - Saturday, January 31, 2009 - link

    Sorry for posting this again, but Derek, have we had any more news on Larrabee? Weren't the first samples supposed to be ready by the end of 2008?

    I also read somewhere that Intel bought Project Offset to use their technology in the launch title for Larrabee.
  • scruffypup - Saturday, January 31, 2009 - link

    Interesting there is still the bashing on Vista,..

    Some say it "sucks"

    Answer this:
    Does Vista do everything Xp does? YES
    Does Xp do everything Vista does? NO

So how can you say Vista sucks in comparison to XP? The driver issue? That has happened on most releases of Microsoft operating systems and is not the fault of the operating system. The fact old software does not always work on it? That again is not the operating system's fault,.. the software was written for a certain operating system,...

    Security? I think we all know that Vista is inherently more secure
    Performance? Does a new software package (OS, driver, game) always mean better performance,... most often NO!!! GAMES especially,.. they do more,... but are bigger resource hogs,... most drivers you can say the same,...

    I feel that Derek's article was unprofessional and filled with a bias which will lead me to steer clear of his future articles,... and ESPECIALLY any opinions he wants to chime about,... sorry to see the Anandtech site have such "craptacular" articles that "suck"!!!
  • MightyDrunken - Wednesday, February 4, 2009 - link

    To love or hate Vista - either way is an opinion. For me there is no correct answer regarding Vista. If the article writer is not allowed an opinion which disagrees with some of its readers then AnandTech articles will be worthless.
    I use Vista daily and my impression is it sucks, sorry.
    On a two-year-old Dell it's slow, very slow (2 GB RAM, Dual Core duo). All drivers are up to date. My slower Windows XP machine was much faster.
    The only improvements with Vista I notice are the breadcrumb trail in Explorer and search on the start menu.
    Those improvements are not worth 13+ gigs of files and a fairly recent computer. Someone will pipe up and say, "Oh but hard drives are cheap", but what if I want to backup my install to DVD, memory stick...?
    Vista is pure bloat. Let's hope the Windows 7 hype is not as misleading as Vista's hype before release.
  • epyon96 - Saturday, January 31, 2009 - link

    Suffice to say the article does not have the flair of Anand Shimpi but it was educational. The Vista comment was unnecessary and seemed out of place.

    You kept emphasizing how Dx11 is a superset of Dx10. I am wondering why Microsoft just named it Dx10.2 or something of that nature to indicate the superset nature of it? What is the fundamental difference between a 1 and 0.1 or 0.2 advancement in Direct X technologies.
  • bigsnyder - Saturday, January 31, 2009 - link

    I think many of you are trying to interpret what you want to hear from his Vista comments. Bottom line, it is his article, he can say what he wants. I would say that there are far more people agreeing with his comments than are posting here. There is no denying the fact that Vista did not live up to its hype at launch. Sure, XP had teething problems as well, but the difference here is that XP did offer a significant reason to upgrade over its predecessors win98/ME (w2k was a different market segment). Outside of DX10, what does Vista offer that I should be compelled to upgrade? Vista does not offer that same compelling reason. The current state of Vista is almost irrelevant (I'm sorry, but even with the improvements, Vista still does not paint a rosy picture). The damage is already done. Why do you think MS is accelerating Windows 7 development? Derek, thank you for your honest perspective.
  • Intelman07 - Saturday, January 31, 2009 - link

    Urgh Vista bashing from Anandtech...

    Vista simply does not suck.
  • bobvodka - Saturday, January 31, 2009 - link

    Ok, let's cover a few things with one post:

    1) Vista "sucks".
    I find this claim today interesting; 99% of those I know who have used Vista have seen it as a large improvement over XP, myself included, and those who haven't generally have low spec or unsupported hardware. I've used Win3, 3.11, 95, 98, 98SE, ME, 2K, XP, XPx64 and now Vista and out of all of them Vista has been the smoothest OS I've had from day one (this was March 2007 when I accidentally killed my XPx64 install by not paying attention) with the only troubles being 3rd party drivers (such as Creative's inability to write drivers which work first time out and NV apparently forgetting how to write them for around 9 months in 2007).

    So, Vista far from sucks; what Vista suffers from is being bashed left, right and center even before it was released by 'tech sites' who bought into the whole 'Vista sucks' thing and continued the myth. I can only assume this is because you get better readership from saying something of MS's sucks rather than 'hey, it isn't perfect BUT...' type thing. Hey, that's journalism all over I guess.

    2) Vista's development time
    This was always going to be a problem for MS. XP was built upon Win2K, indeed they share the same driver model, which was built upon NT and the 9x kernel (in places) so it had a very long development history behind it. Vista had a whole new design thrown at it, new driver model, improved security model etc etc; this stuff doesn't happen quickly nor cheaply. The fact it had such a major overhaul and worked so well out of the gate is nothing short of impressive.

    The problem however is that many of these changes are 'under the hood'. All the end user sees is a new shiny interface and wonders 'why did this take so long?'. Now, I guess MS could have tried to explain this to the ordinary person, much like they did to technical people, however I suspect this would have been a waste of time because all average Joe User cares about is if it will run his stuff.

    (side note: this is something MS really don't get enough praise for, the mindbending amount of work they put in to maintain backwards compatibility between their OS revisions. Take a program written for Win95 and chances are it'll work just fine in Vista, THAT'S impressive.)

    3) DX10 and the performance quest
    This is another one of those things where people needed more information than they were given to understand what's going on here. The simple truth is, yes, DX10 allows you to write programs which use the GPU better and reduce CPU overhead (this reduction was in fact a major part of the performance they were talking about, however everyone assumed when they said 'performance' they meant 'frames per second'); however this would require writing DX10 code, not naively porting their DX9c code across and hoping everything works out. The problem is this costs time and money, and with the major 2 consoles being DX9 level hardware (more or less) anything which needs to be crossplatform isn't going to have 'shiny DX10 renderer' high on their 'todo list'. (side note: the PS3 doesn't use OpenGL, it has an OpenGL|ES library but anyone with any sense codes to Sony's own graphics library instead).

    Of course, once these DX10 renderers are done they add more things to the scene as well, be it particles or general increase in the level of detail. So suddenly you are getting more things on screen for around the same cost in many cases.

    End of the day however the DX10 API IS a better API than DX9c and OpenGL; OpenGL did have the chance to 'catch up' but with the dropping of Longs Peak and the release of OpenGL3.0 they threw that away. (personal note; I'd used OpenGL since 1999, however that dropping of the ball made me move away from it).

    4) DX11 on XP.
    Not going to happen.
    Cost and development time don't make it worthwhile; unless of course everyone was prepared to pay $150+ for an upgrade, because it makes no financial sense to even consider doing this for free, and at that cost, well, you might as well get Vista or Windows 7.

    5) DX11 and Multi-threading
    I was at the XNA Gamefest 2008 in London and I'm 99% sure that the multithreaded stuff DOESN'T require a driver update. Granted, you'll get better performance with one but the runtime itself can deal with it.

    (As for who I am; I work as a programmer for a UK based games company. I wrote the chapter on GLSL for More OpenGL Game Programming and I've been coding now for over 15 years on various pieces of hardware. Just in case you felt I was some newbie :))
