DirectX 9 vs. DirectX 8: Image Quality

Remember ATI’s Shader Day last year, where Valve announced that NVIDIA’s DirectX 9 hardware should be treated as DX8 hardware and nothing more?  Well, things haven’t really changed: in our tests, NVIDIA’s GeForce FX 5900XT was between 50% and 72% slower in DX9 mode than in DX8 mode.  In fact, the 5900XT is so slow in DX9 mode that ATI’s $80 Radeon X300 SE actually posts significantly higher average frame rates.  So if you own an NV3x-class GPU, you are pretty much excluded from running Valve’s DirectX 9 codepath.  What, then, do you lose by going down to the DirectX 8.1/8.0 codepaths?
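
To give a sense of what "running the DirectX 9 codepath" means on the engine side, here is a minimal sketch, written against the Direct3D 9 API and not Valve's actual code, of the kind of caps check a DX9 title performs to decide which shader path your card gets:

// A minimal sketch (not Valve's code) of how a Direct3D 9 title can pick its
// rendering codepath from the pixel shader version the driver reports.
// Requires the DirectX SDK headers; link against d3d9.lib.
#include <d3d9.h>

int SelectDxLevel()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 70;                                   // no usable D3D9 runtime

    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    d3d->Release();

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 90;   // DX9 path: ps_2_0-class water and flashlight shaders
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return 81;   // DX8.1 path: ps_1_4-class hardware (Radeon 8500/9000)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return 80;   // DX8.0 path: ps_1_1-class hardware (GeForce 3/4 Ti)
    return 70;       // DX7-style fixed-function fallback
}

Half-Life 2 does its own detection along these lines, and the -dxlevel launch option (e.g. -dxlevel 81) lets you override whatever the game picks, which is the usual way to force the DX8 path on DX9-class hardware for comparisons like the ones below.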

The first thing we wanted to check was the flashlight shader: how different does it look under DX8 compared to DX9?  The default image below is the DX9 shot; mouse over the image to see the flashlight shader rendered using Valve’s DX8 path:



Hold mouse over image to see DX8 mode

There are some slight differences between the two images, but interestingly enough, none of them appear to have anything to do with the flashlight shader itself.

The first difference is in the shading on the gun: the DX8 gun has a much brighter surface, while the DX9 gun looks a bit more realistically lit. The same can be said about the rails on the train tracks: the DX8 rails stand out a lot more, while the DX9 rails appear to be more realistically lit.

There are many minor differences like this; the biggest difference between DX8 and DX9, however, is the water:



Hold mouse over image to see DX8 mode

With the DirectX 9 codepath, the water in Half-Life 2 is much more realistic.  You can download full uncompressed versions of these images here.
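
Why does the DX9 water stand out so much? Broadly speaking, the DX9 water shader blends real-time reflection and refraction of the scene, weighted by a view-dependent Fresnel term, while the DX8 paths fall back to a much simpler surface. As a rough illustration only (evaluated on the CPU here for clarity, and not Valve's shader code), Schlick's approximation of that Fresnel blend looks like this:

#include <cmath>

struct Color { float r, g, b; };

// Schlick's approximation of the Fresnel reflectance: reflections strengthen
// as the viewing angle becomes more grazing (cosTheta -> 0).
float FresnelSchlick(float cosTheta, float f0)   // f0 ~ 0.02 for water
{
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Hypothetical per-pixel blend of the reflected and refracted scene colors,
// the kind of mix a DX9-class water shader performs for every pixel.
Color ShadeWater(const Color& reflection, const Color& refraction, float cosTheta)
{
    float f = FresnelSchlick(cosTheta, 0.02f);
    return { refraction.r + (reflection.r - refraction.r) * f,
             refraction.g + (reflection.g - refraction.g) * f,
             refraction.b + (reflection.b - refraction.b) * f };
}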

Overall, the move from DX9 down to DX8 isn’t horrible; while it does reduce some of the appeal of Half-Life 2, the game still looks incredible in DX8 mode.  There are some issues with forcing NV3x GPUs to run in DX9 mode, mainly involving the water, but as you will see on the coming pages, if you’ve got an NV3x you’re not going to want to play in DX9 mode anyway.

62 Comments


  • Lord Banshee - Monday, November 22, 2004 - link

    Sorry about above post,

    #50, I hope you are only talking about the NV3x and older cores? The NV4x core is almost as good as the newest Radeon at rendering DX9 games.

    On a side note, does anybody care about the reason why Doom 3's models and textures aren't as good as Half-Life 2's? One reason is the amount of GPU processing power the lighting system and special effects take. I am sure that if everybody had a 6800 Ultra, then id would have made the textures in Doom 3 better and used more high-polygon models.

    But we all don't, so they instead used a lot of normal mapping (the future of gaming) and a brand new lighting system never seen in games before.

    But again, you must see that the Doom 3 engine has the ability to use huge textures and models; it is game dependent. Not all games that use this engine will have the same lighting effects and such; they might want to show off their texture work instead. It is the game company's choice.

    What Doom 3 fails at is outdoor environments; this is where the Source engine has it beat (so they say, I have yet to play Half-Life 2).

    But it looks like the Unreal 3 engine will be the best of both worlds, though that's most likely another two years away.
  • Lord Banshee - Monday, November 22, 2004 - link

  • nserra - Monday, November 22, 2004 - link

    #40 T8000 ???!?!

    So why do the 6600 and 6800 perform very well and the 6200 so badly? Aren't they all the same architecture? Your post is pointless.

    Luckily Valve was hacked? Are you kidding? How many people, including myself, bought a piece of crap like the 5600, which performs badly not only in this game but in many others? TOO BAD IT WAS HACKED!!!

    Sure, any card can play it today just like it could a year ago, but not the right way!!!!

    I don’t know, but I bet that when more DX9.0 games come out, the difference between ATI and NVIDIA will be even bigger. Unless there is an option to enable the fast FP16 mode, giving lower image quality, like in Far Cry.
  • nserra - Monday, November 22, 2004 - link

    We all know that those who bought the ATI 9xxx series made a better choice than the ones who bought the FX 5xxx series cards.

    Now what about an 8500 vs. a GF3/4?
    And some 9000 cards too?

    DX8.1 is different from DX8.0; I would like to know whether the 8500/9000 was the better buy over the GeForce 3/4, looking at it today.

    It’s really important, since the GeForce FX sucks today but didn't two years ago; who knows what will happen two years from now with the 9xxx and 6xxx.

    Why does the 6200 perform so badly, and the 6600 and 6800 so well?
  • dderidex - Monday, November 22, 2004 - link

    FYI, the compare image on [L=this page]http://www.anandtech.com/video/showdoc.aspx?i=2281...[/L] for the water is all wrong. I don't know what they were using for the 'DX8' sample of the water reflection, but that's not what it looks like at all on a GeForce FX card. It looks virtually indistinguishable from the DX9 sample, only with noticeably less smooth transitions with the coastal terrain (not shown in that shot).

    Unless AT intentionally disabled world reflections when switching to DX8 mode? But, I have a hard time believing they would be so biased.
  • blckgrffn - Sunday, November 21, 2004 - link

    8500/9100 & 9000/9200 & FX 5200/5700 & Radeon 7000/7500 & GF3/GF2 benches please! There are a lot of these cards out there and I am curious!
  • TheRealSkywolf - Sunday, November 21, 2004 - link

    #45, ATI contributed a big cut of the budget for Half-Life 2. That's why it got delayed a year.
    So it is blatantly obvious that Valve was told not to make DX 9.0 work well for the NVIDIA FX cards.
  • moletus - Sunday, November 21, 2004 - link

    #40, you are so wrong, wrong, and wrong again. What kind of idiot game developer wouldn't code as well as possible, regardless of who gave them what development money? There are plenty of NVIDIA cards out there and I'm quite sure their owners want to play HL2 too.

    It's all about making $$$, so...
  • Cybercat - Sunday, November 21, 2004 - link

    #40, not necessarily. The 6200 typically performs close to the X300; only occasionally does it reach the X600 Pro's level.

    http://anandtech.com/video/showdoc.aspx?i=2238&...
  • abakshi - Sunday, November 21, 2004 - link

    *other (not over lol)
