DirectX 9 Performance Impact

Now that you've seen what improvements Half Life 2's DX9 path can give you, let's take a look at the price to pay for some of those impressive visual effects. To measure the impact of the DX9 path, we ran benchmarks using both the DX8 and DX9 paths and calculated the percentage decrease in performance when going to DX9. We then averaged that percentage decrease across all five of our custom Half Life 2 benchmarks, per card, per resolution. We will look at actual performance numbers shortly, but this should give you an idea of what's to come:
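
To make the calculation above concrete, here is a minimal sketch in Python of how such an averaged penalty can be computed. The function names and the frame rates in the usage example are placeholders of our own for illustration, not output from the actual benchmarks.

# Hypothetical helper for averaging the DX8 -> DX9 performance penalty,
# following the method described above (percentage drop per demo, then averaged).

def dx9_performance_drop(dx8_fps, dx9_fps):
    """Percentage decrease in frame rate when switching from DX8 to DX9."""
    return (dx8_fps - dx9_fps) / dx8_fps * 100.0

def average_drop(results):
    """Average the per-demo DX9 penalty for one card at one resolution.

    results is a list of (dx8_fps, dx9_fps) pairs, one per custom demo.
    """
    drops = [dx9_performance_drop(d8, d9) for d8, d9 in results]
    return sum(drops) / len(drops)

# Usage with placeholder numbers (five demos, one card, one resolution):
sample = [(78.0, 33.0), (65.0, 27.0), (71.0, 30.0), (59.0, 25.0), (80.0, 34.0)]
print(f"Average DX9 penalty: {average_drop(sample):.1f}%")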

At 800 x 600 the game is mostly CPU bound on cards like the Radeon 9600XT, so the performance drop from DX8 to DX9 is quite small. Even on cards like the X300 and the Radeon 9550, the performance hit isn't too bad at less than 20%. But here's the kicker: the GeForce FX 5900XT sees almost a 60% drop in performance by going to DX9 mode, meaning it delivers only about 40% of its DX8 frame rate. This type of performance drop should be relatively consistent across the entire NV3x line (e.g. FX 5900 Ultra, FX 5600, etc.).

[Graph: Half Life 2 DirectX 9 vs. DirectX 8 Performance Penalty]

At 1024 x 768, all of the GPUs are now seeing double digit performance losses, but even the GeForce 6200's 25% performance hit is nothing compared to the 5900XT, which incurs a 65% performance hit when going to DX9.

[Graph: Half Life 2 DirectX 9 vs. DirectX 8 Performance Penalty]

At 1280 x 1024 things get just a little worse, but you should get the picture by now - the GeForce FX line does not perform well as a DX9 part under Half Life 2.

[Graph: Half Life 2 DirectX 9 vs. DirectX 8 Performance Penalty]

You will undoubtedly see these statistics reflected in the actual performance of the 5900XT in the coming pages, but the bottom line is that if you are an NV3x GPU owner, you will want to run Half Life 2 in DX8 mode rather than DX9 mode.

Now let's take a look at how the rest of the GPUs perform in both DX8 and DX9 modes. For these tests, we used the exact same drivers and platforms as in our first article, just with different video cards, so the numbers are comparable.

Comments

  • Lord Banshee - Monday, November 22, 2004 - link

    Sorry about the above post,

    #50, I hope you are only talking about the NV3x and older cores? The NV4x core is almost as good as the newest Radeon at rendering DX9 games.

    On a side note, does anybody care about the reason why Doom 3's models and textures aren't as good as Half-Life 2's? One reason is the amount of GPU processing power the lighting system takes, plus the special effects. I am sure that if everybody had a 6800 Ultra, id would have made the textures in Doom 3 better and used more high polygon models.

    But we all don't, so they instead used a lot of normal mapping (the future in gaming) and a brand new lighting system never seen in games before.

    But again, you must see that the Doom 3 engine has the ability to use huge textures and models, but it is game dependent. Not all games that use this engine will have the same lighting effects and such; they might want to show off their texture skills instead, it is the game company's choice.

    What Doom 3 fails at is outdoor environments, and this is where the Source engine has them beat (so they say, I have yet to play Half-Life 2).

    But it looks like the Unreal 3 engine will be the best of both worlds, though that's another 2 years away most likely.
  • nserra - Monday, November 22, 2004 - link

    #40 T8000 ???!?!

    So why do the 6600 and 6800 perform very well and the 6200 so badly? Aren't they all the same card? Your post is pointless.

    Luckily Valve was hacked? Are you kidding? How many people, myself included, bought a piece of crap like the 5600, which performs badly not only in this game but in many others. TOO BAD IT WAS HACKED!!!

    Sure, any card can play it today just like one year ago, but not the right way!!!!

    I don't know, but I bet that when more DX9.0 games come out, the difference between ATI and NVIDIA will be bigger. Unless there is an option to enable the fast FP16 mode, providing lower image quality, like in Far Cry.
  • nserra - Monday, November 22, 2004 - link

    We all know that those who bought the ATI 9xxx series made a better choice than the ones who bought the FX 5xxx series cards.

    Now what about an 8500 vs. GF3/4?
    And some 9000 cards too?

    DX8.1 is different from DX8.0; I would like to know whether, looking at it today, the 8500/9000 was the better buy over the GeForce 3/4.

    It's really important, since the GeForce FX sucks today but didn't 2 years ago; who knows what will happen 2 years from now with the 9xxx and 6xxx.

    Why does the 6200 perform so badly, and the 6600 and 6800 so well?
  • dderidex - Monday, November 22, 2004 - link

    FYI, the comparison image on [L=this page]http://www.anandtech.com/video/showdoc.aspx?i=2281...[/L] for the water is all wrong. I don't know what they were using for the 'DX8' sample of the water reflection, but that's not what it looks like at all on a GeForce FX card. It looks virtually indistinguishable from the DX9 sample, only with noticeably less smooth transitions along the coastal terrain (not shown in that shot).

    Unless AT intentionally disabled world reflections when switching to DX8 mode? But I have a hard time believing they would be so biased.
  • blckgrffn - Sunday, November 21, 2004 - link

    8500/9100 & 9000/9200 & FX 5200/5700 & Radeon 7000/7500 & GF3/GF2 benches please! There are a lot of these cards out there and I am curious!
  • TheRealSkywolf - Sunday, November 21, 2004 - link

    #45, ATI contributed a big cut of the budget for Half-Life 2. That's why it got delayed a year.
    So it is blatantly obvious that Valve was told not to make DX9.0 work well on the NVIDIA FX.
  • moletus - Sunday, November 21, 2004 - link

    #40, you are so wrong, wrong, and wrong again. What kind of idiot game developer wouldn't code as well as possible, regardless of who gave them what development money? There are plenty of NVIDIA cards out there, and I'm quite sure their owners want to play HL2 too.

    It's all about making $$$, so...
  • Cybercat - Sunday, November 21, 2004 - link

    #40, not necessarily. The 6200 is typically found to perform close to the X300. Only a few times will it meet up with the X600 Pro's standards.

    http://anandtech.com/video/showdoc.aspx?i=2238&...
  • abakshi - Sunday, November 21, 2004 - link

    *other (not over lol)
