If you haven't read Part 1 of our Half Life 2 GPU Roundup, click here to read it before continuing on with this article.

The golden rule of Half Life 2 is this: regardless of what sort of graphics card you have (within reason), the game will run well, but at varying levels of image quality.  Here's an example: in our at_canals_08 benchmark at 1280 x 1024, the ATI Radeon 9700 Pro averages 54.2 frames per second.  A GeForce4 MX440 averages 45.6 frames per second, which is remarkably close considering the 9700 Pro was one of ATI's most powerful GPUs and sold for $400, while the GeForce4 MX is basically a GeForce2 MX.  Both of these cards were run at their maximum detail settings, but here's where the two cards differ: the best image quality settings the GeForce4 MX can achieve are significantly lower than what the Radeon 9700 Pro can achieve.  It is this type of situation that lays the foundation for our comparison here today.

When it comes to developing games, the biggest thing publishers look for is minimum system requirements.  The lower the minimum system requirements, the more copies a publisher can sell.  But the problem with very low minimum system requirements is that you end up with a game that doesn't look too great on higher end hardware.  Valve has spent a great deal of time ensuring that multiple rendering paths were included with Half Life 2, not only to offer solid performance on low end graphics cards, but also to deliver spectacular image quality on high end GPUs.  The three codepaths that we will be looking at today are Valve's DirectX 7, DirectX 8/8.1 and DirectX 9 paths in Half Life 2.

All of the GPUs we compared in Part 1 of our Half Life 2 performance guide not only default to the DirectX 9 codepath, but also perform very well under it.  We will show a comparison between the DX9 and DX8 codepaths shortly, but rest assured that if you can run Half Life 2 in DX9 mode, then you definitely should, as it offers the best image quality you can get.

Half Life 2 performs a system-wide autodetect that automatically selects the best rendering path and image quality settings for your hardware. You can find out which rendering path you are using by looking at the video options control panel within Half Life 2:

The "Hardware DirectX level" field indicates which rendering path you are currently using. You can specify alternate rendering paths with the following command line switch: -dxlevel xx, where xx is 90 for DirectX 9, 81 for DirectX 8.1, 80 for DirectX 8.0 and 70 for DirectX 7.0. You can specify whatever DX level you'd like, but if your hardware doesn't support it, you won't see any of its benefits. For our tests here today we used the DX9, DX8 and DX7 rendering paths. When possible we used the DX8.1 rendering path.
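For example, to force the DirectX 8.1 rendering path you could append the switch to your Half Life 2 shortcut target or to the game's launch options in Steam. A typical command line (the exact executable name and install path will vary from system to system) would look like this:

    hl2.exe -dxlevel 81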

The point of this article is to compare both the image quality and the performance of the more mainstream DirectX 8 and DirectX 9 GPUs, and what better way to start than by looking at the difference in image quality between the DX8 and DX9 codepaths...

Comments

  • abakshi - Sunday, November 21, 2004 - link

    Just a note - the last graph on page 7 seems to be a bit messed up: the GeForce 6200 is shown at 82.3 FPS, higher than all of the other cards, while the data chart and line graph show it at 53.9 FPS.
  • KrikU - Sunday, November 21, 2004 - link

    Why can't we see benchmarks with AA & AF enabled on mainstream graphics cards? HL2's engine is largely CPU limited, so AA & AF tests would be really welcome!
    I'm playing with a Ti4400 (o/c to Ti4600 speeds) with 2x AA & 2x AF! This is the first new game where I can use these image quality enhancements with my card!
  • T8000 - Sunday, November 21, 2004 - link

    Half Life 2 seems to be designed around the Radeon 9700.

    Because Valve seems to have made certain promises to ATI, they were not allowed to optimize any GeForce for DX9.

    This also shows with the GF6200, which should be close to the R9600 but is not, due to the Radeon 9700-optimized codepath.

    Luckily, Valve was hacked, preventing this game from messing up the marketplace. Now almost any card can play it, and Nvidia may even be tempted to release a patch in their driver to undo Valve's DX9 R9700 cheats and make the game do DX9 the right way for FX owners, without sacrificing any image quality. Just to prove Valve wrong.
  • draazeejs - Sunday, November 21, 2004 - link

    Well, I like HL2 a lot, much more so than the pitch-black, ugly-fuzzy texture D3. But, honestly - to me it looks exactly like Far Cry, engine-wise. Is there any difference?
    Respect to the level designers of HL2; no other game nowadays comes even close to that sort of detail and scenery. Also, I think the physics of the people and faces and the AI are by far superior. And the Raven-yard is much more scary than the whole of D3 :)))
  • kmmatney - Sunday, November 21, 2004 - link

    [sarcasm] Oh, and have fun running those DX games on other platforms without emulation. [/sarcasm]

    Obviously, this game isn't meant for other platforms, and that's fine by me. I think the original Half-Life had an OpenGL option, but it sucked (at least on my old Radeon card). In general, OpenGL has always been a pain, dating back to the old miniGL driver days. In my experience, when playing games that had either a DX or OpenGL option, the DX option has usually been more reliable. It could be because I usually have ATI-based cards...
  • kmmatney - Sunday, November 21, 2004 - link

    I didn't mean that DX literally "looks" better than OpenGL, I meant that it seems to be more versatile. Here's a game that can be played comfortably over several generations of video cards. You have to buy a new one to play D3 at a decent resolution. The HL2 engine seems to have room to spare in terms of using DX9 features, so the engine can be further enhanced in the future. I would think this game engine would be preferred over the Doom3 engine.
  • oneils - Sunday, November 21, 2004 - link

    #15, Steam's site (under "updates") indicates that the stuttering is due to a sound problem, and that they are working on a fix. Hopefully this will help you.

  • vladik007 - Saturday, November 20, 2004 - link

    " I'm missing words to how pathetic that is. "

    1st, my post was no. 2, NOT no. 3.
    2nd, unlike many people I don't have time to work on my personal computers all the time. IF I don't upgrade this holiday season, I'll possibly have to wait until summer vacation. And you don't see nForce4 out now, do you?
    3rd, no, it's not pathetic to follow something that's never failed me. Ever heard of a satisfied customer? Well, Abit has always treated me very well: RMA process, cross-shipping, BIOS updates, good support on official forums... etc. Why on earth should I change?
    4th, got it?
  • moletus - Saturday, November 20, 2004 - link

    I really would like to see some ATI 8500-9200 results too.
  • Pannenkoek - Saturday, November 20, 2004 - link

    #18: How a game looks depends on which features of the video cards are used, and on the art. It's not Direct3D vs. OpenGL; the video cards are the limiting factor. Doom III is just too dark, and that's because of an optimization used in the shadowing. ;-)

    #26: Surely you mean "#2", I'm all for AMD. Not that my current computer is not pathetic compared with what's around nowadays...
