A Pretty Decent DirectX 7

The second most popular Half Life 2 GPU according to Valve’s own statistics is the GeForce4 MX.  As a quick refresher, the GeForce4 MX is basically a GeForce2 MX with an updated memory controller, so its feature set is not DirectX 8 compliant like the GeForce4 Ti's; rather, it is more of a DX7 part.  So what do you lose if you’ve got an older card like a GeForce2, GeForce4 MX or original Radeon?

Of course, the water quality in DX7 mode is similar to what we saw in DX8 mode, but much larger sacrifices are made elsewhere in DX7 mode.

For starters, features like bump mapping are gone, making the levels look significantly worse than when using the DX8 path.  The screenshot below shows Valve’s DX8 path at work, mouse over to see what you lose by going to DX7:



Hold mouse over image to see DX7 mode

While you could argue that there’s not too big of a difference between DX9 and DX8 (other than the water), everything looks significantly worse in DX7 mode. 
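
For readers curious about what "bump mapping" actually buys you, here is a minimal sketch of the distinction in spirit. This is not Valve's shader code; the vector type, function names and the simple Lambert (N·L) lighting model are illustrative assumptions, and it is written as plain C++ for readability even though the real work happens on the GPU. The point is only that the DX7 fixed-function path lights geometry per vertex, while the DX8 path can fetch a perturbed normal from a texture and light every pixel individually, which is what gives walls and props their fine surface detail.

    // Illustrative sketch only -- not Source engine or Valve shader code.
    #include <algorithm>

    struct Vec3 { float x, y, z; };

    float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // DX7-style fixed function: lighting is evaluated per vertex from the
    // geometric normal and interpolated across the triangle, so a flat wall
    // gets one smooth shade and fine surface detail is lost.
    float shadeDx7Vertex(const Vec3& vertexNormal, const Vec3& lightDir) {
        return std::max(0.0f, dot(vertexNormal, lightDir));
    }

    // DX8-style per-pixel shading: the normal comes from a bump/normal map
    // sampled at each pixel, so bricks, grooves and dents on the same flat
    // polygon each catch the light differently.
    float shadeDx8Pixel(const Vec3& normalMapSample, const Vec3& lightDir) {
        return std::max(0.0f, dot(normalMapSample, lightDir));
    }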

The other big change is that in DX7 mode the draw distances are significantly reduced, so what you notice is that certain objects slowly fade in as you get closer to them (a sketch of how such a distance fade can work follows the screenshots below).  For example, staring into the distance we see nothing in front of the chain link fence:

Stepping forward we begin to see something faint fade in:

Moving in a little closer we see the green dumpster completely:

Another few steps and we see two more objects faintly appear:

A little further and we see two trashcans appear as well:
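
A fade-in like the one above can be expressed as a simple distance-based alpha ramp. The sketch below is a minimal illustration of that idea only; it is not Source engine code, and the function name and the fadeStart/fadeEnd parameters are made-up stand-ins for however the engine actually tunes its draw distances.

    // Illustrative sketch of distance-based fade-in -- not Valve's code.
    // Returns 0.0 (not drawn) beyond fadeEnd, 1.0 (fully opaque) inside
    // fadeStart, and a linear blend in between, which is why the dumpster
    // and trash cans appear faintly at first and solidify as you approach.
    float fadeAlpha(float distanceToCamera, float fadeStart, float fadeEnd) {
        if (distanceToCamera >= fadeEnd)   return 0.0f; // too far: skip it
        if (distanceToCamera <= fadeStart) return 1.0f; // close enough: solid
        return (fadeEnd - distanceToCamera) / (fadeEnd - fadeStart);
    }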

Running in DX7 mode does sacrifice quite a bit, but the game is extremely playable, as you are about to see.  If you have the ability to run in a higher quality mode then you definitely should, but what’s most important is that even in DX7 mode Half Life 2 looks better than any other DX7 title, and it runs and looks better than newer games on graphics hardware that’s now four years old.  You can download all of the screenshots on this page in an uncompressed format here.

Let’s see how well the GeForce4 MX runs in DirectX 7 mode…


62 Comments


  • Lord Banshee - Monday, November 22, 2004 - link

    Sorry about above post,

    #50, i hope you are only talking about the nv3x and below cores? the nv4x core is almost as good as the newest radeon in rendering dx9 games.

    On a side note, does anybody care about the reason why doom3's models and textures aren't as good as half-life 2's? One reason being the amount of GPU processing power the lighting system and the special effects take. I am sure if everybody had a 6800 Ultra then ID would have made the textures in doom3 better and used more high polygon models.

    But we all don't, so they instead used a lot of normal mapping (the future in gaming) and a brand new lighting system never seen in games before.

    But again you must see that the doom3 engine has the ability to use huge textures and models, but it is game dependent. Not all games that use this engine will have the same lighting effects and such; they might want to show off their texture skills instead, it is the game company's choice.

    What doom3 fails at is outdoor environments; this is where the Source engine has them beat (so they say, i have yet to play half-life 2).

    But it looks like the Unreal3 engine will be the best of both worlds, though that's another 2 years away most likely.
  • Lord Banshee - Monday, November 22, 2004 - link

  • nserra - Monday, November 22, 2004 - link

    #40 T8000 ???!?!

    So why do the 6600 and 6800 perform very well and the 6200 so badly? Aren’t they all the same card? Your post is pointless.

    Luckily Valve was hacked? Are you kidding? How many people, myself included, bought a piece of crap like the 5600, which performs so badly not only in this game but in many others. TOO BAD IT WAS HACKED!!!

    Sure, any card can play it today just like one year ago, but not the right way!!!!

    I don’t know, but I bet when more DX9.0 games come out the difference between Ati and nvidia will be bigger. Unless there is an option to enable the fast FP16 mode, providing lower image quality like in Far Cry.
  • nserra - Monday, November 22, 2004 - link

    We all know that those who bought the Ati 9xxx did a better job than the ones who bought the FX5xxx series cards.

    Now what about an 8500 vs GF3/4?
    And some 9000 cards too?

    DX8.1 is different from DX8.0; I would like to know if the 8500/9000 was the better buy over the geforce3/4, looking at it today.

    It’s really important, since the GFfx sucks today but didn't 2 years ago; who knows what will happen 2 years from now with the 9xxx and 6xxx.

    Why does the 6200 perform so badly, and the 6600 and 6800 so well?
  • dderidex - Monday, November 22, 2004 - link

    FYI, the compare image on this page (http://www.anandtech.com/video/showdoc.aspx?i=2281...) for the water is all wrong. I don't know what they were using for the 'DX8' sample of the water reflection, but that's not what it looks like at all on a GeForce FX card. It looks virtually indistinguishable from the DX9 sample, only with noticeably less smooth transitions with the coastal terrain (not shown in that shot).

    Unless AT intentionally disabled world reflections when switching to DX8 mode? But, I have a hard time believing they would be so biased.
  • blckgrffn - Sunday, November 21, 2004 - link

    8500/9100 & 9000/9200 & fx5200/5700 & Radeon 7000/7500 & GF3/GF2 benches please! There are a lot of these cards out there and I am curious!
  • TheRealSkywolf - Sunday, November 21, 2004 - link

    45, ati contributed a big cut of the budget for half life 2. That's why it got delayed 1 year.
    So it is blatantly obvious that valve was told not to make dx 9.0 work well for the nvidia fx.
  • moletus - Sunday, November 21, 2004 - link

    #40, you are so wrong, wrong and wrong again. What kind of idiot game developer wouldn't code as well as possible, regardless of who gave them what development money? There are plenty of nvidia cards out there and i'm quite sure their owners want to play HL2 too.

    It's all about making $$$, so...
  • Cybercat - Sunday, November 21, 2004 - link

    #40, not necessarily. The 6200 is typically found to perform close to the X300. Only a few times will it meet up with the X600 Pro's standards.

    http://anandtech.com/video/showdoc.aspx?i=2238&...
  • abakshi - Sunday, November 21, 2004 - link

    *other (not over lol)
