GeForce4 MX DirectX 7 Performance

In our at_canals_08 demo the numbers speak for themselves: using the DX7 path, even the GeForce4 MX440 is extremely playable under Half Life 2:

[Graph: Half Life 2 DirectX 7 GeForce4 MX Performance]

Performance looks even better in at_coast_05.dem:

[Graph: Half Life 2 DirectX 7 GeForce4 MX Performance]

[Graph: Half Life 2 DirectX 7 GeForce4 MX Performance]

[Graph: Half Life 2 DirectX 7 GeForce4 MX Performance]

Even in our most stressful demo, the GeForce4 MX does just fine.

[Graph: Half Life 2 DirectX 7 GeForce4 MX Performance]

While it may not look as good as the DX8 and DX9 codepaths, the DirectX 7 support in Half Life 2 is nothing short of incredible. Owners of older cards should upgrade their CPUs as needed, but needn't upgrade their graphics cards unless they want better image quality; in terms of speed, even something as slow as a GeForce4 MX will do just fine.


62 Comments

  • ukDave - Friday, November 19, 2004 - link

    Not that I'm saying that is the reason it performs so badly; it is due to its poor implementation of DX9.0. I think the whole nV 5xxx line needs to be swept under the carpet because I simply can't say anything nice about it :)
  • ukDave - Friday, November 19, 2004 - link

    Doom3 was optimized for nVidia, much like HL2 is for ATi.
  • mattsaccount - Friday, November 19, 2004 - link

    How can a 5900 be so poor at DX9-style effects in HL2, and excel at an (arguably) more graphically intense game like Doom 3? The difference can't be due only to the API (DX vs. OGL), can it?
  • ZobarStyl - Friday, November 19, 2004 - link

    Doh login post: FYI the bar graphs on page six are both the DX8 pathway.
  • Cybercat - Friday, November 19, 2004 - link

    Good article. I'm a little disappointed in the 6200's performance though.
  • thebluesgnr - Friday, November 19, 2004 - link

    Hi!

    Have not read the article yet but I'd like to ask one thing:

    Does the Radeon 9550 tested have a 64-bit or 128-bit memory interface? From your numbers I'm sure it's 128-bit, but I think some people might order the cheapest (=64-bit) version after reading the article, so it would be nice to see it mentioned.

    Along the same lines, I would like to see AnandTech mention the GPU and memory clocks for all the video cards benchmarked.

    btw, the X300SE was tested on a platform with the same processor as the other AGP cards, right?

    Thank you.
  • shabby - Friday, November 19, 2004 - link

    Holy crap my ti4600 can muster 60fps in hl2 ahahaha.
  • skunkbuster - Friday, November 19, 2004 - link

    Yikes! I feel sorry for those people using video cards that only support DX7.
  • Pannenkoek - Friday, November 19, 2004 - link

    I wonder if "playability" is merely based on the average framerates of demos, or whether somebody actually tried to play the game with an old card. Counter Strike became barely playable at less than 40 fps later in its life, even while average framerates could look "good enough", and it used to run smoothly at the same framerate in older versions.
