Half-Life 2

We have all been waiting for this title to hit the market, as it pushes DX9 nearly to its limit. The pixel shaders employed in this title alone are enough to inspire awe. On cost, this engine is also likely to be more attractive to game developers: the DOOM III engine license runs nearly one million US dollars, while Valve has taken a different approach, slashing the upfront price and asking for higher royalties on the license to make up the difference.

From a code standpoint, NVIDIA's NV3x has to take a lower code path in HL2, one that Valve had to build specially into its design so that reasonable gameplay could be achieved on NV3x-based cards. You can read more details on Half-Life 2 in Anand's coverage.
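To make the idea of a "lower code path" concrete, here is a minimal sketch of how an engine might pick a rendering path from detected hardware capabilities. The type and function names (GpuCaps, SelectRenderPath, fastFloatShaders) and the selection rules are hypothetical illustrations for this discussion, not Valve's actual code; the real engine performs a far more detailed capability and performance check.

    #include <string>

    // Hypothetical illustration of per-hardware code path selection.
    // None of these names come from the Source engine itself.
    enum class RenderPath { DX70, DX80, DX81, DX90_Mixed, DX90_Full };

    struct GpuCaps {
        std::string vendor;      // e.g. "ATI", "NVIDIA"
        int  pixelShaderMajor;   // highest pixel shader model supported
        bool fastFloatShaders;   // can full-precision DX9 shaders run quickly?
    };

    RenderPath SelectRenderPath(const GpuCaps& gpu) {
        if (gpu.pixelShaderMajor < 1) return RenderPath::DX70;
        if (gpu.pixelShaderMajor < 2) return RenderPath::DX81;
        // A DX9-class part that struggles with full-precision shaders
        // (the NV3x case described above) gets a mixed path with partial
        // precision and some DX8-class shaders; otherwise, full DX9.
        return gpu.fastFloatShaders ? RenderPath::DX90_Full
                                    : RenderPath::DX90_Mixed;
    }

The point is simply that a DX9-capable chip which cannot run full-precision pixel shaders quickly can be routed to a mixed or DX8-class path rather than the full DX9 path.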

In our mobile coverage, we forced the DX9 code path, 32-bit color depth, trilinear filtering, and other high settings that NVIDIA hardware would not enable automatically. For this review, we decided to keep AA and AF turned off, because Half-Life 2 is a very intensive game, with pixel shader use to match. The scores we are reporting, however much they may raise eyebrows, are reflective of actual gameplay. The version of Half-Life 2 we used was Source v0.4.



Shockingly, only 3 out of 8 times was the GeForce FX Go5650 able to surpass the 10 fps barrier. Even at its best, the GeForce FX Go5650 was only able to narrow the gap with the Mobility Radeon 9600 to 234%. "Slow as a pregnant yak" was a phrase that we often heard in reference to these scores. While we wouldn't put it in exactly that context, the Mobility Radeon 9600 beats the GeForce FX Go5650 "no questions asked" in every one of these scenarios, with the largest difference being 415% (36.6 fps vs. 7.1 fps).
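To be clear about what those percentages mean, they are relative differences measured against the Go5650's score. A minimal sketch of the arithmetic for the worst case quoted above (the function name is ours, purely for illustration):

    #include <cstdio>

    // Relative difference of the faster score over the slower one, in percent.
    double RelativeDifferencePct(double faster, double slower) {
        return (faster - slower) / slower * 100.0;
    }

    int main() {
        // Mobility Radeon 9600 at 36.6 fps vs. GeForce FX Go5650 at 7.1 fps.
        std::printf("%.0f%%\n", RelativeDifferencePct(36.6, 7.1)); // prints 415%
        return 0;
    }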

What more can be said? The Mobility Radeon 9600 passes through every benchmark scenario comfortably and never drops below 30 fps, and that is putting it lightly. For a mobile system with a Mobility Radeon 9600, Half-Life 2 won't be an intimidating title. Like its desktop counterpart, the Mobility Radeon 9600 has the wits to match even NVIDIA's best offering in this benchmark. Meanwhile, the GeForce FX Go5650, per Valve's recommendation, will need to run a lower code path (DX8, with lower quality settings) to attain reasonable gameplay.

Comments (47)

  • Anonymous User - Monday, September 15, 2003 - link

    That 30 FPS-eye-limit rubbish always comes up in these sorts of threads - I can't believe there are people who think they can't tell the difference between a game running at 30 FPS and 60 FPS.

    Anyway, I'd like to ask about the HL2 benches - you mention the 5600 is supposed to drop down a code path, but don't specifically say which one was used in the tests. DX8? Mixed? The charts say "DX 9.0", so if that was indeed used then it's interesting from a theoretical point of view but doesn't actually tell us how the game will run on such a system, since the DX8 code path is recommended by Valve for the 5200/5600.
  • Anonymous User - Monday, September 15, 2003 - link

    The "car wheels not rotating right" effect is caused by aliasing, and you'll still get that effect even if your video card is running at 2000fps.

    Besides, you're limited by your monitor's refresh rate anyhow.
  • Anonymous User - Monday, September 15, 2003 - link

    #14 that is incorrect and totally misleading. Humans can tell the difference up to about 60fps (sometimes a little more).

    Have you ever seen a movie where the car's tires don't seem to rotate right? That's because at 29.97fps you notice things like that.

  • Anonymous User - Monday, September 15, 2003 - link

    #13, unless you're not human, the human eye can't see a difference at 30fps and up. 60fps is a goal for users because, at that point, even if there is a slowdown to 30fps you can't see the difference.
  • Anonymous User - Monday, September 15, 2003 - link

    Overall, I liked the article...

    However, whilst I understand that you wanted to run everything at maximum detail to show how much faster one chipset may be than another, it would have been helpful if some lower resolution benchmarks could have been thrown in.

    After all, what good does it do you to know that chip B may perform at 30fps whilst chip A performs at 10fps if both are unplayable?

    I don't mind whether I can play a game at an astoundingly good detail level or not - I care more about whether I can play the game at all! :)

    In the end, we'd all love to be able to play all our games in glorious mega-detail looks-better-than-real-life mode at 2000fps, but it's not always possible.

    A big question should be: can I play the game at a reasonable speed with merely acceptable quality? And that's the sort of information that helps us poor consumers! :)

    Thanks for your time and a great article.
  • Sxotty - Monday, September 15, 2003 - link

    Um, do you mean floating point (FP16) or 16-bit color? As opposed to FP32 on the NV hardware - ATI's doesn't even support FP32 (which is not 32-bit color); ATI supports FP24. LOL, and the no-fog thing was just funny - that is NV's fault. It's not like it had to be dropped; they did it to gain a tiny fraction of performance.
  • rqle - Monday, September 15, 2003 - link

    I really like this comment:

    "Don’t forget that programmers are also artists, and on a separate level, it is frustrating for them to see their hard work go to waste, as those high level settings get turned off."

    Hope future articles on graphics cards/chipsets will offer more insight on how the developers may feel.
  • Anonymous User - Monday, September 15, 2003 - link

    Please note: the Warcraft benchmark was done under Direct3D. NVIDIA cards perform badly under Direct3D with Warcraft, whereas ATI does a very fine job. It's a completely different story, however, if you start Warcraft 3 with the warcraft.exe -opengl command. So please take note of that; only very few people know about this anyway. My Quadro4 700 Go GL gets about +10fps more under OpenGL compared to D3D!
  • Pete - Monday, September 15, 2003 - link

    Nice read. Actually, IIRC, UT2003 is DX7, with some DX7 shaders rewritten in DX8 for minimal performance gains. Thus, HL2 should be not only the first great DX9 benchmark, but also a nice DX8(.1) benchmark as well.
  • Anonymous User - Monday, September 15, 2003 - link

    So Valve let you guys test out Half-Life 2 on some laptops, eh? Very nice. (Great review too, well written.)
