Benchmarking Half Life 2

Unlike Doom 3, Half Life 2 has no built-in benchmark demo, but it does offer full benchmark functionality.  To run a Half Life 2 timedemo, you must first modify your Half Life 2 shortcut to include the -console switch and then launch the game.

Once Half Life 2 loads, simply type timedemo followed by the name of the demo file you would like to run.  All Half Life 2 demos must reside in the C:\Program Files\Valve\Steam\SteamApps\username\half-life 2\hl2\ directory. 
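
If you would rather script the process than type the command by hand, a minimal launcher might look like the sketch below. It assumes the retail install path from above and uses the common Source engine convention of passing "+timedemo <name>" on the command line instead of typing it into the console; both the path and that convention are assumptions you should verify against your own install.

```python
import subprocess
from pathlib import Path

# Assumed retail install path; substitute your own Steam account name for "username".
HL2_EXE = Path(r"C:\Program Files\Valve\Steam\SteamApps\username\half-life 2\hl2.exe")

def run_timedemo(demo_name: str) -> None:
    """Launch Half Life 2 with the console enabled and queue a timedemo.

    Passing "+timedemo <name>" on the command line is a common Source engine
    convention for starting a demo without typing it into the console; verify
    it against your own build. The call blocks until you quit the game.
    """
    subprocess.run(
        [str(HL2_EXE), "-game", "hl2", "-console", "+timedemo", demo_name],
        check=True,
    )

if __name__ == "__main__":
    run_timedemo("anandtech_canals")  # hypothetical demo file name (without .dem)
```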

Immediately upon the game's launch, we spent several hours playing through the various levels of Half Life 2, studying them for performance limitations as well as for how representative they were of the rest of the game.  After our first pass, we narrowed the field to 11 levels that we felt would make good, representative benchmarks of gameplay throughout Half Life 2, and then trimmed that list to just five: d1_canals_08, d2_coast_05, d2_coast_12, d2_prison_05 and d3_c17_12.  We have put together a suite of five demos based on these levels that, taken together, we believe are representative of Half Life 2 gameplay.  You can download a zip of our demos here. As we mentioned earlier, ATI is distributing some demos of their own, but we elected not to use them in order to remain as fair as possible.
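
For readers who want to run the whole suite in one go, a rough automation sketch follows. The demo file names are placeholders derived from the level names above (use the names from the zip instead), and the script assumes the game is started with -condebug so that the timedemo summary ends up in hl2\console.log; the flag, the log location and the exact wording of the summary line are all assumptions rather than anything guaranteed by our demo package.

```python
import re
import subprocess
from pathlib import Path

# Assumed install location; substitute your own Steam account name for "username".
HL2_DIR = Path(r"C:\Program Files\Valve\Steam\SteamApps\username\half-life 2")
HL2_EXE = HL2_DIR / "hl2.exe"
CONSOLE_LOG = HL2_DIR / "hl2" / "console.log"

# Placeholder demo names based on the five level names; use the file names
# from the downloaded zip instead.
DEMOS = ["d1_canals_08", "d2_coast_05", "d2_coast_12", "d2_prison_05", "d3_c17_12"]

# A timedemo summary line typically reads something like
# "2134 frames 25.14 seconds 84.87 fps ...", but the wording may differ
# between builds, so treat this pattern as a guess.
RESULT = re.compile(r"\d+\s+frames\s+[\d.]+\s+seconds\s+([\d.]+)\s+fps")

def run_suite() -> dict:
    """Run each demo in turn and collect the average fps reported in the log."""
    results = {}
    for demo in DEMOS:
        if CONSOLE_LOG.exists():
            CONSOLE_LOG.unlink()  # start each run with a fresh log
        # -condebug (assumed flag) mirrors console output to console.log.
        # The call blocks until you quit the game once the demo has finished.
        subprocess.run(
            [str(HL2_EXE), "-game", "hl2", "-console", "-condebug",
             "+timedemo", demo],
            check=True,
        )
        match = RESULT.search(CONSOLE_LOG.read_text(errors="ignore"))
        if match:
            results[demo] = float(match.group(1))
    return results

if __name__ == "__main__":
    for demo, fps in run_suite().items():
        print(f"{demo}: {fps:.1f} fps")
```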

When benchmarking Half Life 2 we discovered a few interesting things:

Half Life 2's performance is generally shader (GPU) limited when outdoors and CPU limited when indoors. This rule of thumb breaks down if you run at unreasonably high resolutions (resolutions too high for your GPU) or if you have a particularly slow CPU or GPU, but for the most part, take any of the present-day GPUs we are comparing here and you'll find the above statement to be true.
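
A quick way to check which side is limiting a given level is to compare results at two resolutions: a GPU (shader) limited scene slows down noticeably as the pixel count grows, while a CPU-limited one barely moves. The heuristic below is only a sketch; the 0.85 cutoff is an illustrative assumption, not a measured threshold.

```python
def likely_bottleneck(fps_low_res: float, fps_high_res: float,
                      scaling_threshold: float = 0.85) -> str:
    """Rough heuristic: if the higher-resolution result stays within ~85% of
    the lower-resolution one, the scene is probably CPU-limited; a larger drop
    points at the GPU. The 0.85 cutoff is an illustrative assumption."""
    ratio = fps_high_res / fps_low_res
    return "CPU-limited" if ratio >= scaling_threshold else "GPU-limited"

# An indoor run that barely changes between 1024x768 and 1600x1200:
print(likely_bottleneck(92.0, 88.5))   # -> CPU-limited
# An outdoor, shader-heavy run that drops sharply at the higher resolution:
print(likely_bottleneck(95.0, 61.0))   # -> GPU-limited
```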

Using the flashlight can result in a sizable performance hit if you are already running close to the maximum load of your GPU.  The reason is that the flashlight adds another set of per-pixel lighting calculations to anything you point it at, increasing the length of any shaders running at that time.


The flashlight at work
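
As a back-of-the-envelope illustration of why the hit grows with how loaded the GPU already is, the toy model below simply adds a fixed amount of extra per-pixel lighting work to the pixels the flashlight touches and scales frame rate with the average shader cost. Every number in it is made up for illustration; none of the constants come from the game.

```python
def estimated_fps_with_flashlight(base_fps: float,
                                  base_shader_cost: float = 40.0,
                                  flashlight_extra_cost: float = 12.0,
                                  lit_fraction: float = 0.5) -> float:
    """Toy model: in a GPU-limited scene, frame rate scales inversely with the
    average per-pixel shader cost. All constants are illustrative, not measured."""
    avg_cost = base_shader_cost + flashlight_extra_cost * lit_fraction
    return base_fps * base_shader_cost / avg_cost

# With half the screen lit by the flashlight, a 60 fps scene drops to roughly 52 fps.
print(round(estimated_fps_with_flashlight(60.0), 1))
```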

Levels with water or other reflective surfaces generally end up being quite GPU-intensive, as you would guess, so we made it a point to include some water/reflective shaders in our Half Life 2 benchmarks.

But the most important thing to keep in mind about Half Life 2 performance is that, interestingly enough, we didn't test a single card today that we felt was slow.  Some cards were able to run at higher resolutions, but at a minimum, 1024 x 768 was extremely playable on every single card we compared - which is good news for those of you who just upgraded your GPUs or who have made extremely wise purchases in the past.

For our benchmarks we used the same settings on all GPUs:

Our test platforms were MSI's K8N Neo2 (nForce3) for AGP cards and ASUS' nForce4 motherboard for PCI Express graphics cards. The two platforms are comparable in performance so you can compare AGP numbers to PCI Express numbers, which was our goal. We used an Athlon 64 4000+ for all of our tests, as well as 1GB of OCZ DDR400 memory running at 2-2-2-10.


79 Comments


  • nthexwn - Wednesday, November 17, 2004 - link

    In reply to Jeff7181 (#14):

    I have a Radeon 9700 Pro with the 4.11 drivers and I'm having the same problems with my LCD (Samsung Syncmaster 710T @ 1280x1024)! The refresh rate is set to 70Hz and with vsync I either get 35 fps (interleaving frames to every other refresh) or 70 fps (matching frames to the refresh rate)... Since our cards are from different companies, I'm guessing it's a problem with the game itself...

    I've tried both triple buffering and alternating the DVI frequency (don't know if that would even help) and it doesn't solve the problem...

    It's rather irritating because I actually PLAY my games instead of just gawking over the benchmark scores (I'm one of those lucky people that has some free time!), and the screen looks like a Freddy Krueger job without vsync on! :*(

    Also, when the game switches between 70 and 35 there is a bit of a stall, which, even though 35 fps is still playable, can ruin online play in CS:S! Especially since player models running onto the screen tend to temporarily stress the card enough to make it hitch up on me, in which time said player usually caps me in the head and moves on! :*(

    I suppose we could type "fps_max 35" or "fps_max 42.5" (assuming it accepts floating-point values; you could just set your monitor to 80Hz and set fps_max to 40) in the console (don't type the quotes), but limiting the framerate to those values isn't what I'd call an ideal solution...

    Oh well...

    GREAT GAME! GREAT HARDWARE! GREAT WEBSITE!
  • smn198 - Wednesday, November 17, 2004 - link

    I've got a 9800SE 128MB (256-bit) card. Would like to know how that compares. I fried my 9500 Pro making it into a 9700 Pro, so it won't do 3D anymore (artifacts, then crashes) :(

    Which of the graphics cards being tested would have similar performance to a 9800SE (256-bit RAM)?
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    "The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison."

    I should post you a picture.. the x700XT is available at Future Shop in Canada and has been for about a week now.. :)

    Although not my cup of tea, they are selling quite well, I'm told.

    But then again, ATi cards always sell well in Canada.. so well that ATi usually cannot fill the demand (with the USA taking so many of the chips, lol).
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    Well... for one thing the numbers are not even close to what other sites are showing and secondly where's the x800XT PE?

    It's the card I own (albeit mine is clocked at 650/625).

    It's good to see ATi in the lead by such significant margins and that the game can be easily played at 1600x1200 with 4xAA and 8xAF on an x800XT PE. Also great to see that the game runs well without the final HL2 drivers from ATi (yeah, the 4.12's are only interim; the full 4.12's are going to be fully optimised).

    The biggest surprise is how well the 6600GT performed; although it lost convincingly to the x700XT, it still put on a good showing.

    BTW, other sites are showing the x800 Pro beating the 6800 Ultra with the same drivers, albeit using an Athlon FX-55.

    Meh,

    Looks like ATi can probably offer even greater performance at lower resolutions, given that the 1600x1200 results are so close to the lower-resolution ones.
  • SMT - Wednesday, November 17, 2004 - link

    Anand,

    My flashlight worked throughout Nova Prospekt. Are you sure yours wasn't available?
  • abravo01 - Wednesday, November 17, 2004 - link

    Was the 6800GT used in the test 128MB or 256MB? Huge price difference around here: if it was the 128MB, then it's definitely the best buy.
  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    The AA benchmarks actually used 8X Aniso as well.

    Take care,
    Anand
  • OriginalReaper - Wednesday, November 17, 2004 - link

    On pages 8 and 9 you discuss AA and AF, yet on pages 10, 11, 12, and 13, you only list 4xAA being used. Did you forget to put 8xAF in the results or did the benchmark only use 4xAA?

    Thanks.
  • CU - Wednesday, November 17, 2004 - link

    I think an investigative article that shows when what hardware becomes a bottleneck for HL2 would be great. I look forward to it.

    "Any other requests?

    Take care,
    Anand"

    Can you send me all the hardware when you are done testing HL2. :-)
  • Cybercat - Wednesday, November 17, 2004 - link

    Nice, I wanted to know how the 9550 performed, mostly to see how it compares with the FX 5200. Is that the 128-bit or the 64-bit memory interface version? I'm pretty excited about the 6200 as well, since this is finally a budget card that performs better than the Ti4200. The performance leap this generation is spectacular.

    Overall, I think you left the other guys in the dust with this one.

    And on the subject of the X700 Pro, it's kind of an odd card, because with its price range (the 128MB version at about $180, 256MB at $200), it's unclear what card it's competing with. It acts like a fifth wheel in this way. People would much rather buy a X700XT or 6600GT instead since they're in the same general price range. Only thing is, like you said the X700XT isn't widely available yet, making the X700 Pro a stopgap for now, and giving NVIDIA the clear win in the mainstream market until ATI can start shipping out the more competitive card. That's the only thing saving the X700 Pro right now from being completely pointless.
