Half-Life 2: Episode One Performance

Episode One of the Half-Life 2 series makes use of recent Source engine updates to include Valve's HDR technology. While some developers have shipped HDR implementations that preclude antialiasing (even on ATI cards), Valve put a high value on building an HDR implementation that everyone can use with whatever settings they want. Consistency of experience is usually not a priority for developers pushing the bleeding edge of technology, so we are very happy to see Valve going down this path.

We use the built-in timedemo feature to benchmark the game. Our timedemo consists of a protracted rocket launcher fight featuring plenty of debris and pyrotechnics. The Source engine's timedemo feature is more like the nettimedemo of id Software's Doom 3 engine, in that it plays back more than just the graphics. In fact, Valve includes some fairly intensive diagnostic tools that will reveal almost everything about every object in a scene. We haven't found a good use for this in the context of reviewing computer hardware, but we're keeping our options open.

The highest visual quality settings possible were used, including the "reflect all" setting, which is not enabled by default. Antialiasing was left disabled for this test, and anisotropic filtering was set at 8x. While the Source engine is known for delivering great framerates on almost any hardware setup, we find the game isn't as enjoyable if it isn't running at a minimum of 30fps. This is attainable on most cards even at the highest resolution we tested, and thus our target framerate is a little higher in this game than in others.

Half-Life 2: Episode One

Most of the solutions scale similarly in Half-Life 2: Episode One, with the possible exception of the 7900 GTX SLI setup hitting a bit of an NVIDIA driver-induced CPU limitation at 1280x1024. We can't really complain, as scoring over 200 fps is an accomplishment in itself. With scores like these across the board, there's no reason not to run with AA enabled.

Half-Life 2: Episode One

With even the slowest tested solution offering over 50 FPS at 2048x1536 4xAA, gamers playing HL2 variants can run with any of the high-end GPU solutions without problem. ATI does manage to claim a ~10% performance victory with the X1950 CrossFire over the 7900 GTX SLI, so if the pattern holds in future episodes ATI will be a slightly faster solution. The X1900 CrossFire configuration was also slightly faster than the SLI setup, though for all practical purposes that matchup is a tie.

Comments

  • DerekWilson - Saturday, August 26, 2006 - link

yeah ... i didn't test power with crossfire -- which is a whole lot higher. also, i have a minimal set of components to make it work -- one hdd, one cdrom drive, and no add-in cards other than graphics.

    we'll do multi-gpu power when we look at quadsli
  • ElFenix - Thursday, August 24, 2006 - link

the review states that power consumption was measured at the wall with a kill-a-watt, during a 3Dmark run.

in addition to the water cooling, it could be he's running a more efficient PSU. a powerful system drawing 220 watts from the power supply would draw 275 watts from the wall with an 80% efficient PSU (like a good seasonic) and 314 watts with a 70% efficient PSU. that's a pretty decent difference right there.

    ... still waiting for nvidia's HQ driver run...
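The efficiency arithmetic in the comment above can be sketched in a few lines (an illustrative calculation only; the 220 W DC load is the commenter's figure, not a measurement, and real PSU efficiency varies with load):

```python
# Wall draw = DC-side load / PSU efficiency.
def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    """Power drawn from the wall for a given DC-side load."""
    return dc_load_watts / efficiency

print(round(wall_draw(220, 0.80)))  # 275 W with an 80% efficient PSU
print(round(wall_draw(220, 0.70)))  # 314 W with a 70% efficient PSU
```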
  • poohbear - Thursday, August 24, 2006 - link

    thanks
  • Rock Hydra - Wednesday, August 23, 2006 - link

With those competitively priced parts, hopefully NVIDIA will respond with lower prices.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

I'm not familiar with 1920x1440; did you mean 1920x1200? At what resolution were these tests performed? Thank you!

  • JarredWalton - Wednesday, August 23, 2006 - link

1920x1440 is a standard 4:3 aspect ratio resolution used on many CRTs. It is often included because its performance is fairly close to 1920x1200 performance.
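The aspect-ratio claim above is easy to verify by reducing each resolution to lowest terms (a quick sketch):

```python
from math import gcd

def aspect(width: int, height: int) -> str:
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect(1920, 1440))  # 4:3, the classic CRT shape
print(aspect(1920, 1200))  # 8:5, better known as 16:10
```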
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    Thanks, I've been using my LCD for so long I forgot about the vintage CRT res's out there ;) Plus I never ran that particular res on my CRT when I had one, so I just wasn't familiar.
  • cgaspar - Wednesday, August 23, 2006 - link

    While average frame rates are interesting, I _really_ care about minimum frame rates - 300fps average is useless if at a critical moment in a twitch game the frame rate drops to 10fps for 3 seconds - this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU bound in some portion), but they might not be.
  • JarredWalton - Wednesday, August 23, 2006 - link

A lot of games have instantaneous minimums that are very low due to HDD accesses and such. Oblivion is a good example. Benchmarking also overemphasizes minimum frame rates, as in regular play they occur less frequently. Basically, you run around an area for a longer period of time in actual gaming, as opposed to a 30-90 second benchmark. If there's a couple of seconds at the start of a level where frame rates are low due to the engine caching textures, that doesn't mean as much as continuous low frame rates.

    More information is useful, of course, but it's important to keep things in perspective. :)
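The average-versus-minimum distinction discussed above can be made concrete with frame-time data (the numbers below are invented for illustration, not measured):

```python
# Ten seconds of smooth 60 fps gameplay followed by a 2-second stall
# (e.g. texture caching). Frame times are in seconds.
frame_times = [1 / 60] * 600 + [0.5] * 4

avg_fps = len(frame_times) / sum(frame_times)  # total frames / total time
min_fps = 1 / max(frame_times)                 # slowest single frame

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# average: 50.3 fps, minimum: 2.0 fps -- the stall barely dents the average
```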
  • kmmatney - Wednesday, August 23, 2006 - link

The charts show that the 7900 GT gets a huge boost from being factory overclocked. It would be nice to see if the X1900 XT 256 MB can also be overclocked at all, or if there is any headroom.
