Half-Life 2: Episode One Performance

Episode One of the new Half-Life 2 series makes use of recent Source engine updates to include Valve's HDR technology. While other developers have shipped HDR implementations that don't allow antialiasing (even on ATI cards), Valve put a high value on building an HDR implementation that everyone can use with whatever settings they want. Consistency of experience usually isn't a priority for developers focused on pushing the bleeding edge of technology, so we are very happy to see Valve going down this path.

We use the built-in timedemo feature to benchmark the game. Our timedemo consists of a protracted rocket launcher fight and features plenty of debris and pyrotechnics. The Source engine's timedemo feature is more like the nettimedemo of id's Doom 3 engine, in that it plays back more than just the graphics. In fact, Valve includes some fairly intensive diagnostic tools that will reveal almost everything about every object in a scene. We haven't found a good use for this in the context of reviewing computer hardware, but we're keeping our options open.
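
For readers who want to reproduce this kind of testing, the console workflow is simple. The snippet below is a minimal sketch of how a Source timedemo is recorded and played back; the demo name ep1_rockets is a placeholder of our own, not the actual demo file used for this article.

    // in the developer console (launch the game with -console to enable it)
    record ep1_rockets      // start recording a demo during gameplay
    stop                    // stop recording
    timedemo ep1_rockets    // play the demo back and report the average framerate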

The highest visual quality settings possible were used, including the "reflect all" setting, which is not enabled by default. Antialiasing was left disabled for this test, and anisotropic filtering was set to 8x. While the Source engine is known for delivering great framerates on almost any hardware setup, we find the game isn't as enjoyable if it isn't running at at least 30fps. This is very attainable on most cards even at the highest resolution we tested, and thus our target framerate is a little higher in this game than in others.
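
As a rough guide, those settings correspond to console variables along the following lines. This is an illustrative sketch rather than our exact configuration, and cvar defaults can differ between Source builds.

    mat_hdr_level 2                  // full HDR rendering
    r_waterforceexpensive 1          // high quality water
    r_waterforcereflectentities 1    // the "reflect all" option: reflect entities as well as the world
    mat_forceaniso 8                 // 8x anisotropic filtering
    mat_antialias 0                  // multisample antialiasing disabled for this test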

Half-Life 2: Episode One (performance graph, no AA)

Most of the solutions scale similarly in Half-Life 2: Episode One, with the possible exception of the 7900 GTX SLI setup, which hits a bit of an NVIDIA driver-induced CPU limitation at 1280x1024. We can't really complain, as scoring over 200 fps is an accomplishment in itself. With scores like these across the board, there's no reason not to run with AA enabled.

Half-Life 2: Episode One (performance graph, 4xAA)

With even the slowest tested solution offering over 50 fps at 2048x1536 with 4xAA, gamers playing HL2 variants can run any of the high-end GPU solutions without a problem. ATI does manage to claim a roughly 10% performance victory with the X1950 CrossFire over the 7900 GTX SLI, so if that pattern holds in future episodes ATI will remain the slightly faster solution. The X1900 CrossFire configuration was also slightly faster than the SLI setup, though for all practical purposes that matchup is a tie.

Comments

  • SixtyFo - Friday, September 15, 2006 - link

    So do they still use a dongle between the cards? If you had 2 CrossFire cards then it wouldn't be connecting to a DVI port. Is there an adapter? I guess what I'm asking is: are you REALLY sure I can run 2 CrossFire Edition X1950s together? I'm about to drop a grand on video cards, so that piece of info may come in handy.
  • unclebud - Friday, September 1, 2006 - link

    "And 10Mhz beyond the X1600 XT is barely enough to warrant a different pair of letters following the model number, let alone a whole new series starting with the X1650 Pro."

    nvidia has been doing it for years with the 4mx/5200/6200/7300/whatever and nobody here said boo!
    hm.
  • SonicIce - Thursday, August 24, 2006 - link

    How can a whole X1900XTX system use only 267 watts? So a 300w power supply could handle the system?
  • DerekWilson - Saturday, August 26, 2006 - link

    Generally you need something bigger than a 300W PSU, because the main problem is that the current supply on both 12V rails must be fairly high.
  • Trisped - Thursday, August 24, 2006 - link

    The CrossFire card is not the same as the normal one. The normal card also has the extra video-out options. So there is a reason to buy one to team up with the other, but only if you need to output to composite, S-Video, or component.
  • JarredWalton - Thursday, August 24, 2006 - link

    See discussion above under the topic "well..."
  • bob4432 - Thursday, August 24, 2006 - link

    why is the x1800xt left out of just about every comparison I have read? for the price you really can't beat it....
  • araczynski - Thursday, August 24, 2006 - link

    ...I haven't read the article, but I did want to just make a comment...

    having just scored a brand new 7900gtx for $330 shipped, it feels good to be able to see the headlines for articles like this, ignore them, and think "...whew, I won't have to read any more of these until the second generation of DX10 cards comes out..."

    I'm guessing nvidia will skip the 8000s and 9000s and go straight for the 10,000s, to signal the DX10 and 'uber' (in hype) improvements.

    either way, it's nice to get out of the rat race for a few years.
  • MrJim - Thursday, August 24, 2006 - link

    Why no anisotropic filtering tests? Or am I blind?
  • DerekWilson - Saturday, August 26, 2006 - link

    Yes, all tests are performed with at least 8xAF. In games that don't allow selection of a specific degree of AF, we choose the highest quality texture filtering option (as in BF2, for instance).

    AF comes at fairly little cost these days, and it just doesn't make sense not to turn on at least 8x. I wouldn't personally want to go any higher without angle-independent AF (like the high quality AF offered on ATI X1K cards).
