Half-Life 2: Episode One Performance

Episode One of the new Half-Life 2 series makes use of recent Source engine updates to include Valve's HDR technology. While other developers have shipped HDR implementations that can't be combined with antialiasing (even on ATI cards), Valve put a high value on building an HDR implementation that everyone can use with whatever settings they want. Consistency of experience is rarely a priority for developers intent on pushing the bleeding edge of technology, so we are very happy to see Valve going down this path.

We use the built-in timedemo feature to benchmark the game. Our timedemo consists of a protracted rocket launcher fight and features plenty of debris and pyrotechnics. The Source engine's timedemo feature is more like the nettimedemo of id's Doom 3 engine, in that it plays back more than just the graphics. In fact, Valve includes some fairly intensive diagnostic tools that will reveal almost everything about every object in a scene. We haven't found a good use for this in the context of reviewing computer hardware, but we are keeping our options open.

The highest possible visual quality settings were used, including the "reflect all" setting (which is not enabled by default), and anisotropic filtering was set to 8x. While the Source engine is known for delivering great framerates on almost any hardware setup, we find the game isn't as enjoyable unless it is running at a minimum of 30fps. This is attainable on most cards even at the highest resolution we tested, and thus our target framerate is a little higher in this game than in others.
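For readers who want to sanity check their own runs against our 30fps target, here is a rough sketch of how timedemo output could be post-processed. It is only illustrative: the log file name and the exact format of the results line in the console dump are assumptions, so adjust the pattern to match what your console output actually contains.

```python
import re
import sys

# Hypothetical console dump saved after running a timedemo; the file name
# and the results-line format below are assumptions for illustration only.
LOG_PATH = "condump000.txt"
TARGET_FPS = 30.0  # the playability floor we use for this game

# Assumed format of the results line, e.g. "2639 frames 21.45 seconds 123.03 fps"
RESULT_RE = re.compile(r"(\d+)\s+frames\s+([\d.]+)\s+seconds\s+([\d.]+)\s+fps")

def parse_timedemo(path: str) -> float:
    """Return the average fps reported in the console dump."""
    with open(path, "r", encoding="utf-8", errors="ignore") as log:
        for line in log:
            match = RESULT_RE.search(line)
            if match:
                return float(match.group(3))
    raise ValueError(f"no timedemo results line found in {path}")

if __name__ == "__main__":
    fps = parse_timedemo(sys.argv[1] if len(sys.argv) > 1 else LOG_PATH)
    verdict = "meets" if fps >= TARGET_FPS else "falls below"
    print(f"Average: {fps:.1f} fps ({verdict} the {TARGET_FPS:.0f} fps target)")
```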

Half-Life 2: Episode One

Showing about a 10% performance advantage over the 7900 GS, the X1950 Pro delivers a good level of performance under Half-Life 2: Episode One without AA enabled. Combine that with the fact that CrossFire delivers about an 80% performance improvement compared to SLI's 66%, and we have a clear winner in the multi-GPU department here.
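For reference, the scaling figures quoted above are simply the dual-card result expressed as a percentage gain over the single-card result. The sketch below shows that arithmetic with made-up framerates; the numbers are hypothetical and not our measured data.

```python
def scaling_gain(single_fps: float, dual_fps: float) -> float:
    """Percentage improvement of a dual-GPU result over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical framerates chosen only to illustrate ~80% and ~66% scaling.
print(f"CrossFire gain: {scaling_gain(60.0, 108.0):.0f}%")   # -> 80%
print(f"SLI gain:       {scaling_gain(66.0, 109.6):.0f}%")   # -> 66%
```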



Half-Life 2: Episode One

Enabling 4xAA under HL2:EP1 narrows the gap between ATI and NVIDIA at the $200 mark, but still leaves ATI in the lead. The same is true in the multi-GPU arena.

 

Comments

  • Zoomer - Thursday, October 19, 2006 - link

    Is this an optical shrink to 80nm?

    Answering this question will put overclocking expectations in line. Generally, optically shrunk cores from TSMC overclock to about the same level as the original, or perhaps slightly worse.
  • coldpower27 - Friday, October 20, 2006 - link

    Well, no, as this pipeline configuration didn't exist natively on the 90nm node. It's a 3-quad part, so it's based on R580 but with 1 quad physically removed as well as being shrunk to 80nm. Not to mention native CrossFire support was added onto the die.
  • Spoelie - Friday, October 20, 2006 - link

    Optical shrink, this is 80nm and the original was 90nm. You're normally correct because the first optical shrink usually doesn't have the same technologies as the process above it (low-k and SOI, for example; this was the case with 130nm -> 110nm), but I don't think that applies to this generation. Regardless, I haven't seen any overclocking articles on it yet, so I'm quite curious.
  • Spoelie - Friday, October 20, 2006 - link

    oie, maybe I should add that it's reworked as well, so both, actually. Since this core didn't exist before (RV570 and that pipeline configuration), I don't think they just sliced off part of the core...
  • Zstream - Tuesday, October 17, 2006 - link

    Beyond3D reported the spec change a month before anyone received the card. I think you need to do some fact checking on your opinions, mate.

    All in all, a decent review, but poorly informed opinions…
  • DerekWilson - Wednesday, October 18, 2006 - link

    Just because ATI made the spec change public does not mean it is alright to change the specs of a product that has been shipping for 4 months.

    X1900 GT has been available since May 9 as a 575/1200 part.

    The message we want to send isn't that ATI is trying to hide something, it's that they shouldn't have done this in the first place.

    No matter how many times a company says it changed the specs of a product, when people search for reviews they're going to see plenty that have been written since May talking about the original X1900 GT.

    Naming is already ambiguous enough. I stand by my opinion that having multiple versions of a product with the exact same name is a bad thing.

    I'm sorry if I wasn't clear on this in the article. Please let me know if there's anything I can reword to help get my point across.
  • Zoomer - Thursday, October 19, 2006 - link

    This is very common. Many vendors in the past have passed off 8500s that ran at 250/250 instead of the stock 275/275 without labeling them as such.

    There are some Asus SKUs with the same handicap, but I can't recall which models they were.
  • xsilver - Tuesday, October 17, 2006 - link

    any word on what the new price for the x1900gt's will be now that the x1950pros are out?
    or are they being phased out and no price drop is being considered?
  • Wellsoul2 - Monday, November 6, 2006 - link

    You guys are such cheerleaders..

    For a single card buy why would you get this?
    Why would you buy the 1900GT even after the
    1900XT 256MB came out?

    I got my 1900XT 256MB for $240 shipped..

    Except for power consumption it's a much better card.
    You get to run Oblivion great with one card.

    Two cards is such a scam. More expensive motherboard..power consumption etc.
    This is progress? CPU's have evolved..
    It's hard to even find a motherboard with 3 PCI slots..
    What a scam! Where's my ultra-fast HDTV board for PCI Express?
    Seriously..Why buy into SLI/Crossfire? Why not 2 GPU's on one card?
    Too late..You all bought into it.

    Sorry I am just so sick of the praise for this money-grab of SLI/Crossfire.

  • jcromano - Tuesday, October 17, 2006 - link

    Are the power consumption numbers (98W idle, 181W load) for just the graphics card or are they total system power?

    Thanks in advance,
    Jim
