Introduction

If anyone were asked to name the two most anticipated games of this year, no thought would be required before the inevitable reply of "Doom 3 and Half-Life 2". The order may vary depending on the gamer's background and taste, and the occasional "Sims 2" response may be heard from the minority. So far, we have Doom 3 (even The Sims 2 has already gone gold), but Half-Life 2 is still MIA. We keep hearing rumors (though nothing substantial), but even last year's "certain" release wasn't set in stone.

Ever since Valve's Gabe Newell made his incendiary remarks about his programmers' struggles to produce code that ran as fast on NVIDIA's hardware as it did on ATI's (placing the blame for this squarely on NVIDIA's shoulders), and the source code leak that followed, both the buzz over the Half-Life sequel and its delays have grown at what feels like an exponential rate.

Of course, last year, ATI "bundled" Half-Life 2 with its cards, but consumers who made the purchase were left with little in the way of fulfillment of this offer. That is, until now. Last week, Valve pushed out a one-level beta version of the Counter-Strike mod fitted to the Half-Life 2 core over Steam for those customers who had registered their ATI HL2 coupons. Eventually, the game will be released as Counter-Strike: Source, but, for now, the beta version shows off the bells, whistles, and capabilities of the new Source engine that powers HL2.



Light blooming through windows, illuminating dust in the air in Counter-Strike: Source Beta


As an added bonus, Valve included a video card "stress test" in the beta version of CS: Source. Now, with an updated version of the engine, refined drivers, and brand new cards, we take a look at another bit of the story behind the ever-lengthening saga of Half-Life 2.

Source, CS, and Half-Life 2

50 Comments


  • Connoisseur - Thursday, August 26, 2004 - link

    I'm glad to see that they used the 4.8 Catalyst drivers. I was wondering whether you guys can run Doom 3 benchies with the 4.8's as well. With my laptop (M6805 R9600), I saw an incredible performance gain (between 15-30%) in resolutions up to 1024x768 going from 4.7 to 4.8. I was wondering if this was typical.
  • blckgrffn - Thursday, August 26, 2004 - link

    Have you ever played a game at 20*15? If you have, you know it is awesome. Why would I plop $600 on my Sony 21" if I didn't want to use the higher resolutions? Battlefield looks really good :-) I fully support seeing these resolutions in the future; I was really happy when we finally saw the shift away from 1024*768 on review sites.
  • deathwalker - Thursday, August 26, 2004 - link

    #7..I can't necessarily disagree with you...I own a 6800GT (replaced my 9700 Pro)..my point was merely to point out that you can't crown a graphics card line king based solely on its performance in one game. They could do these comparisons all day long and the results will flip-flop back and forth depending on the tool you are using to measure performance.
  • Bumrush99 - Thursday, August 26, 2004 - link

    These numbers are screwed up. The HL2 video stress test IS NOT ACCURATE; nearly every review site has very different results. Don't use this review as your only source of information. On my 6800GT overclocked to Ultra and my AMD 64 @ 2310, I'm getting way lower results. Funny thing is, the first few times I ran the benchmark my results were in line with AnandTech's review...
  • FuryVII - Thursday, August 26, 2004 - link

    #5, I don't think it clearly shows that. For what it's worth, I have owned both nVidia and ATI (ATI currently) and I have no problem buying the best card in price/performance ratio. I'd have to say that nVidia seems to be my choice for a new card. Also, the gap in the benchmarks doesn't show ATI having that great of a lead. It's damned close.

    Also, this was rather disappointing. Those resolutions are just ridiculous.
  • deathwalker - Thursday, August 26, 2004 - link

    This test clearly demonstrates that Nvidia is "not" the performance leader when it comes to gaming. Merely being on top of the heap in one game, "Doom 3", doesn't crown you the king. Especially when that game uses a graphics engine (OpenGL) that is clearly not the engine of choice over the long haul for future game development...that being said..the new line of 6800 cards from Nvidia are products that they should be proud of.
  • AtaStrumf - Thursday, August 26, 2004 - link

    I wish you would include the R9800 Pro (a very good buy right now), since it is not the same as the 9800XT. Just look at the Xbit Labs article and you will see a surprisingly big difference. Not all 9800 Pros take kindly to OC-ing to 9800XT levels, and many people just don't bother!

    And those of you who want to see how other, lower-end cards perform under Source may want to check it out as well.

    http://www.xbitlabs.com/articles/video/display/cou...
  • mlittl3 - Thursday, August 26, 2004 - link

    #2

    The GF4 4400 is not last-generation hardware. It is two generations old. We now have the GF 6xxx, and before that was the GF FX 5xxx. If anyone is still using a GF4 to play current games, I think they know they will be running at resolutions like 800x600 with no AF or AA. No one needs a benchmark test to prove that. Besides, anyone who can't afford a last-generation or this-generation video card probably doesn't have an Athlon 64 FX in their rig. So looking at a GF4 4400 with this processor will tell you nothing about how your GF4, probably paired with a first-generation P4 or Athlon XP, will run this game.

    I'm sure when the actual game comes out we will see exactly what we saw with Doom 3 at Anandtech. A huge CPU and GPU roundup with exhaustive tests. So you will get your chance to see the 9600 in action.
  • DefconFunk - Thursday, August 26, 2004 - link

    This review was kind of disappointing.

    I would have found it much more useful if they'd included at least some ATI cards other than the top range. A 9600 would have been very much appreciated. Same with an FX5700. That way we could have some idea how the less financially able gamer (read: those of us with financial obligations outside our computer) will be able to play CS: Source / HL2.

    I did, however, appreciate the inclusion of the GF4 4400. Knowing how last-gen products run on current games is important when thinking about purchasing either a new card or the new game in question.
  • mikecel79 - Thursday, August 26, 2004 - link

    Great article. Found one little mistake:
    "Kicking a box or rolling an oil down a hill are fun enough to distract players from the game at hand"

    Shouldn't that be "oil drum"?
