Introduction

If anyone were asked to name the two most anticipated games of this year, no thought would be required before putting forth the inevitable reply: "Doom 3 and Half-Life 2". The order may vary depending on the gamer's background and taste, and the occasional "Sims 2" response may be heard from the minority. So far, we have Doom 3 (even The Sims 2 has already gone gold), but Half-Life 2 is still MIA. We keep hearing rumors (though nothing substantial), and even last year's supposedly certain release date turned out not to be set in stone.

Ever since the incendiary remarks made by Valve's Gabe Newell about the inability of Valve's programmers to come up with code that ran as fast on NVIDIA's hardware as it did on ATI's (remarks that put the blame squarely on NVIDIA's shoulders), and the source code leak that followed, both the buzz over the Half-Life sequel and the length of its delay have grown at what feels like an exponential rate.

Of course, last year, ATI "bundled" Half-Life 2 with its cards, but consumers who made the purchase were left with little in the way of fulfillment of that offer. That is, until now. Last week, Valve pushed a one-level beta version of the Counter-Strike mod, fitted to the Half-Life 2 core, out over Steam for those customers who had registered their ATI HL2 coupons. Eventually, the game will be released as Counter-Strike: Source, but, for now, the beta shows off the bells, whistles, and capabilities of the new Source engine that powers HL2.



Light blooming through windows, illuminating dust in the air in Counter-Strike: Source Beta


As an added bonus, Valve included a video card "stress test" in the beta version of CS: Source. Now, with an updated version of the engine, refined drivers, and brand-new cards, we take a look at another chapter in the ever-lengthening saga of Half-Life 2.

Comments

  • DerekWilson - Friday, August 27, 2004 - link

    We run with the default configuration -- trilinear opt on, aniso opt off ... this probably accounts for the issues.

    There isn't a config you can set to make NV and ATI do the exact same thing, unfortunately. Also, most people run default settings when it comes to opts (AFAIK).
  • Tobyus - Thursday, August 26, 2004 - link

    Derek, I may have missed it in the article, but did you say whether or not you enabled Trilinear and Anisotropic Optimizations? Also, I didn't see whether you ran with vsync off, but I am guessing you did, since vsync causes about a 10 fps performance loss on my system.

    I ran the benchmarks with the system below (yes, the drivers are beta, and I had Anisotropic and Trilinear optimizations enabled; I also ran the test with the 61.77 drivers at 1600x1200 with 4xAA, 8xAF, and the highest detail settings, including Water: Reflect All, and I was getting 52 fps).

    Athlon 64 3000+
    MSI K8T800
    1GB OCZ PC3200
    Geforce 6800 GT
    Windows XP Pro SP1
    DX9.0c
    Forceware 65.62

    My tests were all run with the highest settings in the advanced options, except Reflect World/Reflect All, which I specify for each resolution in the table below. These tests were also run with 4xAA and 8xAF.

    Resolution    Reflect World (fps)    Reflect All (fps)
    800x600       126.88                 114.45
    1024x768      113.25                 102.98
    1280x960      88.95                  83.25
    1600x1200     55.61                  53.21
    2048x1536     31.30                  29.98

    I don't understand why I had better performance than your system, Derek. I have nothing overclocked, and the only settings I can think of that I have enabled that you may not have are the optimizations. Is it true that ATI cards default to running with optimizations that cannot be disabled? If so, I would think it fair to enable optimizations on the NVIDIA cards as well, and that may show a nice improvement and a closer race between the two brands of cards.
  • SirDude - Thursday, August 26, 2004 - link

    "As a student at North Carolina State University, Derek Wilson [B]double majored[/B] in both [B]Electrical and Computer Engineering[/B]. After graduating, Derek brought his extensive Engineering background to work with the AnandTech team. Derek's specializations include [B]compiler theory and design[/B], giving him [B]a unique understanding of microprocessor architecture and optimization[/B]. He has also done [B]extensive work[/B] in the [B]3D field[/B], having [B]designed and implemented[/B] his own [B]3D rendering engine[/B] as well as having done much [B]programming for modern console platforms[/B]. Derek's hands-on experience in the realm of 3D graphics gives him a unique eye in his coverage of the PC graphics industry."

    #35, you know better than this guy, I suppose? Here's an idea for ya: why don't you shut up and go away, you Troll.
  • thelanx - Thursday, August 26, 2004 - link

    I just read the rest of the article. Reading past the typos to the conclusion, it appears that, even with the console commands, the FX series is still running the DX 8.0/8.1 path, even if you try to force the DX9 path. Thus, AT is justified in not including the FX 5950 in its review.

    #35, looks like you could benefit from some homework too; perhaps read the article you posted more carefully. ;) Next time, give constructive criticism, but try not to be so harsh. You aren't the only one guilty of harshness; intellectual discussion and debate are great, but many of the discussions on the net would be better with more cool heads. :)
  • thelanx - Thursday, August 26, 2004 - link

    #35, as I recall, and as the article you posted confirms, the 6800 series does not automatically run the benchmark in DX8; only the FX series and below do.
  • Ballistics - Thursday, August 26, 2004 - link

    If you guys had done your homework before posting this article, you would have been informed, and you could have accurately informed us.

    Don't know a good way to benchmark CS: Source? Don't know how to force the hardware to use DX 9.0 (see the mat_dxlevel sketch after the comments)? Didn't mention that all NVIDIA cards are forced to use DX 8.1 while ATI trudges away at DX 9.0 and coincidentally falls behind?

    Here's a link to the article: http://www.firingsquad.com/hardware/half_life_2_fx...

    Educate yourselves.
  • yanon - Thursday, August 26, 2004 - link

    In the future, Anandtech should do at least two benchmarks--one for the extreme gamer and one for the average gamer.

    Right now, the extreme gamers probably have an AMD FX-53, a Raptor drive, 2 GB worth of elite, super-overclocked RAM, and a GeForce 6800 GT/Ultra.

    The average gamers probably have something close to an AMD XP 2500+, any 7200 RPM hard drive with 8 MB of cache, 512 MB to 1 GB worth of value DDR RAM, and a GeForce 5700/ATI 9600/ATI 9800.
  • yanon - Thursday, August 26, 2004 - link

    The sentiment is clear. People want to see a benchmark score for a setup that includes an AMD XP 2500+, an ATI 9800 Pro, and 512 MB of PC3200 DDR RAM.
  • flashbacck - Thursday, August 26, 2004 - link

    Can you guys post results for more midrange hardware? Not everyone has a Geforce 6800 XT, Radeon X800 SuperMegaUltraProPlatinumSpecialLimitedEdition or Athlon 64 50000+.
  • Cygni - Thursday, August 26, 2004 - link

    I like the way people are bitching about typos on a site that's offering FREE articles to the public. Jeez.

    And oh, I don't really care what OTHER sites are getting on these tests. If you have been around the net, you know the likelihood of AnandTech being wrong is pretty close to nil. This ain't Tom's.
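Several comments above turn on which rendering path a card actually gets. The Source engine exposes this through its mat_dxlevel console variable; the lines below are a minimal sketch of how a reader could check or request a path, assuming the CS: Source beta honors the same cvar and launch option as other Source engine builds (the beta's exact behavior is not confirmed here):

    mat_dxlevel         (typed alone at the developer console, prints the DirectX level currently in use, e.g. 81 or 90)
    mat_dxlevel 90      (requests the full DX9 path; per the comments above, FX-series cards may be kept on DX 8.1 regardless)

The same request can be made at startup by adding -dxlevel 90 to the game's launch options in Steam.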
