Introduction

If anyone were asked to name the two most anticipated games of this year, no thought would be required before putting forth the inevitable reply of "Doom 3 and Halflife 2". Order may vary depending on the gamer's background and taste, and the occasional "Sims 2" response may be heard in the minority. So far, we have Doom 3 (and even The Sims 2 has already gone gold), but Halflife 2 is still MIA. We keep hearing rumors (though nothing substantial), and even last year's supposedly certain release never materialized.

Ever since the incendiary remarks made by Valve's Gabe Newell about the inability of Valve's programmers to come up with code that ran as fast on NVIDIA's hardware as it did on ATI's (placing the blame squarely on NVIDIA's shoulders), and the source code leak that followed, both the buzz over the Halflife sequel and its delay have grown in what feels like an exponential manner.

Of course, last year, ATI "bundled" Halflife 2 with its cards, but consumers who made the purchase were left with little in the way of fulfillment of this offer. That is, until now. Last week, Valve pushed out a one-level beta version of the Counterstrike mod fitted to the Halflife 2 core over Steam for those customers who had registered their ATI HL2 coupons. Eventually, the game will be released as Counterstrike: Source, but, for now, the beta version shows off the bells, whistles, and capabilities of the new Source engine that powers HL2.


Light blooming through windows, illuminating dust in the air in Counterstrike: Source Beta

As an added bonus, Valve included a video card "stress test" in the beta version of CS: Source. Now, with an updated version of the engine, refined drivers, and brand new cards, we take a look at another bit of the story behind the ever-lengthening saga of Halflife 2.

Source, CS, and Halflife 2
50 Comments

  • DerekWilson - Thursday, August 26, 2004 - link

    #18 If there are any typos, please point them out and they will be corrected. We have already fixed the problem our first commenter pointed out.

    #19 We have Extreme cards from a couple of different manufacturers. We also have Platinum cards from a couple of different manufacturers. We wouldn't still be testing these cards if all we had were NV and ATI reference samples.
    Reply
  • Drayvn - Thursday, August 26, 2004 - link

    Umm, I don't know why, but there are no official Ultra Extremes, only overclocked ones. nVidia has stated that they told the add-on manufacturers they can indeed overclock their cards, but they cannot call it the Ultra Extreme. So I don't know why you have that card in there, as there are none. If you had called it an Ultra OC, that would have been fine, because it seems there will never be an Ultra Extreme. Reply
  • esun - Thursday, August 26, 2004 - link

    Regardless of the quality of the article and benchmarks and whatnot, it seems like there are a lot of typos in this article (just takes away from its credibility and professionalism IMO). Reply
  • DerekWilson - Thursday, August 26, 2004 - link

    #15

    Sorry, the video stress test does not run with any sound. It actually does (as much as possible) what it says -- it focuses on video performance.
    Reply
  • Jalf - Thursday, August 26, 2004 - link

    Shame about the X800 pro. Would be interesting to be able to compare it to the GT at high-res. Would be interesting to see how much the GT benefits from having all 16 pipelines at the high-res scenario... (Or how much it loses)

    In either case, I disagree with #5.
    It doesn't clearly show that NVidia isn't the performance leader.
    On the other hand, it shows that NVidia isn't clearly the performance leader. :P

    Performance-wise, I'd call it a tie for now. They're both damn fast as far as I'm concerned. ATI are working on improving their Doom 3 performance, and I have a hunch NVidia are going to put some more effort into their HL2 performance now.

    Anyway, to those wanting to see a mid-range card, you've got the 4400. You should be able to extrapolate from that.
    Reply
  • ir0nw0lf - Thursday, August 26, 2004 - link

    Was the sound turned on or off during these tests? There is no mention that I could find of that, perhaps I missed it being mentioned? Reply
  • thelanx - Thursday, August 26, 2004 - link

    Granted these aren't real game benchmarks, but we can extrapolate and estimate like the article said. HL2 will probably be more CPU intensive and less graphically intensive. These benchmarks will cheer up many people, I think. Those with high end cards will be happy that whatever they chose to buy will run HL2 great, and mid-range card owners will be happy that their cards should run HL2 very well. The real game will probably be less graphically demanding but more CPU intensive, so my 9700pro with my 2.5GHz A64 will probably run the game better than the graphics stress test, especially at 10x7, my lcd's native resolution. Reply
  • Zephyr106 - Thursday, August 26, 2004 - link

    I agree completely with #2. Benchmark it on some of the midrange cards. And a $400 6800GT isn't midrange. Especially since Valve has said they hope the game will be scalable for slower hardware, and a lot of those 9600 Pro/XT owners have HL2 vouchers, and I'm sure not all have upgraded. Reply
  • Avalon - Thursday, August 26, 2004 - link

    Can't exactly call ATI the performance winner here. The X800 XT PE is often only a few frames better than the 6800 UE, and the GT is often a few frames better than the X800 pro. Seems closer to a tie than one side actually performing better. Regardless, by using a little observation, it seems like my 9700 pro will be able to run the game just fine at 10x7, and I might even have room for a bit of eye candy :) Reply
  • PsharkJF - Thursday, August 26, 2004 - link

    Why would you even need to run HL2 at 20x15? lol.
    10x7 or 12x10 is fine for me, and it looks like my old GF4Ti4200 can run it well enough.
    Reply