Half Life 2 Performance

by Anand Lal Shimpi on October 1, 2003 11:48 PM EST
So of course the one thing everyone asked for was the one thing I couldn't deliver in the first part of the Fall 2003 Video Card roundup - Half Life 2 performance. There were also requests for Doom3 performance, and believe me, if we had the benchmarks we would definitely have included them. As you can guess, Valve has not released the Half Life 2 benchmark as originally expected, and thus we don't have updated Half Life 2 numbers for you. The conclusion of the recent article did reference Doom3 and Half Life 2 performance, however, and believe me, I wasn't just pulling numbers and thoughts out of the air - I've got some basis for what I've said.

Here are some Half Life 2 numbers for you to look at. They were provided by a reliable source, but I could not verify anything myself, so take them with a grain of salt. ATI was run on its DX9 codepath, while the mixed mode codepath was used for NVIDIA. No AA/AF was enabled, and we're looking at 1024x768 scores:

Half Life 2 Demo    Radeon 9800XT    GeForce FX 5950 Ultra
e3_bugbait          71.4             73.9
e3_c17_01           57.6             57.5
e3_c17_02           49.9             49.3
e3_phystown         74.5             77.2
e3_seafloor         53.9             53.3
e3_techdemo_5       83.5             64.5
e3_techdemo_6       76.9             71.7
e3_under_02         77.3             71.1

If those numbers hold true, things definitely look better than they did on Half Life 2 day, but we'll reserve judgment until we get the benchmark in house. I just thought you'd like to see what we're seeing; I wouldn't draw any conclusions based on this data yet, just wanted to share :)
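For a rough sense of the aggregate picture, here is a quick sketch that averages the eight (unverified) per-demo scores above. The demo names and numbers are taken straight from the table; the averaging itself is just illustrative arithmetic, not anything Valve or the source provided.

```python
# Unverified per-demo scores from the table above (1024x768, no AA/AF).
# Tuple order: (Radeon 9800XT, GeForce FX 5950 Ultra)
scores = {
    "e3_bugbait":    (71.4, 73.9),
    "e3_c17_01":     (57.6, 57.5),
    "e3_c17_02":     (49.9, 49.3),
    "e3_phystown":   (74.5, 77.2),
    "e3_seafloor":   (53.9, 53.3),
    "e3_techdemo_5": (83.5, 64.5),
    "e3_techdemo_6": (76.9, 71.7),
    "e3_under_02":   (77.3, 71.1),
}

# Simple arithmetic mean across the eight demos for each card.
ati_avg = sum(a for a, _ in scores.values()) / len(scores)
nv_avg = sum(n for _, n in scores.values()) / len(scores)

print(f"Radeon 9800XT average:         {ati_avg:.1f} fps")  # 68.1
print(f"GeForce FX 5950 Ultra average: {nv_avg:.1f} fps")   # 64.8
```

By this crude measure the two cards are within a few frames per second of each other overall, with the gap driven mostly by the e3_techdemo_5 result.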

Tomorrow is my 8AM day again, and maybe I'll get those tests back that I took on Monday (hopefully not...I'd like tomorrow to be a good day :)...). I'm off to sleep, have a good night everyone. Derek and I will be back to work on Part II tomorrow; I'll update you as soon as I can.

102 Comments

  • Anonymous - Thursday, October 2, 2003 - link

It's amazing how the nvidia fanboys say something to bash someone when the other party is just stating facts. Go get a real card (ATI), nvidia is cheating crap
  • Anonymous - Thursday, October 2, 2003 - link

It's amazing how blatant and ridiculous some of the ATI fanboys are. Every time something shows nVidia in a good light, it's immediately "cheating," sometimes without a single shred of evidence. It is indeed possible that GeForce cards can perform well, much to the chagrin of fanATIcs. And these numbers aren't even verified, so there's no point in getting feathers ruffled over nothing.
  • Anonymous - Thursday, October 2, 2003 - link

Hey, if the optimizations/cheats they made in the drivers make HL2 all that much faster, and the IQ differences are negligible or you need to blow up a single frame to 400% to notice, then kudos to nvidia. Conversely, if the IQ differences are drastic enough to notice during actual gameplay, then I'm not interested.

    I swear you fanatics could drive a guy to nvidia just on grounds of being so irritating....
  • UnI - Thursday, October 2, 2003 - link

Ffs, everyone is already jumping to conclusions. Just f*cking wait till both cards are for sale and tested with the official HL2 benchmark before everyone starts claiming that Nvidia is cheating. As for the 51.75 drivers: it's a BETA, what did everyone expect? That it would be perfect the first time? As far as I know there are no perfect drivers. Yes, I own a Nvidia card, and no, I am not a fanboy (before some stupid flame war begins about fanboys). I am thinking of purchasing a Radeon card in the near future (R420) or staying with Nvidia (NV40).

Anand did a fine job publishing this news. I looked at this "bench" with interest and can't wait to see some official figures. Maybe Nvidia did find a way to improve performance without losing IQ; only time will tell.
  • UlricT - Thursday, October 2, 2003 - link

    WHY THE HELL IS EVERYONE FLAMING ANAND?

    All he did was put up some preliminary results from a source HE considers reliable. People visiting Anandtech consider Anand to be a reliable person. If he was to jump ship (or sell his soul..) to the marketing side, he would have done so a LoNG time ago!

WAiT for official benchmarks before making any judgment calls...

Anand, PleAsE don't be taken aback by these people's comments. I, for one, really hope that you will keep posting stuff like this! Get us on the inside of what is going on!!! Thanks... :)
  • Anonymous - Thursday, October 2, 2003 - link

.... do you feel the fresh air in your face?!? Do you see birds, dogs, people in the streets?? No?!? Ok... turn off your comp and get a life.... please... there's a lot of things to see and feel.... get a life....
  • Andy - Thursday, October 2, 2003 - link

    I would say calling DX9 a standard is a bit of a stretch. It's not as if it was defined by some independent standard body, like say the OpenGL ARB. Let's call it what it is - a Microsoft specification.

And it seems rather obvious that Microsoft has been somewhat concerned of late with Nvidia's dominance in the graphics industry. Nvidia had the audacity to refuse to lower the already slim margin on the XGPU, despite repeated aggressive demands from Microsoft. Aside from that, Nvidia has consistently pushed a multi-OS strategy and OpenGL as an alternative API. This obstinacy clearly is intolerable.

    Could it be a coincidence that MS gave their seal of approval to Nvidia's struggling competitor and at the same time left Nvidia high and dry with their ambitious 32-bit architecture?

    I would say the surprise is not that Nvidia is behind. It's how well they manage to keep up. And it remains to be seen how much of a tradeoff the mixed precision path will be.
  • Bay - Thursday, October 2, 2003 - link

    Um, relax folks. "...take them with a grain of salt." ATI will always run at higher quality anyhow, so no big deal. The ATI folks will be happy, and the nVidia folks will be happy because they can at least RUN the game at acceptable frame rates at the expense of visual quality. Oh Well.
  • Jahara - Thursday, October 2, 2003 - link

    "superior and flexible architecture"?

They didn't change anything in the hardware; they had to make the software use a different hardware path that doesn't run at the correct DX9 specifications.

    Do you have any clue what you're talking about?

Nvidia just has a different architecture that doesn't run at the standard DX9 specs, thus it is an inferior card and isn't in any way "superior and flexible" since it isn't standard.

It's like comparing a rocket engine to a car motor. Sure, it'll generate more thrust, but you can't just stick it on wheels and call it an efficient car. You'd need to change the road layout completely, and that still wouldn't make it a car.
  • Reality - Thursday, October 2, 2003 - link

    Well, at least the game will run at playable framerates on Nvidia hardware now. Too bad the game will look very sub-par compared to ATi's offerings.

    Glad I bought my 9800 Non Pro (Modified to Pro) for $250, instead of paying $300+ for a 5900 Non Ultra. Hahah.
