It's now been two months since Half Life 2's release and, much to everyone's surprise, the game was far from a GPU hog. The more powerful your graphics card, the better everything looked and the smoother everything ran, but even people with GeForce4 MX cards were able to enjoy Valve's long-awaited masterpiece.

Immediately upon its release we looked closely at the impacts of GPUs on Half Life 2 performance in Parts I and II of our Half Life 2 coverage. Part I focused on the performance of High End DirectX 9 class GPUs, while Part II focused on mid-range GPUs as well as the previous generation of DirectX 8 class GPUs.

The one area we had not covered up to this point was the impact of the CPU on Half Life 2 performance. In a 3D game, the CPU is mainly responsible for the physics of the environment as well as the artificial intelligence of the game's NPCs. There is also a good deal of graphics driver overhead that taxes the CPU, so more complicated games depend more heavily on fast CPUs.

Half Life 2 was an intriguing case in itself simply because the game boasted the most sophisticated physics engine seen in a game to date. Elements of the game such as the gravity gun would prove to be extremely taxing on the CPU. In fact, we found that even the fastest $500+ video cards can still be CPU bound in Half Life 2 at normally GPU-limited resolutions.

Although much delayed, today we are able to bring you the third and final part of our Half Life 2 coverage focusing entirely on CPU performance as it relates to graphics performance in Half Life 2. After all, a $500 graphics card is worthless if it is bound by a slow CPU.

All of the tests in this article use the same test beds and testing methodology as our first two Half Life 2 articles. You can download all of the demos used in this article here.

We apologize for the delay in the publication of this article, but as is often the case, we get busy and things such as this article get postponed again and again. Rather than shelve it, we decided to publish it - better late than never. Now on to the benchmarks...

AMD vs. Intel Performance


Comments

  • PrinceGaz - Friday, January 28, 2005 - link

    By the way, you can't mod a 9500Pro into a 9700Pro; the 9500Pro circuit board only has a 128-bit memory bus and there's no way you can change it to 256-bit.
  • PrinceGaz - Friday, January 28, 2005 - link

    #55 - the cut-off point, where you can't really tell the difference from a higher framerate, is when the framerate exceeds the monitor's refresh rate.

    If your monitor is updating the display 85 times per second (a common setting for cheaper CRT displays), then a *minimum* framerate higher than 85fps makes no difference. With flat panels, refresh rates of 60-75Hz are more common, so you don't even need to maintain 85fps. A faster graphics card is still worthwhile, though, as it allows you to crank up AA and aniso settings (8x anti-aliasing is lovely).
  • maestroH - Friday, January 28, 2005 - link

    Coming from the recent dark age of a P4 1.7/9500Pro(@9700Pro) combi, this is a great article to help decide on my new machine. Although HL2 is playing quite nicely, I am lucky to have a one-off opportunity to buy myself an FX-55/X800XT combi.
    Never having experienced any fps close to what's on these charts, something in the back of my mind keeps saying that 10-20 fps more, when you are already over 100 fps, will make no difference to the experience except a bigger hole in my wallet. Can anyone tell me where the 'cut-off' point is where even the most discerning of gamers cannot see/feel the difference? Knowing that dual core is coming up (even though games for it still need to be made), would buying a 3500+ be smarter, or should I go for the FX-55 simply because I can (only this once)? Thx Anand for the article.
  • essjae - Thursday, January 27, 2005 - link

    Those graphs look nice, but they don't really mean much. Based on similar graphs and results, I just bought an A64 4000+ and MSI Neo2 Platinum to replace my P4 3.2GHz and Asus P4C800-E.

    With the same ATI X800XT Platinum, memory, and hard drives, I can't see any difference between them; in fact, the P4 seemed to play smoother.

    Do I have any proof? No, other than that playing Half-Life 2 on my P4 was more enjoyable.
  • Spacecomber - Thursday, January 27, 2005 - link

    Can anyone offer some insight into why the Extreme Edition Northwoods did as poorly as they did? The 3.2GHz EE could barely keep up with the 3.0GHz Prescott, so there's more at work than the raw clock speed of the high-end Prescotts.

    Could it be related to running the Northwood on a platform really intended for Prescotts?

  • mixpix - Thursday, January 27, 2005 - link

    Awesome article. It was exactly what I've been looking for. My 2600+ is not cutting it with my 6600GT AGP, and I was thinking it was the CPU that was limiting performance.
  • TheCanuck - Thursday, January 27, 2005 - link

    Firing Squad did a review on the Athlon XP performance with HL2 a while ago:

    The 3200+ XP got about 95fps in the Canals_09 demo at 1280x1024 with an X800 XT PE. Not sure how well it compares to the Canals_08 demo that AnandTech uses, but I doubt the difference would be that great.
  • Guspaz - Thursday, January 27, 2005 - link

    I'm very disappointed with this article. I have been eagerly awaiting it for ages, expecting to see how Half-Life 2 scales down to lower-speed processors; I've long maintained that low-end processors like an Athlon XP 1900+ run the game quite poorly.

    The big deal with HL2 was that it was supposed to run on much older computers. But nobody seems to have benchmarked it on anything but pretty new hardware. I expected that a CPU scaling article would cover that; in fact, I thought that was the entire point.
  • Visual - Thursday, January 27, 2005 - link

    I'd be curious to see a normal 6600, as well as maybe some lower-end ATI cards, in the comparison :)
    OK, OK, I know this isn't a GPU shootout, but still...
