Final Words

With the low cost of CPUs these days and with many affordable GPUs on the market, building a system capable of playing Half-Life 2: Episode Two just isn't that difficult.

On the CPU side we were caught off guard by exactly how much cache size impacted performance in Episode Two, rendering the Core 2 Duo E4000 and Pentium E2000 series processors much slower than their competition.

AMD was also far more competitive than expected, most likely as a result of the Source engine's dependence on low-latency memory accesses. While Intel continues to hold the performance crown, at the $133 and lower price points AMD's processors actually end up being the better ones to have. If and when Phenom reaches those price points, AMD could end up being significantly more competitive than it has been since the launch of Core 2.

Given the performance impact we've seen from faster FSBs and larger caches, however, Intel's 45nm Penryn core should do a good job of fixing lower-end performance once it makes its way down to lower price points as well. It also remains to be seen how much of the cache sensitivity we saw here today will translate to other upcoming games, such as today's Unreal Engine 3-based UT3 demo.

While NVIDIA offers the only solutions for those who wish to run Episode Two with all features enabled at 2560x1600 with 4xAA, the 2900 XT does outperform the 8800 GTS at the $400 price point. The 8800 GTS 320MB is once again a huge value for the money, as it performs almost identically to the 8800 GTS 640MB part (with the exception of anything above 1920x1200 with 4xAA, which handicaps the lower-memory card).

As we mentioned, almost anything can play Episode Two, but if you want high quality at 1280x1024, you'll need at least the equivalent performance of a modern $100+ graphics card. Serious (and even casual) PC gamers will very likely already have something that meets this requirement. Clearly this is no Crysis, but at the same time we applaud Valve's efforts to keep its engine up to date.

Comments

  • Trixanity - Saturday, October 13, 2007 - link

    I kinda agree with that. I would like to see some tests with more standard specs than just top-end. You should use top-end parts when comparing new hardware against each other (AMD vs Intel and NVIDIA vs AMD), but when testing a game, you could also include older hardware, such as DX9 cards like the X1900 XTX and 7800 GTX or 7900 GTX (and GT).
    That would be more helpful to the average gamer, as not everyone has the money to invest in the top end each month. About the resolution: you mostly test at resolutions that are out of reach for most, but this time I think you did include 1024x768 for CPU testing, right? I think you should do 1024x768 and 1280x1024 (or widescreen resolutions) more often. I myself run my games at 1024x768, but would probably change that when I get a new monitor (I have an old CRT). Those 1920x1200 resolutions are only for people with 24" screens or more.
  • Cali3350 - Saturday, October 13, 2007 - link

    The demo files aren't linked correctly; I can't get them to download in either Firefox or IE.
  • tmx220 - Saturday, October 13, 2007 - link

    Why wasn't the HD 2900 Pro tested?
    The review boasts that the 8800 GTS 320MB is the best value, but clearly it didn't have its price competitor (the HD 2900 Pro) to go up against.
  • Proteusza - Saturday, October 13, 2007 - link

    This test is kinda screwed, I think. They didn't test the 8800 GTS 640, but then claim ATI is the winner at that price point. Hello? You didn't test it, how do you know? They also state that the 320MB version performs identically at lower resolutions and AA settings, which is true, but they didn't test the 640MB version, so we won't know how it performs against the 2900 XT at high res/AA settings. Thanks guys.
  • tonjohn - Saturday, October 13, 2007 - link

    Probably b/c the GTS is already competitive with the XT as it is.
  • 8steve8 - Friday, October 12, 2007 - link

    It's misleading (not saying intentionally... but still...) to compare CPU costs without considering chipset costs. Intel motherboards have higher costs, partially because the memory controller sits in the chipset.

    What is more applicable to us is CPU+motherboard cost comparisons, or CPU+motherboard+RAM (but here we can assume it's all the same DDR2, so it doesn't really matter).

    Just one example (I was recently looking at building a system with HDMI):

    The cheapest Core 2 Duo board with HDMI is $114 shipped at Newegg (an ATI chipset).
    The cheapest Intel chipset board for Core 2 Duo with HDMI is $126 shipped at Newegg.

    The cheapest AM2 board with HDMI is $72 shipped (ATI chipset).
    The cheapest AM2 NVIDIA chipset board with HDMI is $79 shipped.

    So maybe this particular type of motherboard isn't a great example, but here we see an average price difference of over $40.

    For my purposes that means Intel CPUs have to be $40 cheaper with the same performance... (a quick check of this math appears after the comments).

    Before anyone spams me, I agree Intel has better CPUs right now, but comparing cost on the CPU alone is not relevant to consumers, since a CPU is useless without a motherboard.
  • mcnabney - Friday, October 12, 2007 - link

    That is a very valid point to make. However, motherboards and the chipsets on board can also impact performance, so once you start adding more variables this simple article will need a Rosetta Stone to decipher.

    Also, since most of these GPUs can be run in SLI/CrossFire, does either ATI or NVIDIA scale better with a second card?
  • KeithP - Friday, October 12, 2007 - link

    It would have been far more useful with some older GPUs benchmarked.
  • johnsonx - Sunday, October 14, 2007 - link

    Indeed... why spend hours testing with 5 different speeds of the same CPU, yet not even bother with any X1K series or GeForce 7 series GPUs? Do we really need tests to tell us an X2 5000+ is a little faster than an X2 4800+, which is a little faster than a 4600+, which is a little faster than a 4400+, etc.? At most we'd need tests of the fastest and slowest sample of each processor type. I guess for Intel this is mostly what was done, although even a couple of those could be dropped, but for AMD X2s only 2 or maybe 3 needed to be included. Also, how about a couple of common single cores to see how much dual-core benefits the new Source engine, say an A64 3500+ and a 3.2GHz P4?
  • Vidmar - Saturday, October 13, 2007 - link

    No doubt. Here I was expecting to see how the game might run on NVIDIA 7800 or 7900 cards. It's a bit misleading of them to suggest that this would be a comprehensive review; far from it, with only 4 GPUs used.
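As a quick check of the platform-cost arithmetic in 8steve8's comment above, here is a minimal sketch using only the motherboard prices quoted there; the grouping and variable names are just for illustration, not anything from the review itself.

```python
# Minimal check of the average platform-cost gap described in the comment above,
# using only the HDMI motherboard prices quoted there
# (shipped prices at Newegg, October 2007).

core2_boards = [114, 126]   # cheapest Core 2 Duo HDMI boards: ATI chipset, Intel chipset
am2_boards   = [72, 79]     # cheapest AM2 HDMI boards: ATI chipset, NVIDIA chipset

core2_avg = sum(core2_boards) / len(core2_boards)   # 120.0
am2_avg   = sum(am2_boards) / len(am2_boards)       # 75.5

print(f"Average Core 2 Duo board: ${core2_avg:.2f}")
print(f"Average AM2 board:        ${am2_avg:.2f}")
print(f"Average difference:       ${core2_avg - am2_avg:.2f}")  # $44.50
```

The averages work out to $120.00 for the Core 2 Duo boards and $75.50 for the AM2 boards, a $44.50 gap, which matches the "over $40" figure cited in the comment.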
