Left 4 Dead Analysis

Based on Valve's Source engine, Left 4 Dead runs fairly smoothly at maximum settings with every card we tested, at every resolution. The game is not bad looking either, so getting playable framerates on the Radeon HD 4850 at 2560x1600 is no small feat. We test with a custom demo built around a heavy swarm of zombies in an outdoor area, and though the performance impact is as heavy as we could make it in our benchmark, it may still be possible to hit situations where the lowest-end cards stutter with lots of enemies around at very high resolution.




[Chart: 1680x1050 / 1920x1200 / 2560x1600]


At 1680x1050 and 1920x1200, CrossFire and single ATI cards tend to do better than their NVIDIA counterparts. The GTX 295 does hang out near the top, though. Oddly, 9800 GTX+ SLI does better than the two-card GT200 solutions. If only the single-card GTX 295 performed better than the two-card options, we would speculate that a bus bandwidth or latency issue was causing problems, but it seems that something else is limiting the performance of the NVIDIA GT200 SLI configurations. Of course, with the high framerates we see, we aren't exactly complaining. We recommend turning on triple buffering for this game to eliminate tearing while minimizing the input latency that plain vsync can add.
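To illustrate why we prefer triple buffering over plain vsync, the sketch below (our own illustration, not part of the test suite; the 60 Hz refresh rate and frame times are made up) shows how double-buffered vsync quantizes frame times to whole refresh intervals, while triple buffering lets the displayed frame rate track the render rate:

    import math

    REFRESH_HZ = 60
    REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

    def effective_frame_time_ms(render_ms, triple_buffered):
        """Approximate time between displayed frames, in milliseconds."""
        if triple_buffered:
            # Rendering continues into a third buffer, so the displayed frame
            # rate is only capped by the refresh rate, not quantized by it.
            return max(render_ms, REFRESH_MS)
        # Double-buffered vsync: the buffer swap waits for the next vblank,
        # so frame time rounds up to a whole number of refresh intervals.
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    for render_ms in (15.0, 18.0, 25.0):
        vsync = effective_frame_time_ms(render_ms, triple_buffered=False)
        triple = effective_frame_time_ms(render_ms, triple_buffered=True)
        print(f"{render_ms:4.1f} ms render: vsync only {1000 / vsync:5.1f} fps, "
              f"triple buffered {1000 / triple:5.1f} fps")

A frame that takes 18 ms to render just misses the 16.7 ms vblank window, so plain vsync drops it to about 33 fps, while triple buffering keeps it near 56 fps.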




[Chart: 1680x1050 / 1920x1200 / 2560x1600]


Performance at 2560x1600 shifts the playing field, putting SLI and CrossFire on more equal footing. While NVIDIA's GTX 260 options lead the competing 4870 class options, the 4850 does very well against its competition. There are no real disappointments with any multi-GPU solution in this game, though.

When it comes to scaling, at lower resolutions we see CPU/system-limited performance capping the improvement possible with multiple GPUs. The 9800 GTX+ and 4850 are the only cards that see any real improvement at 1680x1050, and the gains aren't much better at 1920x1200. Moving up to 2560x1600, we finally see most of the players above 50% scaling. The exceptions are the configurations built on the fastest single GPUs in this test, the GTX 280 and GTX 285.
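For reference, the scaling figures above are simply the percentage gained by adding a second GPU over a single card's result; the sketch below works through the arithmetic with made-up frame rates (none of these numbers come from our charts):

    def scaling_pct(single_gpu_fps, dual_gpu_fps):
        """Percent improvement from the second GPU (100% would be perfect doubling)."""
        return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

    # Hypothetical numbers for illustration only.
    examples = {
        "CPU/system limited (low resolution)": (110.0, 118.0),  # second GPU barely helps
        "GPU limited (2560x1600)": (45.0, 72.0),                # comfortably above 50% scaling
    }
    for label, (one_gpu, two_gpu) in examples.items():
        print(f"{label}: {scaling_pct(one_gpu, two_gpu):.0f}% scaling")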




[Chart: 1680x1050 / 1920x1200 / 2560x1600]


Between the two "low" resolutions we test, there's no change in the value lineup even though there are shifts in the performance lineup. At these resolutions, multi-GPU options tend not to be a good investment because of the CPU/system limit. The exceptions are 4850 CrossFire and 9800 GTX+ SLI, where the single-card options are GPU limited and two GPUs deliver better scaling for the money. At the higher resolution, value compresses more: the 4870 1GB drops in value while NVIDIA hardware pushes up a bit.
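The value metric here is just average frame rate per dollar; the sketch below shows the calculation with placeholder prices and frame rates (not the figures behind our chart) and why a CPU-limited pair of cards scores poorly:

    def fps_per_dollar(avg_fps, street_price_usd):
        return avg_fps / street_price_usd

    # Hypothetical cards and prices for illustration only.
    cards = {
        "hypothetical single card":        (62.0, 150.0),   # (avg fps, price in USD)
        "same card doubled, good scaling": (100.0, 300.0),
        "same card doubled, CPU limited":  (68.0, 300.0),
    }
    for name, (fps, price) in cards.items():
        print(f"{name}: {fps_per_dollar(fps, price):.3f} fps per dollar")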

Comments

  • Hauk - Monday, February 23, 2009 - link

    To you grammer police... get a life will ya?!?

    Who gives a rats ass! It's the data!

    Your smug comments are of ZERO value here. You want to critique, go to a scholarly forum and do so.

    Your whining is more of a distraction! How's that for gramaticly correct?
  • Slappi - Tuesday, February 24, 2009 - link


    It should be grammar not grammer.


  • SiliconDoc - Wednesday, March 18, 2009 - link

    Grammatically was also spelled incorrectly.
    lol
  • The0ne - Monday, February 23, 2009 - link

    "In general, more than one GPU isn't that necessary for 1920x1200 with the highest quality settings,..."

    I see many computer setups with 22" LCDs and lower that have high end graphic cards. It just doesn't make sense to have a high end card when you're not utilizing the whole potential. Might as well save some money up front and if you do need more power, for higher resolutions later, you can always purchase an upgrade at a lower cost. Heck, most of the time there will be new models out :)

    Then again, I have a quad-core CPU that I don't utilize too but... :D
  • 7Enigma - Monday, February 23, 2009 - link

    Everyone's situation is unique. In my case I just built a nice C2D system (OC'd to 3.8GHz with a lot of breathing room up top). I have a 4870 512meg that is definitely overkill with my massive 19" LCD (1280X1024). But within the year I plan on giving my dad or wife my 19" and going to a 22-24". Using your logic I should have purchased a 4850 (or even 4830) since I don't NEED the power. But I did plan ahead to future proof my system for when I can benefit from the 4870.

    I think many people also don't upgrade their systems near as frequently as some of the enthusiasts do. So we spend a bit more than we would need to at that particular time to futureproof a year or two ahead.

    Different strokes and all that...
  • strikeback03 - Monday, February 23, 2009 - link

    The other side of the coin is that most likely for similar money, you could have bought something now that more closely matches your needs, and a 4870 in a year once it has been replaced by a new card if it still meets your needs.
  • 7Enigma - Tuesday, February 24, 2009 - link

    Of course. Or I could spend $60 now, another $60 in 3 months, and you see the point. It's all dependent on your actual need, your perceived need, and your desire to not have to upgrade frequently.

    I think the 4870 is one of those cards like the ATI 9800pro that has a perfect combination of price and performance to be a very good performer for the long haul (similarly to how the 8800GTS was probably the best part from a price/performance/longevity standpoint if you were to buy it the day it first came out).

    Also important is looking at both companies and seeing what they are releasing in the next 3-6 months for your/my particular price range. Everything coming out seems to be focused either on the super high end, or the low end. I don't see any significant mid-range pieces coming out in the next 3-6 months that would have made me regret my purchase. If it was late summer or fall and I knew the next round of cards were coming out I *may* have opted for a 9600GT or other lower-midrange card to hold over until the next big thing but as it stands I'll get easily a year out of my card before I even feel the need to upgrade.

    Frankly the difference between 70fps and 100fps at the resolutions I would be playing (my upgrade would be either to a 22 or 24") is pretty moot.
  • armandbr - Monday, February 23, 2009 - link

    http://www.xbitlabs.com/articles/video/display/rad...

    here you go

  • Denithor - Monday, February 23, 2009 - link

    Second paragraph, closing comments: "This unique card really shined and held it's own all the way up to 2560x1600."

    Fourth paragraph, closing comments: "But when were talking multiple graphics cards for rendering it's really not worth it without the highest resolution around."


    Please remove the apostrophe from the first sentence (where it should read its) and instead move it to the second (which should be we're).

    Otherwise excellent article. This is the kind of work I remember from years past that originally brought me to the site.

    One thing - would it be too difficult to create a performance/watt chart based on a composite performance score for each single/pair of cards?

    I do think you really pushed the 4850X2 a bit too much. The 9800GTX+ provides about the same level of performance (better in some cases, worse in others) and the SLI version manages to kick the crap out of the GTX 280/285 nearly across the board (with the exception of a couple of 2560x1600 memory-constricted cases) at a lower price point. That's actually in my mind one of the best performance values available today.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Forget about Derek removing the apostrophe, how about removing the raging red fanboy ati drooling ?
    When the GTX260 SLI scores the 20 games runs of 21, and the 4850 DOESN'T, Derek is sure to mention not the GTX260, and on the very same page blab the 4850 sapphire "ran every test"...
    This is just another red raging fanboy blab - so screw the apostrophe !
    Nvidiai DISSED 'em because they can see the articles Derek posts here bleeding red all over the place.
    DUH.
