FarCry 2 Performance

FarCry 2 can favor either AMD or NVIDIA depending on the area of the game tested, so we tried to go with a fairly balanced test.

[FarCry 2 performance chart]

The GTX 285 musters a 10% performance improvement here, but it still trails the 4870 X2, which (again) holds over an 18% performance advantage for a little more than 13% higher cost.
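
The value math is straightforward; here's a back-of-the-envelope sketch in Python using illustrative prices and frame rates (not our exact measured numbers):

    # Illustrative street prices (USD) and FarCry 2 averages; see the charts for measured data.
    gtx285_price, x2_price = 380.0, 430.0
    gtx285_fps, x2_fps = 50.0, 59.0

    price_premium = (x2_price / gtx285_price - 1) * 100   # roughly 13% more expensive
    perf_advantage = (x2_fps / gtx285_fps - 1) * 100      # roughly 18% faster

    print(f"4870 X2: +{perf_advantage:.0f}% performance for +{price_premium:.0f}% cost")
    print(f"FPS per dollar: GTX 285 {gtx285_fps / gtx285_price:.3f}, 4870 X2 {x2_fps / x2_price:.3f}")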


[FarCry 2 resolution scaling line chart]

There are three performance tiers here. We've got the single-GPU solutions in the bottom tier, the two single-card dual-GPU offerings bunched up with GTX 260 SLI in the middle, and GTX 280 and 285 SLI taking the lead. The GeForce GTX 285 holds its own against all the other single-GPU setups.

76 Comments

  • nyran125 - Friday, January 16, 2009

    Even the 8800 GTS 512MB still runs all these games at a decent 30-80 fps, so there's no point in upgrading at the moment until the next big graphical enhancement or graphical revolution in games, like the next big Crysis or Oblivion-type leap. Until then, it's a complete waste of money buying a newer card if you already have an 8800. You know that when the next big graphics game comes out, like a new Elder Scrolls (Oblivion) or something, the newest card out at that time won't even be enough to run it until six months to a year after the game is out. Sometimes it takes even longer for the graphics cards to catch up to the games on maximum settings. So stop wasting your money lol.
  • Captain828 - Friday, January 16, 2009

    so you're telling me you're getting >30FPS when playing Crysis @ Very High, 1680x1050 + 2xAA + 4xAF ??!
  • ryedizzel - Thursday, January 15, 2009

    Not sure if anyone has said this yet but why the heck are you guys running all your benchmarks at 2560x1600? I mean seriously, how many people are really using monitors that big? AT THE LEAST please show some benchmarks for 1680x1050!
  • Iketh - Friday, January 16, 2009

    lol they are there... are you unable to read a line chart?
  • JarredWalton - Friday, January 16, 2009

    You don't buy parts like the GTX 285/295 or 4870X2 to play at 1680x1050, for the most part. In fact, 2560x1600 on a 30" LCD is the primary reason I bought a 4870X2 around the time Fallout 3 came out. You can see that even at higher resolutions, several titles are somewhat system limited (i.e., the GPUs are so powerful that the rest of the system, not the GPU subsystem, becomes the bottleneck).
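
    To put numbers on that (toy figures, not our measurements): the frame rate you actually see is roughly the minimum of what the rest of the system can prepare and what the GPU can render, so once the GPU is fast enough, every card hits the same ceiling.

        # Toy model of a system-limited benchmark; all numbers are made up.
        cpu_feed_fps = 75.0  # hypothetical: frames/sec the CPU+platform can prepare
        gpu_render_fps = {"GTX 285": 90.0, "4870 X2": 120.0}  # hypothetical render rates

        for card, gpu_fps in gpu_render_fps.items():
            delivered = min(cpu_feed_fps, gpu_fps)
            limit = "system" if cpu_feed_fps < gpu_fps else "GPU"
            print(f"{card}: {delivered:.0f} fps ({limit} limited)")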
  • MadMan007 - Friday, January 16, 2009

    That's certainly true, and I think we understand why the charts are for the highest resolution; it's also nice that you provide data for lower resolutions. Aside from making a graph for each resolution, perhaps it would be possible to make them interactive somehow: say I click on 1920x1200 below the graph, and that data gets charted. What would be really top notch is if I could choose which cards and which resolutions to compare (see the sketch below).
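
    Even a toy filter over the results table would cover it; this is just a sketch in Python with made-up numbers and field names, not AnandTech's actual data:

        # Hypothetical benchmark rows; the real dataset would come from the review.
        results = [
            {"card": "GTX 285", "res": "1920x1200", "fps": 62.0},
            {"card": "GTX 285", "res": "2560x1600", "fps": 50.0},
            {"card": "4870 X2", "res": "1920x1200", "fps": 71.0},
            {"card": "4870 X2", "res": "2560x1600", "fps": 59.0},
        ]

        def chart_data(res, cards):
            # Keep only the rows matching the reader's selections.
            return [r for r in results if r["res"] == res and r["card"] in cards]

        # e.g. the reader clicked "1920x1200" and ticked both cards:
        for row in chart_data("1920x1200", {"GTX 285", "4870 X2"}):
            print(f"{row['card']:>8}: {'#' * int(row['fps'] // 2)} {row['fps']:.0f} fps")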
  • GhandiInstinct - Friday, January 16, 2009

    MadMan,

    I only wish they did that. Then their reviews would be my #1 source.
  • Stonedofmoo - Friday, January 16, 2009

    That's just the point, though. Most people are still running 22" monitors at 1680x1050. We don't NEED the top-end powerful cards that Nvidia and ATI seem only interested in building.

    What we're looking for are upper-midrange parts, like a hypothetical GTX 240/220 if they were to exist, to replace the aging and now redundant GeForce 9 series.

    Seriously:
    ATI have more midrange parts than Nvidia but really need to work on their rubbish power consumption, especially at idle.
    Nvidia have the power consumption sorted but need to actually ship some midrange parts.

    Both need to refocus. I've never seen Nvidia go for so long without releasing a completely new series of cards from top to bottom end.
  • SiliconDoc - Monday, January 19, 2009

    Well, more cards are always better and keep us entertained and interested, but this continuous call for "midrange" from NVidia is somewhat puzzling to me.
    Since the 9800 series takes on the 4850, then there's the 9600 on down and the 8800 GTX -- I mean, it's all covered...
    ATI just released the 4830, their newest core crippled to take on the 9800 GT, or so they claim...
    I guess if I were NVidia I wouldn't waste my company's time or money on people wanting to read "new technology reviews" of cards that REFILL an already filled space where the competition, after 2 years of near NADA, only just brought something to market.
    Since ATI was dang out of it for so long - why should NVidia retool the GT200 core to STOMP all over their own 9800, 8800, 9600 and the like?
    You want them to destroy their own line so you can say "Hey, new tech in the midrange - now I can decide if I want a 4830 or 4670 or 4850 or one of these crippled GT200s," then moments later say to yourself, "Wait a minute, why should I get rid of my 9800 GTX?!"...
    I mean REALLY...
    Would someone please explain to me what I'm missing?
    It's all "I want a crippled cheap GT200 core," isn't it?
    Maybe part of it is: why are we still here when the 8800 was released in Nov 2006?
    Maybe the question should be why ATI is still taking on two-year-old NVidia tech.
    AMD just took another huge charge loss from its ATI division, and I'm not certain NV is doing much better (though GPU-Z shows 65% of the market is NV's) - so why would NV do an expensive die/core rollout that crushes its already-standing cores, which compete with ATI's midrange just fine?
    It just does not make any sense.
