Age of Conan Performance

We've added a number of new tests this time around, but we decided to keep a few familiar games. Age of Conan is one of these, and it's a game that has consistently favored AMD hardware in our testing. As we can see, not much has changed this time around either: none of the NVIDIA hardware can keep up with the AMD Radeon HD 4870 X2 in this test. As for NVIDIA versus NVIDIA, the GeForce GTX 295 splits the difference between the GTX 260 SLI and GTX 280 SLI setups.


[Age of Conan performance graph]

With this game, at 2560x1600 and our quality settings (not even the highest possible), a multi-GPU solution is required for gameplay above 30fps. The nature of the game makes it playable at slightly lower frame rates, but the safe bet is to lower settings or add a second GPU. We don't expect the GTX 285 to keep up with the Radeon HD 4870 1GB here either.
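
For a rough sense of scale, here is a quick back-of-the-envelope calculation (a short Python sketch of our own arithmetic; the only inputs are the resolution and the 30fps figure above, with 1920x1200 used purely as a hypothetical point of comparison):

    # Rough arithmetic: how heavy 2560x1600 is, and the per-frame time budget at 30fps
    pixels_30in = 2560 * 1600    # 4,096,000 pixels per frame on the 30" panel tested here
    pixels_24in = 1920 * 1200    # 2,304,000 pixels on a common 24" panel (for comparison)
    print(f"2560x1600 pushes {pixels_30in / pixels_24in:.2f}x the pixels of 1920x1200")

    frame_budget_ms = 1000 / 30  # ~33.3 ms of render time available per frame at 30fps
    print(f"30fps leaves about {frame_budget_ms:.1f} ms per frame")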

We're also still waiting for the DirectX 10 version of the game to come out, and we will transition to that version when we are able.

Comments

  • SiliconDoc - Monday, January 12, 2009 - link

    Pssst ! The GTX295 wins hands down in both those departments...that's why it's strangely "left out of the equation".
    (most all the other sites already reported on that - heck it was in NVidia's literature - and no they didn't lie - oh well - better luck next time).
  • Amiga500 - Wednesday, January 14, 2009 - link

    Well... to be honest...


    If leaving out power consumption pisses people like you off - good one anandtech!


    (I guess your nice and content your nvidia e-penis can now roam unopposed?)
  • SiliconDoc - Thursday, January 15, 2009 - link

    First of all, it's "you're" when you are referring to my E-PENIS. (Yes, please also capitalize ALL the letters to be properly indicative of size.)
    Second, what were you whining about ?
    Third, if you'd like to refute my points, please make an attempt, instead of calling names.
    Fourth, now you've fallen to the lowest common denominator, pretending to hate a fellow internet poster, and supporting shoddy, slacker work, with your own false fantasy about my temperament concerning power and the article.
    What you failed to realize is that my pointing out the NVidia advantage in that area has actually pissed you off, because the fanboy issue is at your end, since you can't stand the simple truth.
    That makes your epeenie very tiny.
  • kk650 - Tuesday, January 13, 2009 - link

    Remove yourself from the gene pool, fuckwit americunt
  • darckhart - Monday, January 12, 2009 - link

    In all the games where the GTX 295 beats the 4870 X2, it's only a 3-5 fps win. I don't see how that "gets the nod even considering price." At best, that's $10 per frame. I think we need to see thermals and power draw (I don't recall if you talked about these in the earlier article) to better justify that extra $50.
  • JarredWalton - Tuesday, January 13, 2009 - link

    I bought a 4870X2 a couple months back... if I had had the option of the GTX 295, it would have been my pick for sure. I wanted a single-card, dual-GPU solution (because I've got a decent platform but am tired of dual video cards). I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. Until and unless ATI can get driver profiles for CrossFire into their drivers, I don't think I can ever feel happy with the solution. Also, 4870X2 drivers need to bring back the "CrossFire disable" option; we all know there are two GPUs, and there are still occasional games where CrossFire degrades performance compared to a single GPU.
  • TheDoc9 - Monday, January 12, 2009 - link

    Definitely a half-ass review, something I don't expect from Anandtech. Something more to come later?

    Many questions about these cards can still be asked:
    -Testing at other resolutions, not just a recommendation to stay away unless you're playing at 2560x1600 on a 30" monitor.
    -Testing on other rigs, such as a mid-range quad core and a dual core, to give an idea of how the cards might perform for those of us who don't own a highly clocked i7.

    I don't like to sound negative, but honestly there was no enthusiasm written in this preview/review/snapshot/whatever it's supposed to be. Kind of disappointing how every other major site has had complete reviews since launch day. Was this written on the plane trip home?
  • bigonexeon - Friday, January 16, 2009 - link

    I don't see the point of using an Intel i7, as Intel released an article saying the i7's L3 cache has a memory leak, and the only attempted fix is a software patch they're not going to provide until the second generation of i7s. Also, why compare standard cards to a newer card? Why not put the newer revisions of the older cards (the Superclocked and XXX editions, i.e. the latest versions of the old models) up against the newer cards?
  • theprodigalrebel - Monday, January 12, 2009 - link

    Might want to take a second look at the line graph and the table below it.
  • Iketh - Monday, January 12, 2009 - link

    This article was just right. I had no enthusiasm to read about this card because there isn't anything to get excited about. Apparently Derek didn't either. I'm sure that enthusiasm will be there again when a next-gen card appears and there is something new to talk about.

    It's also having to follow the Phenom II article.
