Fallout 3 Performance

Fallout 3 is based on the Gamebryo engine that powered Oblivion, though it has gone through modifications and tweaks along the way. The game looks great with all the settings cranked, and it's pretty fun as well. We disabled vsync and set iPresentInterval to zero, but we still saw some quirks with framerate. It seems we may be system limited in some ways, which is impressive given our test system. In the future we will try to refine our test to make it more graphically intense through some .ini file tweaks and see if that helps.
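For those who want to replicate the vsync tweak, here is roughly what it looks like in Fallout 3's .ini file. This is a sketch assuming the stock FALLOUT.INI layout; the setting normally lives under the [Display] section, and the exact file location can vary by install.

    [Display]
    ; 1 is the default and syncs frame presentation to the display refresh;
    ; 0 disables vsync so the framerate can run uncapped during benchmarks
    iPresentInterval=0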



Framerates are compressed up near 77 frames per second, and the lead the NVIDIA hardware holds here is compressed along with them. Running around the world, it is clear the framerate is not capped: we saw maximums of over 110 FPS, but the benchmark we ran simply comes out this way. It is a straight-line outdoor run at twilight.

Again, the single Radeon HD 4870 1GB leads the GTX 260, but GTX 260 SLI outperforms the 4870 X2, indicating an SLI advantage over CrossFire (or an NVIDIA driver advantage over AMD; without more low-level technical data we can't assess exactly what the issue is). But as this article focuses on the GTX 295, even though the gap looks small, we can say this one goes to NVIDIA as well.

Comments

  • SiliconDoc - Monday, January 12, 2009 - link

    Pssst ! The GTX295 wins hands down in both those departments...that's why it's strangely "left out of the equation".
    (most all the other sites already reported on that - heck it was in NVidia's literature - and no they didn't lie - oh well - better luck next time).
  • Amiga500 - Wednesday, January 14, 2009 - link

    Well... to be honest...


    If leaving out power consumption pisses people like you off - good one anandtech!


    (I guess your nice and content your nvidia e-penis can now roam unopposed?)
  • SiliconDoc - Thursday, January 15, 2009 - link

    First of all, it's "you're" when you are referring to my E-PENIS. (Yes, please also capitalize ALL the letters to be properly indicative of size.)
    Second, what were you whining about ?
    Third, if you'd like to refute my points, please make an attempt, instead of calling names.
    Fourth, now you've fallen to the lowest common denominator, pretending to hate a fellow internet poster, and supporting shoddy, slacker work, with your own false fantasy about my temperament concerning power and the article.
    What you failed to realize is me pointing out the NVidia advantage in that area, has actually pissed you off, because the fanboy issue is at your end, since you can't stand the simple truth.
    That makes your epeenie very tiny.
  • kk650 - Tuesday, January 13, 2009 - link

    Remove yourself from the gene pool, fuckwit americunt
  • darckhart - Monday, January 12, 2009 - link

    From all the games where the GTX 295 beats the 4870 X2, it's only a 3-5 fps win. I don't see how that "gets the nod even considering price." At best, that's $10 per frame. I think we need to see thermals and power draw (I don't recall if you talked about these in the earlier article) to better justify that extra $50.
  • JarredWalton - Tuesday, January 13, 2009 - link

    I bought a 4870X2 a couple months back... if I had had the option of the GTX 295, it would have been my pick for sure. I wanted single-slot, dual-GPU (because I've got a decent platform but am tired of dual video cards). I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. Until and unless ATI can get driver profiles into their drivers for CrossFire, I don't think I can ever feel happy with the solution. Also, 4870X2 drivers need to bring back the "CrossFire disable" option in the drivers; we all know there are two GPUs, and there are still occasional games where CrossFire degrades performance over a single GPU.
  • TheDoc9 - Monday, January 12, 2009 - link

    Definitely a half-ass review, something I don't expect from Anandtech. Something more to come later?

    Many questions on these cards can still be asked:
    -Testing at other resolutions, not just a recommendation to stay away unless playing at 2500 res. on a 30" monitor.
    -Testing on other rigs, such as a mid-range quad core and dual core, to give us an idea of how it might perform for those of us who don't own a mega-clocked i7.

    I don't like to sound negative, but honestly there was no enthusiasm written in this preview/review/snapshot/whatever it's supposed to be. Kind of disappointing how every other major site has had complete reviews since launch day. Was this written on the plane trip home?
  • bigonexeon - Friday, January 16, 2009 - link

    I don't see the point of using an Intel i7, as Intel released an article saying the i7's level 3 cache has a memory leak, and the only attempt at fixing it is a software patch; they're not going to fix it until the second generation of i7s. Also, why compare standard cards to a newer card? Why not put the older cards' newer designs (the Superclocks, XXX, which are the latest editions of the old models) up against the newer cards?
  • theprodigalrebel - Monday, January 12, 2009 - link

    Might want to take a second look at the line graph and the table below it.
  • Iketh - Monday, January 12, 2009 - link

    This article was just right. I had no enthusiasm to read about this card because there isn't anything to get excited about. Apparently Derek didn't either. I'm sure there will be that enthusiasm again when a next-gen card appears and there is something new to talk about.

    It's also having to follow the Phenom II article.
