Metro 2033

Paired with Crysis as our second behemoth FPS is Metro 2033. Metro trades Crysis' lush tropics and frozen wastelands for an underground setting, but even underground it can be brutal on GPUs, which is why it's also our new benchmark of choice for measuring power, temperature, and noise under a gaming load. If its sequel, due this year, is anywhere near as GPU intensive, then a single GPU may not be enough to run the game with every quality feature turned up.

Metro was another game the GTX 680 had trouble with, trailing the 7970 by the slightest bit. With multiple GPUs thrown into the mix, that slight gap has widened significantly, and the GTX 690 once again trails the 7970CF, particularly at 2560 and 5760. In this case the GTX 690 hits only 82% of the 7970CF's performance at 5760, and 84% at 2560. It's only at 1920 (and 100fps) that the GTX 690 catches up. So, much like the GTX 680, NVIDIA isn't necessarily off to a great start here compared to AMD.

Meanwhile, GTX 690 performance relative to the GTX 680 SLI once again looks good here, although not quite as strong as with Crysis. At 5760 the GTX 690 achieves 96% of the SLI setup's performance, and at 2560 it achieves 97%. So far the GTX 690 is more or less living up to NVIDIA's claim of being two GTX 680s on a single card.
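
For reference, the relative performance figures above are simply the GTX 690's framerate divided by the comparison setup's framerate at the same settings. The short Python sketch below illustrates that math; the fps values in it are hypothetical placeholders, not our measured results.

    # Illustrative sketch only: the fps values below are hypothetical
    # placeholders, not the measured numbers from our charts.
    def relative_perf(card_fps: float, reference_fps: float) -> float:
        """Return one card's performance as a percentage of a reference setup."""
        return card_fps / reference_fps * 100

    gtx690_fps = 41.0    # hypothetical GTX 690 average at 5760x1200
    hd7970cf_fps = 50.0  # hypothetical 7970 CrossFire average at 5760x1200

    print(f"GTX 690 vs. 7970CF: {relative_perf(gtx690_fps, hd7970cf_fps):.0f}%")
    # With these placeholder numbers the GTX 690 lands at 82% of the 7970CF,
    # the same kind of ratio quoted above.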

Comments

  • CeriseCogburn - Thursday, May 3, 2012 - link

    Keep laughing, this card can't hold a solid 60fps with v-sync at that "tiny panel" with only 4xAA in the AMD fans' revived favorite game, Crysis.
    Can't do it at 1920X, guy.
    I guess you guys all like turning down your tiny cheap cards' settings all the time, even with your cheapo panels?
    I mean this one can't even keep up at 1920X; gotta turn down the in-game settings, keep the CP tweaked and eased off, etc.
    What's wrong with you guys?
    What don't you get?
  • nathanddrews - Thursday, May 3, 2012 - link

    Currently the only native 120Hz displays (true 120Hz input, not 60Hz frame doubling) are 1920x1080. If you want VSYNC @ 120Hz, then you need to be able to hit at least 120fps @ 1080p. Even the GTX690 fails to do that at maximum quality settings on some games...
  • CeriseCogburn - Thursday, May 3, 2012 - link

    It can't do 60fps with v-sync at 1920 in Crysis, and that's only at 4xAA.
    These people don't own a single high-end card, that's for sure, or something is wrong with their brains.
  • nathanddrews - Thursday, May 3, 2012 - link

    You must be talking about minimum fps, because on Page 5 the GTX 690 is clearly averaging 85fps @ 1080p.

    Tom's Hardware (love 'em or hate 'em) has benchmarks with AA enabled and disabled. Maximum quality with AA disabled seems to be the best way to get 120fps in nearly every game @ 1080p with this card.
  • CeriseCogburn - Friday, May 4, 2012 - link

    You must be ignoring v-sync and stutter with frames that drop below 60, and forget 120 frames a second.
    Just turn down the eye candy... on the 3-year-old console ports that are "holding us back"... at 1920X resolutions.
    Those are the facts, combined with the moaning about ported console games.
    Ignore those facts and you can rant and wide-eyed spew like others: now not only is there enough money for $500 card(s)/$1000 dual, there's extra money for high-end monitors when the current 1920X pukes out even the 690 and CF 7970 on the old console port games.
    Whatever, everyone can continue to bloviate that these cards destroy 1920X, until they look at the held-back settings benches and actually engage their brains for once.
  • hechacker1 - Thursday, May 3, 2012 - link

    Well not if you want to do consistent 120FPS gaming. Then you need all the horsepower you can get.

    Hell, my 6970 struggles to maintain 120FPS, and that makes the game choppy, even though it's only dipping to 80fps or so.

    So now that I have a 120Hz monitor, it's incredibly easy to see stutters in game performance.

    Time for an upgrade (1080p btw).
  • Sabresiberian - Thursday, May 3, 2012 - link

    Actually, they use 5760x1200 because most of us AnandTech readers prefer 1920x1200 monitors, not because they're trying to play favorites.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Those monitors are very rare. Of course none of you have even one.
  • Traciatim - Thursday, May 3, 2012 - link

    My monitor runs 1920x1200, and I specifically went out of my way to get 16:10 instead of 16:9. You fail.
  • CeriseCogburn - Friday, May 4, 2012 - link

    Yes, you went out of your way. Why did you have to, if they're so common? I'm sure you did.
    In any case, since they are so rare, the bias is still present here as shown.
