Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally at the point where games are using DX10’s functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best looking games in our suite, but as with past Battlefield games that beauty comes with a high performance cost.

Battlefield 3 has been NVIDIA’s crown jewel: a widely played multiplayer game with a clear lead for NVIDIA hardware. With multi-GPU thrown into the picture that doesn’t change, with the GTX 690 once again taking a very clear lead over the 7970CF at all resolutions. With that said, we see something very interesting at 5760, where NVIDIA’s lead shrinks by quite a bit: what was a 21% lead at 2560 is only a 10% lead at 5760. So far we haven’t seen any strong evidence of NVIDIA being VRAM limited with only 2GB of VRAM, and while this isn’t strong evidence that the situation has changed, it does warrant consideration. If anything is going to be VRAM limited, after all, it’s BF3.

Meanwhile, compared to the GTX 680 SLI the GTX 690 is doing okay here. It achieves only 93% of the GTX 680 SLI’s performance at 2560, but for some reason closes the gap at 5760, recovering to 96% of the performance of the dual video card setup.
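As a quick sketch of the arithmetic behind figures like these, a "lead" is one configuration's average frame rate over another's, minus one, and "scaling" is a configuration's frame rate as a share of a reference. The fps inputs in the example below are hypothetical placeholders, not data from this review.

```python
def lead_percent(fps_a: float, fps_b: float) -> float:
    """Percentage by which configuration A leads configuration B."""
    return (fps_a / fps_b - 1.0) * 100.0

def scaling_percent(fps_config: float, fps_reference: float) -> float:
    """Performance of a configuration as a share of a reference config."""
    return fps_config / fps_reference * 100.0

# Hypothetical example averages: 84.7 fps vs. 70.0 fps is a 21% lead,
# and 84.7 fps against a 91.1 fps reference is ~93% scaling.
print(round(lead_percent(84.7, 70.0), 1))
print(round(scaling_percent(84.7, 91.1), 1))
```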

200 Comments

  • CeriseCogburn - Thursday, May 3, 2012 - link

    Keep laughing, this card cannot hold a solid v-synced 60 at that "tiny panel" with only 4xAA in the AMD fans' revived favorite game, Crysis.
    Can't do it at 1920X, guy.
    I guess you guys all like turning down your tiny cheap cards' settings all the time, even with your cheapo panels?
    I mean this one can't even keep up at 1920X; gotta turn down the in-game settings, keep the CP tweaked and eased off, etc.
    What's wrong with you guys?
    What don't you get?
  • nathanddrews - Thursday, May 3, 2012 - link

    Currently the only native 120Hz displays (true 120Hz input, not 60Hz frame doubling) are 1920x1080. If you want VSYNC @ 120Hz, then you need to be able to hit at least 120fps @ 1080p. Even the GTX690 fails to do that at maximum quality settings on some games...
  • CeriseCogburn - Thursday, May 3, 2012 - link

    It can't do a v-synced 60 at 1920 in Crysis, and that's with only 4xAA.
    These people don't own a single high-end card, that's for sure, or something is wrong with their brains.
  • nathanddrews - Thursday, May 3, 2012 - link

    You must be talking about minimum fps, because on Page 5 the GTX690 is clearly averaging 85fps @1080p.

    Tom's Hardware (love 'em or hate 'em) has benchmarks with AA enabled and disabled. Maximum quality with AA disabled seems to be the best way to get 120fps in nearly every game @ 1080p with this card.
  • CeriseCogburn - Friday, May 4, 2012 - link

    You must be ignoring v-sync and stutter with frames that drop below 60, and forget 120 frames a second.
    Just turn down the eye candy... on the 3-year-old console ports that are "holding us back"... at 1920X resolutions.
    Those are the facts, combined with the moaning about ported console games.
    Ignore those facts and you can rant and wide-eyed spew like the others: now not only is there enough money for $500 card(s)/$1000 dual, there's extra money for high-end monitors, when the current 1920X pukes out even the 690 and CF 7970 on the old console port games.
    Whatever, everyone can continue to bloviate that these cards destroy 1920X, until they look at the held-back settings benches and actually engage their brains for once.
  • hechacker1 - Thursday, May 3, 2012 - link

    Well not if you want to do consistent 120FPS gaming. Then you need all the horsepower you can get.

    Hell, my 6970 struggles to maintain 120fps, which makes the game choppy, even though it's only dipping to 80fps or so.

    So now that I have a 120Hz monitor, it's incredibly easy to see stutters in game performance.

    Time for an upgrade (1080p btw).
  • Sabresiberian - Thursday, May 3, 2012 - link

    Actually, they use the 5760x1200 because most of us Anandtech readers prefer the 1920x1200 monitors, not because they are trying to play favorites.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Those monitors are very rare. Of course none of you have even one.
  • Traciatim - Thursday, May 3, 2012 - link

    My monitor runs 1920x1200, and I specifically went out of my way to get 16:10 instead of 16:9. You fail.
  • CeriseCogburn - Friday, May 4, 2012 - link

    Yes, you went out of your way; why did you have to, if they're so common? I'm sure you did.
    In any case, since they are so rare, the bias is still present here, as shown.
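The v-sync discussion in the thread above can be made concrete with a little frame-time arithmetic. This is a hedged sketch (not from the article or the commenters): with v-sync at a fixed refresh rate, a frame that misses the per-refresh time budget waits for the next refresh interval, so the delivered rate drops to an integer division of the refresh rate rather than degrading smoothly.

```python
import math

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def delivered_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective on-screen fps with v-sync, assuming (simplistically)
    that every frame takes exactly 1000/render_fps ms to render."""
    frame_time_ms = 1000.0 / render_fps
    # A frame occupies however many whole refresh intervals it overruns.
    intervals = math.ceil(frame_time_ms / frame_budget_ms(refresh_hz))
    return refresh_hz / intervals

# Rendering at 80 fps on a 120 Hz display: each 12.5 ms frame misses
# the ~8.33 ms budget and occupies two refresh intervals, so only
# 60 fps actually reach the screen.
print(delivered_fps(80.0, 120.0))
```

This is why a dip from 120 fps to "only" 80 fps reads as a visible stutter on a 120 Hz panel: the display alternates between full-rate and half-rate delivery rather than showing 80 evenly spaced frames.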
