Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally at the point where games are using DX10’s functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best-looking games in our suite, but as with past Battlefield games that beauty comes with a high performance cost.

Battlefield 3 has been NVIDIA’s crown jewel: a widely played multiplayer game with a clear lead for NVIDIA hardware. With multi-GPU thrown into the picture that doesn’t change, and the GTX 690 once again takes a very clear lead here over the 7970CF at all resolutions. With that said, we see something very interesting at 5760, with NVIDIA’s lead shrinking by quite a bit. What was a 21% lead at 2560 is only a 10% lead at 5760. So far we haven’t seen any strong evidence of NVIDIA being limited by its 2GB of VRAM, and while this isn’t strong evidence that the situation has changed, it does warrant consideration. If anything is going to be VRAM limited, after all, it’s BF3.

Meanwhile, compared to the GTX 680 SLI, the GTX 690 is doing okay here. It only achieves 93% of the GTX 680 SLI’s performance at 2560, but for some reason narrows the gap at 5760, closing to 96% of the performance of the dual video card setup.


200 Comments


  • Death666Angel - Thursday, May 3, 2012 - link

    The few reviews I've seen have the 4GB GTX 680 card between 5% and 10% faster at high resolutions (starting at 2560x1440, up to 7680x1600). Adding some more memory bandwidth on top of that would have made it the gaming card most people expected from nVidia.
    As it stands, the GTX 680 is good, but also very expensive (I can have the 7970 for 65€ less). The GTX 690 is a good product for people who want SLI but don't have the space, PSU, or SLI-enabled mainboard, or who want 4 GPUs.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Sure, link us to a single review that shows us that. It won't be HardOCP nor any as popular, as every review has shown the exact opposite.
  • kallogan - Thursday, May 3, 2012 - link

    Where are the mid-range GPUs?

    Wait. Nvidia doesn't release them because they can't provide enough quantities.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    They're being held back, like the "real 680" top nVidia core, because nVidia is making more money selling the prior launches and the new 2nd-tier, now top-dog, cards.
    It's a conspiracy of profit and win.
  • silverblue - Thursday, May 3, 2012 - link

    Yes, because making a small number of full size Kepler cores is obviously going to make them more money than a large number of less complex Kepler cores. *rolleyes*

    NVIDIA, assuming they had the ability to get them manufactured in large enough quantities, would make far more profit off a 660 or 670 than they ever would off a 680.
  • silverblue - Thursday, May 3, 2012 - link

    (I mean making far more profit off the 660/670 series than the 680 series, not specific cards nor the profit per card)
  • CeriseCogburn - Tuesday, May 8, 2012 - link

    There's a lot of prior-gen stock moving; take a look, you're on the internet, it's not that hard to do. I wonder why you people are always so clueless.
  • CeriseCogburn - Tuesday, May 8, 2012 - link

    For instance, the entire lot of 7870s and 7850s on the egg is outsold by a single GTX 680 from EVGA - confirmed purchasers' reviews.
    So it appears nVidia knows what it's doing in a manner greatly superior to your tiny mind, spewing whatever comes to it in a few moments of raging rebuttal whilst you try to "point out what's obvious" - yet is incorrect.
  • zcat - Thursday, May 3, 2012 - link

    Ditto.

    Every time my Anandtech feed updates, the first thing I'm hoping to see is reviews for the more-reasonably priced, and less power-hoggy GTX 640 (w/GDDR5) and GTX 660 Ti. If we see a review, then at least we know it'll show up at retail very soon after.

    All I want for xmas is a mid-range NVidia card with a higher idle wattage to maximum performance ratio than AMD (because NVidia > AMD wrt drivers, esp under linux).
  • zcat - Thursday, May 3, 2012 - link

    correction: idle Watts <= 10 && max performance >= AMD.
