Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it's the first AAA DX10+ game. It's been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we're finally at the point where games are using DX10's functionality as a baseline rather than an addition. Not surprisingly, BF3 is one of the best-looking games in our suite, but as with past Battlefield games, that beauty comes with a high performance cost.

[Benchmark charts: Battlefield 3 - 2560x1600 - Ultra Quality + FXAA-High; 1920x1200 - Ultra Quality + 4xMSAA; 1920x1200 - Ultra Quality + FXAA-High; 1680x1050 - High Quality + FXAA-High]

NVIDIA’s cards have always done well at Battlefield 3, which puts the Radeon HD 7900 series in a bad position from the beginning. Short of the GTX 680’s massive lead in the Portal 2 bonus round, this is the single biggest victory for the GTX 680 over the 7970, beating AMD’s best by 28% at 2560, and by continually higher amounts at lower resolutions. Based on our experience with BF3 I’d hesitate to call the 680 fully fluid at 2560 as large firefights can significantly tear into performance relative to Thunder Run, but if it’s not fully fluid then it’s going to be very, very close.

What's also interesting here is that once again the GTX 680 does very well compared to the dual-GPU cards. The GTX 590 and 6990 never pull away from the GTX 680, and at 1920 with FXAA the GTX 680 finally squeaks by and takes the top of the chart. Performance relative to the GTX 580 is once again good as well, with the GTX 680 beating its predecessor by 48% at almost every resolution.
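As a quick aside on how the relative-performance figures quoted here are derived, the following is a minimal sketch that turns two average frame rates into a percentage lead. The FPS values are hypothetical placeholders, not our measured benchmark results.

```python
# Minimal sketch: computing one card's percentage lead over another.
# The frame rates used here are hypothetical placeholders, not our benchmark data.

def percent_lead(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, expressed as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# e.g. 64 FPS vs. 50 FPS works out to a 28% lead.
print(f"{percent_lead(64.0, 50.0):.0f}%")  # -> 28%
```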

404 Comments

  • chizow - Thursday, March 22, 2012 - link

    Did not see OC results listed anywhere, will they be added to this article or an addendum/supplement later?
  • Ryan Smith - Thursday, March 22, 2012 - link

    It will be added to this article. We may also do a separate pipeline article, but everything will be added here for future reference.
  • SlyNine - Thursday, March 22, 2012 - link

    Adaptive v-sync, is that like triple buffering? Never heard of it, but it sounds interesting.
  • iwod - Thursday, March 22, 2012 - link

    In those high-res tests it seems it is seriously limited by bandwidth. If you could get 7GHz GDDR5 or MemoryCube, I guess it would have performed MUCH better (see the bandwidth sketch after the comments).

    I think we finally got back to basics: what the GPU is all about. Graphics. We have been spending too much time doing GPGPU, which only benefits a VERY small percentage of the consumer market.
  • PeteRoy - Thursday, March 22, 2012 - link

    Unfortunately there is no game that can benefit from the power this card has to give.

    We are in 2012 and PC gaming graphics are stuck in 2007, with Crysis as the last game to push the limits of video cards.

    Since 2006, when the PlayStation 3 and Xbox 360 took over the gaming industry, all games have been made for these consoles, and therefore all games have the graphics technology of DirectX 9.

    Battlefield 3, at the end of 2011, still doesn't look as good as Crysis from 2007.
  • Sabresiberian - Thursday, March 22, 2012 - link

    YOU have no use for this card because you are clearly playing Tetris at 1024x768. Other people have much higher resolutions and play much more demanding games.

    Some of them even - GASP! - run more than one monitor. Imagine that.

    ;)
  • CeriseCogburn - Thursday, March 22, 2012 - link

    I agree. The 7970 isn't doing it for me @ 19x12 w SB@4800x4. MOAR.
  • SlyNine - Thursday, March 22, 2012 - link

    Crysis 2 with full DX11 plus the texture pack totally tears my 5870 a new one. Unless these cards are now pumping out 3x the FPS, that performance is needed.
  • Ahnilated - Thursday, March 22, 2012 - link

    I waited all this time hoping the GTX 680 would be an awesome card to upgrade to from my GTX 480s that I run in SLI. Well, I am very disappointed now. I guess I'll continue waiting.
  • Pessimism - Thursday, March 22, 2012 - link

    Do not forget that NVIDIA knowingly and willingly peddled defective chips to multiple large vendors. The real question is whether they have mastered the art of producing chips that will remain bonded to the product they are powering, and whether they will turn to slag once reaching normal operating temperatures.
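For reference, the bandwidth speculation in iwod's comment above can be sanity-checked with the standard GDDR5 peak-bandwidth formula: bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8. A minimal sketch, using the GTX 680's 256-bit bus and 6Gbps (6GHz effective) GDDR5 alongside the hypothetical 7Gbps case from the comment:

```python
# Minimal sketch of the GDDR5 peak-bandwidth math referenced in the comment above.
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
# The 256-bit bus and 6 Gbps rate match the GTX 680's published specs;
# the 7 Gbps case is the hypothetical raised in the comment.

def gddr5_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr5_bandwidth_gb_s(6.0, 256))  # 192.0 GB/s (GTX 680 as shipped)
print(gddr5_bandwidth_gb_s(7.0, 256))  # 224.0 GB/s (hypothetical 7 Gbps GDDR5)
```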
