Battlefield 3

The final action game in our benchmark suite is Battlefield 3, DICE's 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled somewhat by time and driver improvements, but it's still a challenge if you want to hit the highest settings at the highest resolutions with the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single player mode, our rule of thumb here is that multiplayer framerates will dip to half of our single player framerates, so a framerate that looks high here may not be high enough in practice.

Battlefield 3 is another game NVIDIA traditionally does well in, even though both sides have wrung out some rather impressive performance increases over the last year. At 2560 the GTX 780 enjoys a 33% lead over the 7970GE, a 27% lead over the GTX 680, and a massive 85% lead over the GTX 580. That's also fast enough to carry it past 60fps at 2560, which means our minimum framerates should dip no lower than the mid-30s even on the most hectic multiplayer maps.
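Since that halved-framerate rule of thumb frames how we read these numbers, here is a minimal sketch of the arithmetic; the 0.5 factor is the rule of thumb from above, while the card names and framerates are purely illustrative, not measured results:

```python
# Rule of thumb: multiplayer framerates can dip to roughly half of
# single player framerates, so a card wants ~60fps in this benchmark
# to keep a ~30fps multiplayer floor. The 0.5 factor is the article's
# heuristic; the card names and fps figures below are made up.
MP_FACTOR = 0.5
PLAYABLE_FLOOR = 30.0  # fps we consider acceptable in hectic multiplayer

def estimated_mp_floor(sp_fps: float) -> float:
    """Estimate the worst-case multiplayer framerate from a single player result."""
    return sp_fps * MP_FACTOR

for card, sp_fps in [("Card A", 65.0), ("Card B", 48.0)]:
    floor = estimated_mp_floor(sp_fps)
    status = "OK" if floor >= PLAYABLE_FLOOR else "risky"
    print(f"{card}: ~{floor:.0f}fps estimated multiplayer floor ({status})")
```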

 

[Chart: Battlefield 3 - Delta Percentages - 2560x1440 - Ultra]

[Chart: Battlefield 3 - 95th Percentile Frame Time - 2560x1440 - Ultra]

Comments

  • mac2j - Thursday, May 23, 2013 - link

    The problem with $650 vs. $500 at this price point is this:

    I can get 2 x 7950s for <$600 - that's a setup that destroys a 780 for less money.

    Even if you're single-GPU limited, $250 is a lot of extra cash for a relatively small performance gain.
  • Ytterbium - Thursday, May 23, 2013 - link

    I'm disappointed they decided to cut FP64 compute to a 1/24 rate vs. the 1/3 rate on Titan; AMD is a much better value for compute tasks.
  • BiffaZ - Friday, May 24, 2013 - link

    Except most consumer (@home-type) compute is SP, not DP, so it won't make much difference. The 780's SP performance is roughly equal to or higher than AMD's.
  • Nighyal - Thursday, May 23, 2013 - link

    I don't know if this is possible, but it would be great to see a benchmark that showed power, noise and temperature at a standard workload. We can get an inferred idea of performance per watt, but when you're measuring a whole system other factors come into play (you mentioned CPU load scaling with increased GPU performance).

    My interest in this comes from living in a hot climate (Australia), where a computer can throw out a very noticeable amount of heat. The large majority of my usage is light gaming (LoL), but I occasionally play quite demanding single player titles that stretch the legs of my GPU. The heat thrown out is directly proportional to power draw, so being able to clearly see how many fewer watts a system requires for a controlled workload would be a handy comparison for me.

    TL;DR - Please also measure temperature, noise and power at a controlled workload to isolate performance per watt (a worked sketch appears after the comments).
  • BiggieShady - Friday, May 24, 2013 - link

    Kudos on the FCAT and delta percentage metrics. So 32.2% for the 7990 means that on average one frame is present 32.2% longer than the next. Still, it is only an average. Great extra info would be to show the same metric averaged over only the deltas above some threshold, and to plot it for varying thresholds (see the sketch after the comments).
  • flexy - Friday, May 24, 2013 - link

    NV releases a card at a ridiculous price point of $1000. Then they castrate the exact same card, give it a new name to make it look like a "new card", and sell it cheaper than their way-overpriced high-end card. Which, of course, is a "big deal" (sarcasm) given the crazy price of Titan. Either way, I don't like what NV is doing, in the slightest.

    Many ages ago, people could buy *real* top-of-the-line cards, which always cost about $400-$500; today you pay $650 for "trash cards" built from sub-par chips that didn't make the cut for Titan. Nvidia: "Hey, let's just make up a new card and sell those chips too, lols"

    Please AMD, help us!!
  • bds71 - Friday, May 24, 2013 - link

    For what it's worth, I would have liked to see the 780 *truly* fill the gap between the 680 and Titan by offering not only the gaming performance but ALSO the compute performance - if they had gone with a 1/6 or even 1/12 rate to better fill the gap and round out the performance all around, I would HAPPILY pay $650 for this card. As it is, I already have a 690, so I will simply get another for 4K gaming - but a comparison between 3x 780s and 2x 690s (both very close to $2K) at 8+ megapixels would be extremely interesting. Note: 3x 30" monitors could easily be configured for 4800x2560 via NVIDIA Surround or Eyefinity - and I, for one, would love to see THAT review!!
  • flexy - Friday, May 24, 2013 - link

    Well, compute performance is the other thing, along with their questionable GPU throttling, aka "boost" (yeah, right) technology. Paying a premium for such a card and getting weak compute performance in exchange, compared to older-gen cards or the AMD offerings... Seriously, there is a lot not to like about Kepler, at least from an enthusiast point of view. I hope NV doesn't continue down this route, with their cards becoming less attractive while prices go up.
  • EJS1980 - Wednesday, May 29, 2013 - link

    Cynical much?
  • ChefJeff789 - Friday, May 24, 2013 - link

    Glad to see the significant upgrade. I just hope AMD forces prices back down again soon, and that the AMD release "at the end of the year" is closer to September than December. It'll be interesting to see how they stack up. BTW, I have shied away from AMD cards ever since I owned an X800 and had SERIOUS issues with the Catalyst drivers (constant blue screens; I had to do a clean Windows install to even get the card working for longer than a few minutes). I know this was a long time ago, and I've heard from numerous people that they're better now. Is this true?
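
Picking up on Nighyal's request above, here is a minimal sketch of the controlled-workload comparison he describes: run every card through the same fixed workload and compare energy per frame rather than raw framerate. This is our own illustration, not AnandTech's methodology, and every card name and number in it is hypothetical:

```python
# Hypothetical controlled-workload comparison: each card renders the
# same fixed number of frames, so energy per frame isolates efficiency
# (and thus heat output) from raw performance. All figures are made up.
runs = [
    # (card, frames rendered, average system power in watts, run length in seconds)
    ("Card A", 18_000, 310.0, 300.0),
    ("Card B", 18_000, 255.0, 300.0),
]

for card, frames, watts, seconds in runs:
    joules = watts * seconds           # total energy drawn over the run
    j_per_frame = joules / frames      # lower = less heat per unit of work
    fps_per_watt = (frames / seconds) / watts
    print(f"{card}: {j_per_frame:.2f} J/frame, {fps_per_watt:.3f} fps/W")
```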
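And as a rough illustration of the delta percentage metric BiggieShady describes above, here is a minimal sketch; it reflects our reading of the metric and of his proposed threshold variant, not AnandTech's actual FCAT tooling, and the frame times are invented:

```python
import statistics

# Made-up frame times in milliseconds; a couple of large jumps stand in
# for the stutter that delta percentages are meant to capture.
frame_times = [16.6, 17.1, 24.9, 16.8, 16.7, 33.2, 16.9, 17.0]

def delta_percentage(times, threshold_ms=0.0):
    """Average |difference| between consecutive frame times, expressed as
    a percentage of the mean frame time. With threshold_ms > 0, only
    deltas above the threshold are averaged (the suggested variant)."""
    deltas = [abs(b - a) for a, b in zip(times, times[1:])]
    kept = [d for d in deltas if d > threshold_ms]
    if not kept:
        return 0.0
    return 100.0 * (sum(kept) / len(kept)) / statistics.mean(times)

print(f"all deltas: {delta_percentage(frame_times):.1f}%")
for t in (1.0, 5.0, 10.0):  # varying thresholds, as proposed above
    print(f"threshold {t}ms: {delta_percentage(frame_times, t):.1f}%")

# The 95th percentile frame time (the other chart's metric), taken as a
# simple quantile over the sampled frame times:
p95 = statistics.quantiles(frame_times, n=20)[-1]
print(f"95th percentile frame time: {p95:.1f}ms")
```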
