Battlefield 4

Kicking off our 2015 benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. Since these benchmarks are taken from single-player mode, our rule of thumb, based on our experience, is that multiplayer framerates will dip to roughly half of our single-player framerates, which means a card needs to average at least 60fps here if it’s to hold up in multiplayer.

[Chart: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA]

[Chart: Battlefield 4 - 3840x2160 - Medium Quality]

[Chart: Battlefield 4 - 2560x1440 - Ultra Quality]

After stripping away the Frostbite engine’s expensive (and not wholly effective) MSAA, what we’re left with for BF4 at 4K with Ultra quality puts the GTX Titan X in a pretty good light. At 58.3fps it’s not quite up to the 60fps mark, but it comes very close, close enough that the GTX Titan X should be able to stay above 30fps virtually the entire time, and not drop too far below 30fps even in the worst case scenario. Alternatively, dropping to Medium quality gives the GTX Titan X plenty of headroom: with an average framerate of 94.8fps, the framerate never drops below 45fps even at its lowest.

From a benchmarking perspective, Battlefield 4 at this point is a well-optimized title that serves as a pretty good microcosm of overall GPU performance. In this case we find that the GTX Titan X performs around 33% better than the GTX 980, which is almost exactly in line with our earlier performance predictions. Keep in mind that while the GTX Titan X has 50% more execution units than the GTX 980, it’s also clocked at around 88% of the GTX 980’s clockspeed, so 33% is right where we should be in a GPU-bound scenario.
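To put a number on that expectation, here is the back-of-the-envelope arithmetic, using only the unit-count and clockspeed ratios quoted above:

\[
1.50 \times 0.88 = 1.32 \quad \Rightarrow \quad \approx 32\text{-}33\%\ \text{expected GPU-bound advantage}
\]

which lines up neatly with the ~33% lead we measure here.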

Otherwise compared to the GTX 780 Ti and the original GTX Titan, the performance advantage at 4K is around 50% and 66% respectively. GTX Titan X is not going to double the original Titan’s performance – there’s only so much you can do without a die shrink – but it continues to be amazing just how much extra performance NVIDIA has been able to wring out without increasing power consumption and with only a minimal increase in die size.

On the broader competitive landscape, this is far from the Radeon R9 290X/290XU’s best title, with GTX Titan X leading by 50-60%. However, this is also a showcase title for when AFR goes right, as the R9 295X2 and GTX 980 SLI both shoot well past the GTX Titan X, demonstrating the performance/consistency tradeoff inherent in multi-GPU setups.

Finally, shifting gears for a moment, gamers looking for the ultimate 1440p card will not be disappointed. The GTX Titan X won’t reach 120fps here (it won’t even come close), but at 78.7fps it’s still well suited for driving 1440p144 displays; in fact, it’s the only single-GPU card to average better than 60fps at this resolution.

Comments

  • dragonsqrrl - Tuesday, March 17, 2015 - link

    Had no idea that non-reference Hawaii cards were generally undervolted, resulting in lower power consumption. Source?
  • chizow - Tuesday, March 17, 2015 - link

    There is some science behind it: heat results in higher leakage, which results in higher power consumption. But yes, I agree, the reviews show otherwise; in fact, they show the cards that don't throttle and boost unabated draw even more power, closer to 300W. So yes, that increased perf comes at the expense of higher power consumption, not sure why the AMD faithful believe otherwise.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    Duh. It's because they hate PhysX.
  • Kutark - Tuesday, March 17, 2015 - link

    Yes, some of the new aftermarket designs are cooler and quieter, but they don't use less power; the GPU is what draws the power, and the aftermarket companies can't alter that. They can only tame the beast, so to speak.
  • Yojimbo - Tuesday, March 17, 2015 - link

    Would be a good point if the performance were the same. But the Titan X is 50% faster. The scores are also total system power usage under gaming load, not card usage. Running at 50% faster frame rates is going to tax other parts of the system more, as well.
  • Kutark - Tuesday, March 17, 2015 - link

    You're kidding, right? Your framerate in no way affects your power usage.
  • nevcairiel - Tuesday, March 17, 2015 - link

    Actually, it might. If the GPU is faster, it might need more CPU power, which in turn can increase power draw from the CPU.
  • DarkXale - Tuesday, March 17, 2015 - link

    Of course. It's the entire point of DX12/Mantle/Vulkan/Metal to reduce per-frame CPU work, and as a consequence per-frame CPU power consumption.
  • Yojimbo - Tuesday, March 17, 2015 - link

    The main point of my post is that Titan X gets 50% more performance/system watt. But yes, your frame rate should affect your power usage if you are GPU-bound. The CPU, for instance, will be working harder maintaining the higher frame rates. How much harder, I have no idea, but it's a variable that needs to be considered before testbug00's antecedent can be considered true.
  • dragonsqrrl - Wednesday, March 18, 2015 - link

    Actually, frame rates have a lot to do with power usage.

    I don't think that needs any further explanation, anyone who's even moderately informed knows this, and even if they didn't could probably figure out why this might be the case in about 10 seconds.
