Battlefield 4

Kicking off our 2015 benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single player mode, our rule of thumb, based on experience, is that multiplayer framerates will dip to roughly half of our single player framerates, which means a card needs to average at least 60fps in single player if it’s to hold up in multiplayer.

[Benchmark chart: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA]

[Benchmark chart: Battlefield 4 - 3840x2160 - Medium Quality]

[Benchmark chart: Battlefield 4 - 2560x1440 - Ultra Quality]

After stripping away the Frostbite engine’s expensive (and not wholly effective) MSAA, what we’re left with for BF4 at 4K with Ultra quality puts the GTX Titan X in a pretty good light. At 58.3fps it’s not quite up to the 60fps mark, but it comes very close, close enough that the GTX Titan X should be able to stay above 30fps virtually the entire time, and never drop too far below 30fps in even the worst case scenario. Alternatively, dropping to Medium quality should give the GTX Titan X plenty of headroom, with an average framerate of 94.8fps that by the same rule of thumb should keep even the lowest framerates above 45fps.
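
As a quick illustration, here’s a minimal sketch (Python, not part of the original article) of the rule of thumb applied to these averages; the 0.5 dip factor and the 58.3fps/94.8fps figures come from the text above, while the helper name is purely illustrative.

```python
# Back-of-envelope application of the article's rule of thumb: multiplayer
# framerates dip to roughly half the single player average, so a card wants
# a ~60fps single player average to hold ~30fps in multiplayer.
# The dip factor and helper below are illustrative, not from the article.

MP_DIP_FACTOR = 0.5  # assumed worst-case multiplayer dip vs. the single player average


def estimated_mp_floor(sp_average_fps: float) -> float:
    """Estimate the worst-case multiplayer framerate from a single player average."""
    return sp_average_fps * MP_DIP_FACTOR


for setting, sp_avg in [("4K Ultra (0x MSAA)", 58.3), ("4K Medium", 94.8)]:
    print(f"{setting}: {sp_avg:.1f}fps average -> ~{estimated_mp_floor(sp_avg):.1f}fps estimated multiplayer floor")

# Prints roughly 29fps for Ultra (just shy of a 30fps floor, hence "not quite up
# to the 60fps mark") and roughly 47fps for Medium (comfortably above 45fps).
```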

From a benchmarking perspective Battlefield 4 at this point is a well optimized title that’s a pretty good microcosm of overall GPU performance. In this case we find that the GTX Titan X performs around 33% better than the GTX 980, which is almost exactly in line with our earlier performance predictions. Keep in mind that while the GTX Titan X has 50% more execution units than the GTX 980, it’s also clocked at around 88% of the GTX 980’s clockspeed; 1.5 x 0.88 works out to roughly 1.32, so 33% is right where we should be in a GPU-bound scenario.
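
For those who want the back-of-envelope math spelled out, the sketch below (Python, ours rather than the article’s) multiplies the execution unit ratio by the clockspeed ratio using the published specs of the two cards (3072 vs. 2048 CUDA cores, roughly 1075MHz vs. 1216MHz boost clocks) and lands on the same ~33% figure; treat the clocks as approximate, since real-world GPU Boost behavior varies.

```python
# Rough GPU-bound scaling estimate: (execution unit ratio) x (clockspeed ratio).
# CUDA core counts and boost clocks are the published specs for each card;
# real-world clocks vary with GPU Boost, so treat the result as approximate.

titan_x = {"cuda_cores": 3072, "boost_mhz": 1075}
gtx_980 = {"cuda_cores": 2048, "boost_mhz": 1216}

unit_ratio = titan_x["cuda_cores"] / gtx_980["cuda_cores"]   # 1.50
clock_ratio = titan_x["boost_mhz"] / gtx_980["boost_mhz"]    # ~0.88
expected_scaling = unit_ratio * clock_ratio                  # ~1.33

print(f"Expected GPU-bound advantage: ~{(expected_scaling - 1) * 100:.0f}%")
# -> ~33%, in line with the measured lead over the GTX 980 in this title.
```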

Otherwise compared to the GTX 780 Ti and the original GTX Titan, the performance advantage at 4K is around 50% and 66% respectively. GTX Titan X is not going to double the original Titan’s performance – there’s only so much you can do without a die shrink – but it continues to be amazing just how much extra performance NVIDIA has been able to wring out without increasing power consumption and with only a minimal increase in die size.

On the broader competitive landscape, this is far from the Radeon R9 290X/290XU’s best title, with GTX Titan X leading by 50-60%. However this is also a showcase title for when AFR goes right, as the R9 295X2 and GTX 980 SLI both shoot well past the GTX Titan X, demonstrating the performance/consistency tradeoff inherent in multi-GPU setups.

Finally, shifting gears for a moment, gamers looking for the ultimate 1440p card will not be disappointed. GTX Titan X will not get to 120fps here (it won’t even come close), but at 78.7fps it has enough headroom to make good use of a 1440p144 display, and it’s the only single-GPU card to average better than 60fps at this resolution.

Comments

  • stun - Tuesday, March 17, 2015 - link

    I hope AMD announces R9 390X fast.
    I am finally upgrading my Radeon 6870 to either GTX 980, TITAN X, or R9 390X.
  • joeh4384 - Tuesday, March 17, 2015 - link

    I do not think Nvidia will have that long with this being the only mega GPU on the market. I really wish they allowed partner models of the Titan. I think a lot of people would go nuts over a MSI Lightning Titan or something like that.
  • farealstarfareal - Tuesday, March 17, 2015 - link

    Yes, a big mistake like the last Titan to not allow custom AIB cards. Good likelihood the 390X will blow the doors off the card with many custom models like MSI Lightning, DCU2 etc.

    Also $1000 for this ??! lol is the only sensible response, none of the dual precision we saw in the original Titan to justify that price, but all of the price. Nvidia trying to cash in here, 390X will force them to do a card probably with less VRAM so people will actually buy this overpriced/overhyped card.
  • chizow - Tuesday, March 17, 2015 - link

    Titan and NVTTM are just as much about image, style and quality as much as performance. Its pretty obvious Nvidia is proud of the look and performance of this cooler, and isn't willing to strap on a hunking mass of Al/Cu to make it look like something that fell off the back of a Humvee.

    They also want to make sure it fits in the SFF and Lanboxes that have become popular. In any case I'm quite happy they dropped the DP nonsense with this card and went all gaming, no cuts, max VRAM.

    It is truly a card made for gamers, by gamers! 100% GeForce, 100% gaming, no BS compute.
  • ratzes - Tuesday, March 17, 2015 - link

    What do you think they give up when they add DP? Its the same fabrication, was for titan vs 780ti. If I'm mistaken, the only difference between cards are whether the process screwed up 1 or more of the smps, then they get sold as gaming cards at varying decreasing prices...
  • MrSpadge - Tuesday, March 17, 2015 - link

    Lots of die space, since they used dedicated FP64 ALUs.
  • chizow - Wednesday, March 18, 2015 - link

    @ratzes, it's well documented, even in the article. DP/FP64 requires extra registers for the higher precision, which means more transistors allocated to that functionality. GM200 is only 1Bn more transistors than GK210 on the same process node, yet they managed to cram in a ton more functional units. Now compare GK104 to GM204, 3.5Bn to 5.2Bn, and you can see it's pretty amazing they were even able to logically increase by 1.5x over the GM204, which we know is all gaming, no DP compute also.
  • hkscfreak - Wednesday, March 18, 2015 - link

    Someone didn't read...
  • nikaldro - Tuesday, March 17, 2015 - link

    fanboysm to the Nth p0waH..
  • furthur - Wednesday, March 18, 2015 - link

    which meant fuck all when Hawaii was released
