Battlefield 4

Kicking off our 2015 benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single player mode, our experience-based rule of thumb is that multiplayer framerates will dip to as low as half of single player framerates, which means a card needs to average at least 60fps in single player if it’s to hold up in multiplayer.

[Chart: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA]

[Chart: Battlefield 4 - 3840x2160 - Medium Quality]

[Chart: Battlefield 4 - 2560x1440 - Ultra Quality]

After stripping away the Frostbite engine’s expensive (and not wholly effective) MSAA, what we’re left with for BF4 at 4K with Ultra quality puts the GTX Titan X in a pretty good light. At 58.3fps it’s not quite up to the 60fps mark, but it comes very close, close enough that the GTX Titan X should be able to stay above 30fps virtually the entire time, and never drop too far below 30fps in even the worst case scenario. Alternatively, dropping to Medium quality should give the GTX Titan X plenty of headroom, with an average framerate of 94.8fps meaning that even in the worst case the framerate should never drop below 45fps.

From a benchmarking perspective Battlefield 4 at this point is a well optimized title that’s a pretty good microcosm of overall GPU performance. In this case we find that the GTX Titan X performs around 33% better than the GTX 980, which is almost exactly in-line with our earlier performance predictions. Keep in mind that while the GTX Titan X has 50% more execution units than the GTX 980, it’s also clocked at around 88% of the GTX 980’s clockspeed, so a 33% advantage is right where we should be in a GPU-bound scenario.
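That scaling estimate is simple arithmetic on the two ratios from the spec sheets; a quick back-of-the-envelope sketch (using the 1.5x execution unit and ~0.88x clockspeed figures cited above):

```python
# Rough GPU-bound throughput estimate: more execution units scale
# performance up, while a lower clockspeed scales it back down.
unit_ratio = 1.5    # GTX Titan X has 50% more execution units than GTX 980
clock_ratio = 0.88  # ...but runs at roughly 88% of the GTX 980's clockspeed

expected_speedup = unit_ratio * clock_ratio  # ~1.32x overall
print(f"Expected GPU-bound advantage: ~{(expected_speedup - 1) * 100:.0f}%")
```

This lines up with the ~33% lead measured here, suggesting the benchmark is indeed shader/clock bound rather than limited elsewhere.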

Otherwise compared to the GTX 780 Ti and the original GTX Titan, the performance advantage at 4K is around 50% and 66% respectively. GTX Titan X is not going to double the original Titan’s performance – there’s only so much you can do without a die shrink – but it continues to be amazing just how much extra performance NVIDIA has been able to wring out without increasing power consumption and with only a minimal increase in die size.

On the broader competitive landscape, this is far from the Radeon R9 290X/290XU’s best title, with GTX Titan X leading by 50-60%. However this is also a showcase title for when AFR goes right, as the R9 295X2 and GTX 980 SLI both shoot well past the GTX Titan X, demonstrating the performance/consistency tradeoff inherent in multi-GPU setups.

Finally, shifting gears for a moment, gamers looking for the ultimate 1440p card will not be disappointed. GTX Titan X will not get to 120fps here (it won’t even come close), but at 78.7fps it’s well suited for driving 1440p144 displays. In fact it’s the only single-GPU card to do better than 60fps at this resolution.

Comments

  • chizow - Wednesday, March 18, 2015 - link

    And custom-cooled, higher clocked cards should? It took months for AMD to bring those to market and many of them cost more than the original reference cards and are also overclocked.

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    Like I said, AMD fanboys made this bed, time to lie in it.
  • Witchunter - Wednesday, March 18, 2015 - link

    I hope you do realize calling out AMD fanboys in each and every one of your comments essentially paints you as Nvidia fanboy in the eyes of other readers. I'm here to read some constructive comments and all I see is you bitching about fanboys and being one yourself.
  • chizow - Wednesday, March 18, 2015 - link

    @Witchunter, the difference is, I'm not afraid to admit I'm a fan of the best, but I'm going to at least be consistent on my views and opinions. Whereas these AMD fanboys are crying foul for the same thing they threw a tantrum over a few years ago, ultimately leading to this policy to begin with. You don't find that ironic, that what they were crying about 4 years ago is suddenly a problem when the shoe is on the other foot? Maybe that tells you something about yourself and where your own biases reside? :)
  • Crunchy005 - Wednesday, March 18, 2015 - link

    @chizow either way you don't really offer constructive criticism and you call people dishonest without proving them wrong in any way and offering facts. You are one of the biggest fanboys out there and it kind of makes you lose credibility.
  • Crunchy005 - Wednesday, March 18, 2015 - link

Ok, wanted to add to this: I do like some of the comments you make, but you are so fanboyish I am unable to take much stock in what you say. AMD has advantages and has outperformed Nvidia in many ways, just as Nvidia has outperformed AMD; they leapfrog. If you could offer more facts and stop just bashing AMD and praising the all-powerful Nvidia as better in every way, we might all like to hear what you have to say.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    I know what the truth is so I greatly enjoy what he says.
    If you can't handle the truth, that should be your problem, not everyone else's, obviously.
  • chizow - Monday, March 23, 2015 - link

    Like I said, I'm not here to sugarcoat things or keep it constructive, I'm here to set the record straight and keep the discussion honest. If that involves bruising some fragile AMD fanboy egos and sensibilities, so be it.

    I'm completely comfortable in my own skin knowing I'm a fan of the best, and that just happens to be Nvidia for graphics cards for the last near-decade since G80, and I'm certainly not afraid to tell you why that's the case backed with my usual facts, references etc. etc. You're free to verify my sources and references if you like to come to your own conclusion, but at the end of the day, that's the whole point of the internet, isn't it? Lay out the facts, let informed people make their own conclusions?

    In any case, read the entire discussion and be the judge of whether my take on the topic is fair; as you can clearly see, AMD fanboys caused this dilemma for themselves, and many of them are the ones you see crying in this thread. Cue that Alanis Morissette song....

    http://anandtech.com/comments/3987/amds-radeon-687...
    http://anandtech.com/show/3988/the-use-of-evgas-ge...
  • Phartindust - Wednesday, March 18, 2015 - link

    Um, AMD doesn't manufacture after market cards.
  • dragonsqrrl - Tuesday, March 17, 2015 - link

    "use less power"

    ...right, and why would these non reference cards consume less power? Just hypothetically speaking, ignoring for a moment all the benchmarks out there that suggest otherwise.
  • squngy - Tuesday, March 17, 2015 - link

    Undervolting?
