Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single player mode, our rule of thumb, based on experience, is that multiplayer framerates will dip to roughly half of our single player framerates. That means a card needs to average at least 60fps in single player if it’s to hold up in multiplayer.
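The rule of thumb above can be sketched as a couple of lines of Python (the halving factor and the 60fps cutoff come from the article; the function names and the 65.2fps sample value are our own illustration):

```python
def estimated_multiplayer_fps(single_player_avg_fps: float) -> float:
    """Rule of thumb: multiplayer framerates dip to roughly
    half of the single player average."""
    return single_player_avg_fps / 2

def holds_up_in_multiplayer(single_player_avg_fps: float) -> bool:
    """A card should average at least 60fps in single player so
    that its multiplayer dips (~30fps) remain playable."""
    return single_player_avg_fps >= 60

print(estimated_multiplayer_fps(65.2))  # 32.6
print(holds_up_in_multiplayer(65.2))    # True
```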

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

As we briefly mentioned in our testing notes, our Battlefield 4 testing has been slightly modified as of this review to accommodate the changes in how AMD is supporting Mantle. This benchmark still defaults to Mantle for GCN 1.0 and GCN 1.1 cards (7970, 290X), but we’re using Direct3D for GCN 1.2 cards like the R9 Fury X. This is due to the lack of Mantle driver optimizations for the newer GPUs on AMD’s part; as a result the R9 Fury X sees better performance under Direct3D, especially at 2560x1440 (65.2fps vs. 54.3fps).

In any case, regardless of the renderer, our first test does not go especially well for AMD and the R9 Fury X. The card does not take the lead at any resolution, and in fact this is one of the worst games for it. At 4K AMD trails by 8-10%, and at 1440p the deficit grows to 16%; at the latter resolution the R9 Fury X is closer to the GTX 980 than it is to the GTX 980 Ti. Even with its significant performance improvement over the R9 290X, it’s not enough to catch up to NVIDIA here.

Meanwhile the performance improvement over the R9 290X “Uber” stands at between 23% and 32%, depending on the resolution. Not only does AMD scale better than NVIDIA at higher resolutions, but the R9 Fury X is scaling better than the R9 290X as well.
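The percentage figures quoted in this section are simple relative differences between average framerates. As an illustration, here is how such a figure is computed, using the 65.2fps vs. 54.3fps pair from the Mantle note above (the helper function is our own naming, not part of our benchmark tooling):

```python
def pct_improvement(new_fps: float, old_fps: float) -> float:
    """Relative improvement of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1) * 100

# 65.2fps under Direct3D vs. 54.3fps under Mantle at 2560x1440:
print(round(pct_improvement(65.2, 54.3), 1))  # 20.1
```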

Comments

  • TallestJon96 - Saturday, July 4, 2015 - link

    This card is not the disappointment people make it out to be. One month ago this card would have been a MASSIVE success. What is strange to me is that they didn't reduce the price, even slightly, to compete with the new 980 Ti. I suspect it was to avoid a price war, but I would say at $600 this card is attractive, while at $650 you only really want it for the water cooling. I suspect the price will drop more quickly than the 980 Ti's.
  • mccoy3 - Saturday, July 4, 2015 - link

    So it is as expensive as the 980 Ti while delivering less performance and requiring watercooling. Once Nvidia settles for a TITAN Y including HBM, it's all over for the red guys.
  • just4U - Saturday, July 4, 2015 - link

    Well that would be great news for AMD though wouldn't it since Nvidia would have to pay for the use of HBM in some form or another..
  • Oxford Guy - Saturday, July 4, 2015 - link

    AMD could have released a hot leaf blower like the GTX 480 and chose not to.
  • chizow - Monday, July 6, 2015 - link

    No, they couldn't have. Fury X is already a 275W card, and that's with the benefit of low temp leakage from water cooling *AND* the benefit of a self-professed 15-20W TDP savings from HBM. That means in order for Fury X to still fall 10% short of the 980 Ti, it is already using 25+20W, so 45W more power.

    Their CUSTOM cooled 7/8th cut Fury is going to be 275W typical board power as well, and it's cut down, so yeah, the difference in functional unit power is most likely going to be the same as the difference in thermal leakage due to operating temperatures between water and custom air cooling. A hot leaf blower, especially one as poor as AMD's reference cooler, would only be able to cool a 6/8 cut Fiji or lower, but at that point you might as well get a Hawaii based card.
  • Oxford Guy - Thursday, July 9, 2015 - link

    Your posts don't even try to sound sane. I wrote about the GTX 480, which was designed to run hot and loud. Nvidia also couldn't release a fully-enabled chip.

    Ignore the point about the low-grade cooler on the 480 which ran hot and was very loud.

    Ignore the point about the card being set to run hot, which hurt performance per watt (see this article if you don't get it).

    How much is Nvidia paying you to astroturf? Whatever it is, it's too much.
  • Margalus - Monday, July 6, 2015 - link

    this AMD card pumps out more heat than any NVidia card. Just because it runs a tad cooler with water cooling doesn't mean the heat is not there. It's just removed faster with water cooling, but the heat is still generated and the card will blow out a lot more hot air into the room than any NVidia card.
  • Oxford Guy - Friday, July 10, 2015 - link

    If you can't afford AC then stick with something like a 750 Ti. Otherwise the extra heat is hardly a big deal.
  • zodiacfml - Saturday, July 4, 2015 - link

    My excitement with HBM has subsided as I realized that this is too costly to be implemented in AMD's APUs even next year. Yet, I hope they do as soon as possible even if it would mean HBM on a narrower bus.
  • jburns - Saturday, July 4, 2015 - link

    Probably the best graphics card review I've ever read! Detailed and balanced... Thanks Ryan for an excellent review.
