Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. Since these benchmarks are from single player mode, our rule of thumb, based on our experiences, is that multiplayer framerates will dip to roughly half of our single player framerates, which means a card needs to average at least 60fps if it’s going to hold up in multiplayer.
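
As a quick illustration of that rule of thumb, here’s a minimal sketch in Python. The ~30fps multiplayer floor is an assumption implied by the halving rule, not a figure stated in this review:

```python
# Minimal sketch of the multiplayer rule of thumb described above.
# Assumption (not stated in the review): ~30fps is the lowest average
# framerate we'd consider playable in multiplayer.

def estimated_multiplayer_fps(single_player_fps: float) -> float:
    """Multiplayer framerates tend to dip to roughly half of single player."""
    return single_player_fps / 2.0

def holds_up_in_multiplayer(single_player_fps: float, floor_fps: float = 30.0) -> bool:
    return estimated_multiplayer_fps(single_player_fps) >= floor_fps

print(holds_up_in_multiplayer(60.0))  # True  -> 60fps average is the cutoff
print(holds_up_in_multiplayer(50.0))  # False -> dips to ~25fps in multiplayer
```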

[Benchmark charts: Battlefield 4 at 3840x2160 (Ultra Quality, 0x MSAA), 3840x2160 (Medium Quality), and 2560x1440 (Ultra Quality)]

When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. For the launch of the R9 Fury, however, things are much more in AMD’s favor. The two R9 Fury cards have a lead just shy of 10% over the GTX 980, roughly in line with the difference in their price tags. Because of that difference, AMD needs to win more or less every game by 10% to justify the R9 Fury’s higher price, and we’re starting things off exactly where AMD needs to be for price/performance parity.
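
To make that price/performance math concrete, here’s a quick sketch. The card prices are illustrative assumptions (roughly $549 for the R9 Fury and $499 for the GTX 980 at the time of this review), not figures quoted above; the ~10% performance lead comes from the Battlefield 4 results:

```python
# Quick price/performance comparison for the R9 Fury vs. GTX 980.
# Prices are assumptions for illustration; the performance lead is the
# "just shy of 10%" figure from the Battlefield 4 numbers above.

fury_price, gtx980_price = 549.0, 499.0
performance_lead = 0.10                      # R9 Fury over GTX 980

price_premium = fury_price / gtx980_price - 1.0
relative_perf_per_dollar = (1.0 + performance_lead) / (fury_price / gtx980_price)

print(f"Price premium:             {price_premium:.1%}")              # ~10.0%
print(f"Perf-per-dollar, Fury/980: {relative_perf_per_dollar:.2f}")   # ~1.00 (parity)
```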

Looking at the absolute numbers, AMD is going to promote the R9 Fury as a 4K card, but Battlefield 4 is a good example of why I feel it’s better suited for high quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. Otherwise, at just over 60fps at 1440p, it’s in great shape as a 1440p card.

As for the R9 Fury X comparison, it’s interesting how close the R9 Fury gets; the cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but scenarios such as this call into question whether it’s worth the extra $100.

288 Comments

  • siliconwars - Saturday, July 11, 2015 - link

    Any concept of performance per dollar?
  • D. Lister - Saturday, July 11, 2015 - link

    The Fury is 8% faster than a stock 980 and 10% more expensive. How does that "performance per dollar" thing work again? :p
  • Nagorak - Sunday, July 12, 2015 - link

    By that token the 980 is not good performance per dollar either. It's something like a 390 non-X topping the charts. These high end cards are always a rip off.
  • D. Lister - Tuesday, July 14, 2015 - link

    "These high end cards are always a rip off."

    That is, unfortunately, a fact. :(
  • siliconwars - Saturday, July 11, 2015 - link

    The Asus Strix is 9.4% faster than the 980 with 20% worse power consumption. I wouldn't call that "nowhere near" Maxwell tbh, and the Nano will be even closer, if not ahead.
  • Dazmillion - Saturday, July 11, 2015 - link

    Nobody is talking about the fact that the Fury cards, which AMD claims are for 4K gaming, don't have a 4K@60Hz port!!
  • David_K - Saturday, July 11, 2015 - link

    So the DisplayPort 1.2 connector isn't capable of sending 2160p60Hz. That's new.
  • Dazmillion - Saturday, July 11, 2015 - link

    The Fury cards don't come with HDMI 2.0.
  • ES_Revenge - Sunday, July 12, 2015 - link

    That's true, but HDMI isn't the only way to get that resolution & refresh rate. The lack of HDMI 2.0 and full HEVC features is certainly another sore point for Fury. For the most part HDMI 2.0 affects the consumer AV/HT world though, not so much the PC world. In the PC world, gaming monitors capable of those res/refresh rates are going to have DP on them, which makes HDMI 2.0 extraneous.
  • mdriftmeyer - Sunday, July 12, 2015 - link

    I'll second ES_Revenge on DP for PC gaming. Going without HDMI 2.0 for 4K home monitors is something we'll live with until the next major revision.

    I don't even own a 4K home monitor. They're not very popular in sales either.

    Every single one of them showing up on Amazon is handicapped with that SMART TV crap.

    I want a 4K dumb device that is just the output monitor, with FreeSync and nothing else.

    I'll use the AppleTV for the 'smart' part.
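
To put rough numbers on the DisplayPort 1.2 versus HDMI 4K@60Hz discussion in the comments above, here’s a back-of-the-envelope bandwidth check. The blanking overhead (CVT-R2 reduced blanking) and 8 bits-per-channel color are assumptions, and the link overheads use the 8b/10b-style encoding these links employ, so treat this as a sketch rather than a spec citation:

```python
# Rough bandwidth check for 3840x2160 @ 60Hz over common display links.
# Assumptions: ~10% blanking overhead (CVT-R2 reduced blanking), 8bpc RGB,
# and 8b/10b-style encoding overhead (~80% efficiency) on each link.

h_active, v_active, refresh = 3840, 2160, 60
blanking_overhead = 1.10                 # ~10% extra pixels for blanking
bits_per_pixel = 24                      # 8 bits per color channel, RGB

required_gbps = h_active * v_active * refresh * blanking_overhead * bits_per_pixel / 1e9

links = {
    "DisplayPort 1.2 (HBR2, 4 lanes)": 4 * 5.4 * 0.8,   # 21.6 Gbps raw -> ~17.28 effective
    "HDMI 1.4 (340 MHz TMDS)":         10.2 * 0.8,       # 10.2 Gbps raw -> ~8.16 effective
    "HDMI 2.0 (600 MHz TMDS)":         18.0 * 0.8,       # 18.0 Gbps raw -> ~14.4 effective
}

print(f"Required for 2160p60: ~{required_gbps:.1f} Gbps")
for name, effective_gbps in links.items():
    verdict = "OK" if effective_gbps >= required_gbps else "insufficient"
    print(f"{name}: {effective_gbps:.2f} Gbps effective -> {verdict}")
```

Under these assumptions, DisplayPort 1.2 and HDMI 2.0 both clear the roughly 13 Gbps needed, while HDMI 1.4 (which is what the Fury cards carry) does not, so the complaint mainly bites on 4K TVs that only offer HDMI inputs.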
