Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE's 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single player mode, based on our experience our rule of thumb here is that multiplayer framerates will dip to roughly half of our single player framerates, which means a card needs to average at least 60fps in single player if it's to hold up in multiplayer.
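
To put the rule of thumb in concrete terms, here is a minimal sketch in Python (purely illustrative; the function name and the implied ~30fps multiplayer floor are my own shorthand, while the halving factor and the 60fps bar come from the rule above):

```python
# Rule of thumb from the text above: multiplayer framerates can dip to
# roughly half of the single player framerates measured here, so a card
# needs a ~60fps single player average to hold up in multiplayer.
MULTIPLAYER_DIP_FACTOR = 0.5
SINGLE_PLAYER_BAR_FPS = 60  # the threshold named above
PLAYABLE_FLOOR_FPS = SINGLE_PLAYER_BAR_FPS * MULTIPLAYER_DIP_FACTOR  # ~30fps

def holds_up_in_multiplayer(single_player_avg_fps: float) -> bool:
    """Project a multiplayer framerate from a single player average and
    check it against the implied playability floor."""
    projected = single_player_avg_fps * MULTIPLAYER_DIP_FACTOR
    return projected >= PLAYABLE_FLOOR_FPS

print(holds_up_in_multiplayer(62.0))  # True: projects to ~31fps
print(holds_up_in_multiplayer(50.0))  # False: projects to ~25fps
```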

[Benchmark charts: Battlefield 4 at 3840x2160 Ultra Quality (0x MSAA), 3840x2160 Medium Quality, and 2560x1440 Ultra Quality]

When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. However, for the launch of the R9 Fury, things are much more in AMD's favor. The two R9 Fury cards hold a lead just shy of 10% over the GTX 980, roughly in line with the difference in their price tags. Because of that difference, AMD needs to win more or less every game by 10% to justify the R9 Fury's higher price, and we're starting things off exactly where AMD needs to be for price/performance parity.

Looking at the absolute numbers, we're going to see AMD promote the R9 Fury as a 4K card, but even with Battlefield 4 I feel this is a good example of why it's better suited for high quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. At 1440p, on the other hand, it averages just over 60fps, which leaves it in great shape.

As for the R9 Fury X comparison, it's interesting how close the R9 Fury gets; the cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but scenarios such as this one call into question whether it's worth the extra $100.
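
To make the price/performance math of the last two paragraphs concrete, here is a minimal sketch in Python. The ~10% Fury-over-980 lead and the "never more than 7% behind" Fury X gap come from the results above; the card prices ($499/$549/$649) are launch-era MSRPs as best I recall, and should be treated as assumptions rather than figures from this review:

```python
# Hypothetical perf-per-dollar comparison. Performance deltas are from the
# text; prices are assumed launch-era MSRPs, not figures from the article.
cards = {
    "GTX 980":   {"price": 499, "rel_perf": 1.00},         # baseline
    "R9 Fury":   {"price": 549, "rel_perf": 1.10},         # ~10% ahead of the 980
    "R9 Fury X": {"price": 649, "rel_perf": 1.10 / 0.93},  # Fury trails it by ~7%
}

for name, card in cards.items():
    print(f"{name}: {card['rel_perf'] / card['price'] * 1000:.2f} perf per $1000")
```

With those assumed prices, the R9 Fury and GTX 980 land at essentially identical performance per dollar, while the Fury X's extra $100 buys its modest edge at a visibly worse ratio, which is exactly the question raised above.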

288 Comments

  • Oxford Guy - Saturday, July 11, 2015 - link

    2% cost difference is likely to be erased by sale pricing at various times.
  • darkfalz - Saturday, July 11, 2015 - link

    My 980 is about 15% over stock, and it's a poor overclocker despite running cool. These cards struggle to hit 10%. I also can't go back 6 months and buy an R9 Fury. And NVIDIA's next release is likely around the corner. I think they're approximately equal value - which is good for AMD fans, but it's been a long wait for them to get a card comparable to what NVIDIA enthusiasts have been enjoying for a year!
  • Flunk - Friday, July 10, 2015 - link

    It's nice to see AMD win a segment. I'm not sure that the Fury X matters that much in the grand scheme of things, seeing that it's the same price as the better-performing GeForce 980 Ti.

    The Fury seems to overclock to almost match the Fury X, making it a good enthusiast buy.
  • cmikeh2 - Friday, July 10, 2015 - link

    If you're willing to overclock though, you can get a good 15+ percent out of the 980 and pretty much bring it even with an OCed Fury for a little less money.
  • looncraz - Friday, July 10, 2015 - link

    But as soon as voltage control is unlocked, the Fury will probably eke out at least another 100MHz or more, which will put it healthily out of reach of the 980. And once a few more driver issues (such as GTA V performance) are resolved, the Fury's performance will improve even more.

    HBM has a different performance profile, and AMD is still accommodating that. And, of course, if you turn the nVidia image quality up to AMD levels, nVidia loses a few extra percent of performance.

    The GTX 980 vs R9 Fury question is easy to answer (until a 980 price drop). The Fury X vs 980 Ti question is slightly more difficult (but the answer tends to go the other way, the AIO cooler being the Fury X's main draw).
  • D. Lister - Saturday, July 11, 2015 - link

    "if you turn the nVidia image quality up to AMD levels, nVidia loses a few extra percent of performance."

    Surely we have some proof to go along with that allegation... ?
  • silverblue - Saturday, July 11, 2015 - link

    I've heard the same thing, although I believe it was concerning the lack of anisotropic filtering on the NVIDIA side. However, anisotropic filtering is very cheap nowadays as far as I'm aware, so it's not really going to shake things up much whether it's on OR off, though image quality does improve noticeably.
  • D. Lister - Saturday, July 11, 2015 - link

    Err...

    http://international.download.nvidia.com/webassets...

    You mean to say that it doesn't work like it is supposed to?
  • silverblue - Monday, July 13, 2015 - link

    I'm not sure what you're getting at. In any case, I was trying to debunk the myth that turning off AF makes a real difference to performance.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    no, there's no proof, the proof of course is inside the raging gourd of the amd fanboy, never to be unlocked by merely sane mortal beings.
