Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. Because these benchmarks are from single player mode, our rule of thumb, based on experience, is that multiplayer framerates will dip to roughly half of our single player framerates. That means a card needs to average at least 60fps here if it’s to hold up in multiplayer.

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. For the launch of the R9 Fury, however, things are much more in AMD’s favor. The two R9 Fury cards hold a lead just shy of 10% over the GTX 980, roughly in line with the difference in their price tags. Because of that difference, AMD needs to win more or less every game by 10% to justify the R9 Fury’s higher price, and we’re starting things off exactly where AMD needs to be for price/performance parity.

Looking at the absolute numbers, we’re going to see AMD promote the R9 Fury as a 4K card, but even Battlefield 4 is a good example of why the card is better suited for high quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. At 1440p, on the other hand, it averages just over 60fps and is in great shape.

As for the R9 Fury X comparison, it’s interesting how close the R9 Fury gets. The cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but scenarios such as this one raise the question of whether it’s worth the extra $100.


288 Comments


  • Oxford Guy - Thursday, July 16, 2015 - link

    "What exactly is the logic there?"

    I really need to spell it out for you?

    The logic is that the 480 was a successful product despite having horrid performance per watt and a very inefficient (both in terms of noise and temps) cooler. It didn't get nearly the gnashing of teeth the recent AMD cards are getting and people routinely bragged about running more than one of them in SLI.
  • CiccioB - Thursday, July 16, 2015 - link

    No, it was not a successful product at all, even though it was still the fastest card on the market.
    The successful card was the 460, launched a few months later, and certainly the 570/580 cards, which brought the corrections to the original GF100 that nvidia itself admitted was bugged.
    Here, instead, we have a card which uses a lot of power, is not at the top of the charts, and has no fix on the horizon.
    The difference is that with GF100 nvidia messed up the implementation of the architecture, which was then fixed. Here we are seeing the most advanced implementation of a genuinely mediocre architecture, one that for 3 years has struggled to keep pace with the competition. And the competition has in the end decided to go with 1024 shaders + a 128-bit wide bus in 220mm^2 of die space against 1792 shaders + a 256-bit wide bus in 356mm^2 of die space, instead of trying to keep fighting the longest-fps-bar war.
    AMD, please, review your architecture completely, or we are doomed with the next PP.
  • Oxford Guy - Tuesday, July 21, 2015 - link

    "No, it was not a successful product at all"

    It was successful. Enthusiasts bought them in significant numbers and review sites showed off their two and three card rigs. The only site that even showed their miserable performance per watt was TechPowerUp.
  • Count Vladimir - Thursday, July 16, 2015 - link

    So we are discussing 6 year old products now? Is that your version of logic? Yes, it was hot, yes, it was buggy but it was still the fastest video card in its era, that's why people bragged about SLI'ing it. Fury X isn't.
  • Oxford Guy - Tuesday, July 21, 2015 - link

    "So we are discussing 6 year old products now?" strawman
  • celebrevida - Thursday, July 16, 2015 - link

    Looks like Jason Evangelho of PCWorld has settled the matter. In his article:
    http://www.pcworld.com/article/2947547/components-...

    He shows that R9 Fury x2 is on par with GTX 980 Ti x2 and blows away GTX 980 x2. Considering that R9 Fury x2 is much cheaper than GTX 980 Ti x2, and that R9 Fury is also optimized for the upcoming DX12, it looks like R9 Fury is the clear winner in cost/performance.
  • xplane - Saturday, October 17, 2015 - link

    So with this GPU I could use 5 monitors simultaneously? Right?
  • kakapoopoo - Wednesday, January 4, 2017 - link

    i got the Sapphire version up to 1150 stable using MSI Afterburner w/o changing anything else
