Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. Since these benchmarks are taken from single player mode, our rule of thumb, based on our experience, is that multiplayer framerates will dip to roughly half of our single player framerates, which means a card needs to average at least 60fps here if it’s going to hold up in multiplayer.

[Chart: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA]

[Chart: Battlefield 4 - 3840x2160 - Medium Quality]

[Chart: Battlefield 4 - 2560x1440 - Ultra Quality]

When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. However, for the launch of the R9 Fury, things are much more in AMD’s favor. The two R9 Fury cards hold a lead just shy of 10% over the GTX 980, roughly in line with the difference in their price tags. Because of that price difference, AMD needs to win more or less every game by 10% to justify the R9 Fury’s higher price, and we’re starting things off exactly where AMD needs to be for price/performance parity.

Looking at the absolute numbers, we’re going to see AMD promote the R9 Fury as a 4K card, but even with Battlefield 4 I feel this is a good example of why it’s better suited for high quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. At 2560x1440, on the other hand, it averages just over 60fps, which puts it in great shape as a 1440p card.

As for the R9 Fury X comparison, it’s interesting how close the R9 Fury gets; the cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but results like these raise the question of whether it’s worth the extra $100.

Comments

  • CiccioB - Monday, July 13, 2015 - link

    For a GPU that was expected to beat the Titan X hands down, just being faster than the 980 is quite a fail.
    Also due to the high cost technology involved in producing it.
    Be happy for that, and just wait for DX12 to have some hope of gaining a few FPS with respect to the competitor.
    I just think DX12 is not going to change anything (whatever these cards gain will be the same for nvidia cards) and a few FPS more or less is not what we expected from this top tier class (expensive) GPU.
    Despite the great steps ahead made by AMD in power consumption, it is still a fail.
    Large, expensive, still consuming more, and scaling badly.
    Hope that with the new 16nm FinFET process things will change radically, or we will witness another 2 years of dominance by nvidia with high prices.
  • superjim - Monday, July 13, 2015 - link

    Used 290's are going for sub-$200 (new for $250). Crossfire those and you get better performance for much less.
  • P39Airacobra - Tuesday, July 14, 2015 - link

    Ok, compared to the Fury X, the regular R9 Fury makes a bit more sense than the X model. It is priced better (but still priced a bit too high), and it has almost even performance with the X model. However the power consumption is still insane and unreasonable by today's standards! And the temps are way too high for a triple fan card! At 70C with three fans running, I doubt there is any room at all for overclocking! I do respect this card's performance! But it is just not worth it for the price you have to pay for a hefty PSU and the very loud and expensive cooling setup you will have to put inside your case! To be honest: if I was stuck with an old GTX 660 Ti and someone offered me an R9 Fury in an even trade, I would not do it!
  • ES_Revenge - Tuesday, July 14, 2015 - link

    The power consumption is not insane or unreasonable for "today's standards". Only the GTX 960, 970, 980, Titan X are better. So it's unreasonable for Nvidia's new standard but it's actually an improvement over Hawaii, etc. of the past.

    Compared to current Nvidia offerings, it's bad, yeah, but we can't really establish standards on their cards alone. The R9 390/X, 380, etc. are still power hungry for their performance and they are still "today's" cards, like it or not.

    Don't get me wrong I agree they really need to start focusing on power/heat reduction, but we're not going to see that from AMD until their next gen cards (if they make it that far, lol).
  • Gunbuster - Wednesday, July 15, 2015 - link

    AMD thread with no Chizow comments? My world is falling apart :P
  • Oxford Guy - Wednesday, July 15, 2015 - link

    I'm sure this person has more than one alias.
  • FlushedBubblyJock - Thursday, July 16, 2015 - link

    We'd know him by his words, his many lengthy words with links and facts up the wazoo, and he is so proud he would not hide with another name, like a lousy, incorrect, uninformed, amd fanboy failure.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    Just think about placing your bare hand on 3 plugged-in 100 watt light bulbs... that's AMD's housefire for you!

    My god you could cook a steak on the thing.

    3x 100 watt light bulbs frying everything in your computer case... awesome job amd.
  • Oxford Guy - Wednesday, July 15, 2015 - link

    Because the GTX 480 was quieter, had better performance per watt, and was a fully-enabled chip.
  • FlushedBubblyJock - Thursday, July 16, 2015 - link

    So the 480 being hot makes this heated furnace ok?
    What exactly is the logic there?
    Are you a problematic fanboy for amd?
