Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single-player mode, our rule of thumb, based on our experiences, is that multiplayer framerates will dip to roughly half our single-player framerates, which means a card needs to average at least 60fps if it’s to hold up in multiplayer.

[Benchmark charts: Battlefield 4 at 3840x2160 Ultra Quality (0x MSAA), 3840x2160 Medium Quality, and 2560x1440 Ultra Quality]

When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. For the launch of the R9 Fury, however, things are much more in AMD’s favor. The two R9 Fury cards hold a lead just shy of 10% over the GTX 980, roughly in line with their price tag difference. Because of that difference, AMD needs to win more or less every game by 10% to justify the R9 Fury’s higher price, and we’re starting things off exactly where AMD needs to be for price/performance parity.
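The parity argument above can be sketched as a quick calculation. This is only an illustration: the MSRPs ($549 for the R9 Fury, $499 for the GTX 980) reflect the launch pricing discussed in this review, and the framerates are placeholder values standing in for the "just shy of 10%" lead, not measured results.

```python
# Hedged sketch of the price/performance parity check described above.
# Prices are the launch MSRPs referenced in the review; the framerates
# below are illustrative placeholders, not benchmark data.

def premium(new: float, base: float) -> float:
    """Relative premium of `new` over `base`, as a fraction (0.10 = 10%)."""
    return new / base - 1

fury_price, gtx980_price = 549.0, 499.0
price_premium = premium(fury_price, gtx980_price)   # ~0.10, i.e. ~10%

# Placeholder framerates: the Fury leading the GTX 980 by just shy of 10%.
fury_fps, gtx980_fps = 65.0, 59.5
perf_premium = premium(fury_fps, gtx980_fps)        # ~0.092, i.e. ~9.2%

# Parity requires the performance lead to at least match the price premium.
at_parity = perf_premium >= price_premium
print(f"price premium: {price_premium:.1%}, "
      f"perf premium: {perf_premium:.1%}, parity: {at_parity}")
```

With these placeholder numbers the card lands just below parity, which matches the review's framing: a ~10% performance lead is the break-even point, and anything less means the GTX 980 wins on price/performance.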

Looking at the absolute numbers, we’re going to see AMD promote the R9 Fury as a 4K card, but Battlefield 4 is a good example of why I feel it’s better suited for high-quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. At 1440p, on the other hand, it averages just over 60fps and is in great shape.

As for the R9 Fury X comparison, it’s interesting how close the R9 Fury gets. The cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but scenarios such as this one call into question whether it’s worth the extra $100.


288 Comments


  • Goty - Friday, July 10, 2015 - link

    Can you imagine the hassle upgrades would be with having to deal with two sockets instead of one?
  • Oxford Guy - Saturday, July 11, 2015 - link

Not if the GPU socket standard were universal and backward compatible like PCI-E. It would only be an issue if companies got to make incompatible, proprietary sockets.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    Yeah, let's put an additional 300 watts inside a socket laying flat on the motherboard - we can have a huge tube to flow the melting heat outside the case...

    Yep, that gigantic 8.9B trans core die, slap some pins on it... amd STILL loves pinned sockets...

    Yeah, time to move to the motherboard ... ROFLMAO

    I just can't believe it ... the smartest people in the world.
  • ant6n - Saturday, July 11, 2015 - link

I'm definitely interested to see how well these cards would do in a rotated-ATX Silverstone case. I have one of those, and I'm concerned about the alignment of the fins. You basically want the heat to be able to move up vertically, out the back/top of the card.
  • ajlueke - Friday, July 10, 2015 - link

Priced in between the GTX 980 and the Fury X, it is substantially faster than the former and hardly any slower than the latter. Price/performance-wise, this card is a fantastic option if it can be found around the MSRP, or found at all.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

NO, actually if you read, ha ha, and paid attention, lol, 10% more price for only 8% more performance... so its ratio sucks compared to the NVIDIA GTX 980.

    Not a good deal, not good price perf compared to NVIDIA.

    Thanks for another amd fanboy blowout
  • Nagorak - Friday, July 10, 2015 - link

One interesting thing from this review is looking at the performance of the older AMD cards. Ryan Smith mentioned the improvement of the Fury versus the older cards in the review, noting that performance hasn't improved that much. But there's a lot more to it than that. The relative performance of AMD's cards seems to have moved up a lot compared to their Nvidia competitors.

    Look at how the 290X stacks up against the GTX 780 in this review. It pretty much just blows it away. The 290X is performing close to the GTX 980 (which explains why the 390X which has faster memory is competitive with it). Meanwhile, the HD 7970 is now stacking up against the GTX 780.

    Now, look back at the performance at the time the 290X was released: http://www.anandtech.com/show/7457/the-radeon-r9-2...

    It looks like performance on AMD's GCN chips has increased significantly. Meanwhile the GTX 780's performance has at best stayed the same, but actually looks to have decreased.

    Anandtech should really do a review of how performance has changed over time on these cards, because it seems the change has been pretty significant.
  • Nagorak - Friday, July 10, 2015 - link

I don't know, maybe it's just different benchmark settings, but the AMD cards look to be a bit more competitive with their counterparts than they were at release.
  • Stuka87 - Friday, July 10, 2015 - link

It's been the case with all GCN cards. AMD continues to make driver optimizations. The 7970 is significantly faster now than it was at launch. It's one advantage of them all sharing a similar architecture.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    nvidia CARDS GAIN 10-20% AND MORE over their release drivers... but that all comes faster, on game release days, and without massive breaking of prior fixes, UNLIKE AMD, who takes forever and breaks half of what it prior bandaided, and it takes a year or two or three or even EOL for that fix to come.
