Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since matured into a challenging game in its own right and a showcase title for low-level graphics APIs. Since these benchmarks are taken from the single-player campaign, our rule of thumb, based on past experience, is that multiplayer framerates will dip to roughly half of our single-player framerates; this means a card needs to average at least 60fps here if it’s to hold up in multiplayer.
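The rule of thumb above is simple arithmetic, sketched below for clarity. The 2x multiplayer dip and the 60fps single-player cutoff are the article's heuristics, not measured constants, and the helper names are ours:

```python
def estimated_multiplayer_fps(single_player_fps: float) -> float:
    """Rule of thumb: multiplayer dips to roughly half of single-player."""
    return single_player_fps / 2.0

def holds_up_in_multiplayer(single_player_fps: float, floor: float = 30.0) -> bool:
    """A 60fps single-player average keeps multiplayer at ~30fps or better."""
    return estimated_multiplayer_fps(single_player_fps) >= floor

print(holds_up_in_multiplayer(60.0))  # True: 60fps average is right at the cutoff
print(holds_up_in_multiplayer(54.3))  # False: would dip below ~30fps online
```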

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

As we briefly mentioned in our testing notes, our Battlefield 4 testing has been slightly modified as of this review to accommodate the changes in how AMD is supporting Mantle. This benchmark still defaults to Mantle for GCN 1.0 and GCN 1.1 cards (7970, 290X), but we’re using Direct3D for GCN 1.2 cards such as the R9 Fury X. This is due to the lack of Mantle driver optimizations for the newer architecture on AMD’s part; as a result the R9 Fury X performs worse under Mantle than under Direct3D, especially at 2560x1440 (65.2fps vs. 54.3fps).
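To put the renderer gap in perspective, the percentage difference between the two 2560x1440 results quoted above works out as follows. The fps figures come from the text; the helper function is a sketch of ours:

```python
def percent_faster(a_fps: float, b_fps: float) -> float:
    """How much faster (in %) result a is relative to result b."""
    return (a_fps / b_fps - 1.0) * 100.0

# R9 Fury X at 2560x1440: Direct3D (65.2fps) vs. Mantle (54.3fps)
print(round(percent_faster(65.2, 54.3), 1))  # ~20.1% faster under Direct3D
```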

In any case, regardless of the renderer you pick, our first test does not go especially well for AMD and the R9 Fury X. The R9 Fury X does not take the lead at any resolution, and in fact this is one of the worst games for the card. At 4K AMD trails by 8-10%, and at 1440p the gap grows to 16%. In fact the latter result is closer to the GTX 980 than it is to the GTX 980 Ti. Even with the R9 Fury X’s significant performance improvement over the R9 290X, it’s not enough to catch up to NVIDIA here.

Meanwhile the performance improvement over the R9 290X “Uber” stands at between 23% and 32% depending on the resolution. Not only does AMD scale better than NVIDIA at higher resolutions, but the R9 Fury X scales better than the R9 290X as well.

458 Comments

  • Chaser - Friday, July 3, 2015 - link

    Oh yeah that invalidated the entire review. /facepalm
  • Strychn9ne - Saturday, July 4, 2015 - link

    Great review here! It was a good read going through all the technical details of the card I must say. The Fury X is an awesome card for sure. I am trying to wait for next gen to buy a new card as my 280X is holding its own for now, but this thing makes it tempting not to wait. As for the performance, I expect it will perform better with the next driver release. The performance is more than fine even now despite the few losses it had in the benches. I suspect that AMD kind of rushed the driver out for this thing and didn't get enough time to polish it fully. The scaling down to lower resolutions kind of points that way for me anyways.
  • Peichen - Saturday, July 4, 2015 - link

    AMD/ATI, what a fail. Over the past 15 years I have only gone Nvidia twice for the 6600GT and 9800GT, but now I am using a GTX 980. Not a single mid-range/high-end card in AMD/ATI's lineup is correctly priced. Lowering prices by 15-20% to take into account the power usage, poor drivers and fewer features would make them more competitive.
  • just4U - Saturday, July 4, 2015 - link

    At the high end you "may" have a point.. but what is the 960 bringing to the table against the 380? Not much.. not much at all. How about the 970 vs the 390? Again.. not much.. and in crossfire/sli situations the 390 (in theory..) should be one helluva bang for the buck 4k setup.

    There will be a market for the FuryX.. and considering the efforts they put into it I don't believe it's going to get the 15-20% price drop you're hoping for.
  • TheinsanegamerN - Saturday, July 4, 2015 - link

    Slightly better performance while pulling less power and putting out less heat, and in the 970's case, is currently about $10 cheaper. Given that crossfire is less reliable than SLI, why WOULD you buy an AMD card?
  • Oxford Guy - Saturday, July 4, 2015 - link

    Maybe because people want decent performance above 3.5 GB of VRAM? Or they don't appreciate bait and switch, being lied to (ROP count, VRAM speed, nothing about the partitioning in the specs, cache size).
  • medi03 - Sunday, July 5, 2015 - link

    Freesync?
    Built-in water cooling?
    Disgust for nVidia's shitty business practices?
    A brain?
  • chizow - Monday, July 6, 2015 - link

    How do you feel about the business practice of sending out a card with faults that you claimed were fixed?

    Or claims that you had the world's fastest GPU enabled by HBM?

    Or claims/benches that your card was faster than 980Ti?

    Or claims that your card was an Overclocker's Dream when it is anything but that and OCs 10% max?

    A brain right? :)
  • sa365 - Tuesday, July 7, 2015 - link

    How do you feel about the business practice of sending out a card with faulty, cheating drivers that lower IQ despite what you set in-game, so you can win in those benchmarks? It's supposed to be apples to apples, not apples to mandarins.

    How about we wait until unwinder writes the software for voltage unlocks before we test overclocking, those darn fruits again huh?

    Nvidia will cheat their way through anything it seems.

    It's pretty damning when you look at screens side by side, no AF Nvidia.
  • Margalus - Monday, July 6, 2015 - link

    freesync? not as good as gsync and is still not free. It takes similar hardware added to the monitor just like gsync.

    built in water cooling? just something else to go wrong and be more expensive to repair, with the possibility of it ruining other computer components.

    Disgust for NVidia's shitty business practices? what are those? Do you mean like not giving review samples of your cards to honest review sites because they told the truth about their cards so now you are afraid that they will tell the truth about your newest pos? Sounds like you should really hate AMD's shitty business practices.
