Battlefield 4

The latest addition to our benchmark suite, and our current major multiplayer action game, is Battlefield 4, DICE's 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has finally reached a point where it's stable enough for benchmark use, giving us the ability to profile one of the most popular and strenuous shooters out there. As these benchmarks are from single player mode, our rule of thumb based on experience is that multiplayer framerates will dip to roughly half our single player framerates, which means a card needs to average at least 60fps here if it's to hold up in multiplayer.

[Benchmark charts: Battlefield 4 - 1920x1080 at High, Medium, and Low Quality]

Our first review with Battlefield 4 finds AMD firmly in the driver's seat, easily surpassing NVIDIA's closest competitors while often putting more expensive NVIDIA cards in a bind. To this end, both the R7 265 and R7 260 are best suited for 1080p at medium quality, as the R7 265's additional performance falls a bit short of making high quality playable.

Ultimately we have the R7 260 beating the GTX 650 Ti by 19% and the R7 265 even beating the GTX 660, in the latter case by 12%. Unfortunately for AMD, Mantle is a bust here. In this GPU bound test Mantle is just as likely to cause a minor performance regression as it is to cause a minor performance improvement. Mantle’s strength is in CPU bound scenarios so this isn’t a big surprise, but it does reiterate the fact that Direct3D isn’t dead on AMD cards when you have the CPU power to keep it happy.


52 Comments


  • just4U - Thursday, February 13, 2014 - link

    While you may be right... AMD/ATI does like throwing popular configurations into the mix. The 265 reminds me a lot of the 4830, and while that card was fairly short-lived, it was a hot seller for them, as it straddled two performance areas but came in at a nicer price point.
  • jabber - Friday, February 14, 2014 - link

    Indeed, I swapped from being a longtime Nvidia user to AMD back in 2009, as I got fed up with Nvidia regurgitating the old 8800 chips three times in a row for the mid level.

    Stuff doesn't have to change radically performance-wise, but it's nice to know new features are added and other things get revised and tweaked. A simple name change isn't enough, really.
  • MrSpadge - Thursday, February 13, 2014 - link

    I'm actually happy they're finally making use of that last digit in their 3-number scheme. From my point of view they could have ditched the X altogether and made the R9-270X an R9-275 (or whatever is appropriate). And speaking of R9: they could have given the R7 265 the rating R9 265 to more closely connect it with the R9 270. Or just drop that prefix as well, if the numbers don't overlap anyway and the R9/7/3 isn't related to features either!

    Speaking about the cards:
    - boost clocks an additional 25 MHz again? I have no idea why these are there. Make it 100+ MHz or leave it out.
    - 1.175 V for a mere 925 MHz? The chip should be able to do 1.0 GHz at ~1.0 V, maybe 1.10 V for guaranteed clocks
    - same for the R7 260 - that voltage is ridiculously high

    Anyway, the cards themselves are fine (just like the 7000 series) and the coolers really fit them.
  • silverblue - Thursday, February 13, 2014 - link

    The single GPU frame latency issue has been fixed for more than six months. I doubt it's going to become a problem again like with AMD's handling of 2D a while back.

    There are remarks concerning the availability of the R9 270 series and the inability of these parts to keep to their RRP, neither of which would be present if this were some sort of fanboy review.
  • Spuke - Thursday, February 13, 2014 - link

    Has it been six months? I thought they only recently fixed that problem.
  • silverblue - Thursday, February 13, 2014 - link

    It was fixed in Cat 13.8 Beta 1, dated 1st August.
  • silverblue - Thursday, February 13, 2014 - link

    My bad - that's when CrossFire had its first fix. Apparently, single-GPU was fixed beforehand, though I can't find which driver version it was.
  • Solid State Brain - Thursday, February 13, 2014 - link

    Anandtech: it would be interesting if you tested idle power consumption in multi-monitor scenarios. I think you'd find some surprises.
  • creed3020 - Thursday, February 13, 2014 - link

    Excellent point!

    I had a friend with a 6950 and he was furious that his video card would never idle down its GPU/memory frequencies when he had a second monitor connected.

    I personally have a 6850 and two 20" LCDs connected over DVI. I have not looked for the same behaviour but would not be surprised if it were the same.

    Power efficiencies are out the window once the user chooses to go multi-monitor to be more productive.
  • Solid State Brain - Thursday, February 13, 2014 - link

    I have the same issue with my HD 7770, to a lesser extent, and my workaround is connecting my two secondary displays to the integrated Intel GPU. This saves a significant amount of power.
