Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are taken from single player mode, our rule of thumb, based on past experience, is that multiplayer framerates will dip to roughly half of our single player framerates. This means a card needs to average at least 60fps here if it’s to hold up in multiplayer.

Battlefield 4 - 2560x1440 - Ultra Quality

Battlefield 4 - 1920x1080 - Ultra Quality

Though AMD’s products aren’t doing poorly here, Battlefield 4 has not been a game they have excelled at lately. Case in point: at 1080p even the reference-clocked R9 380X can’t unseat the GeForce GTX 960; it takes the ASUS factory overclock to do that. And while the 380X is on average 10% faster than the GTX 960 overall, as we work through our games we’ll see that it does not take the top spot in every single title, so this will not be a clean sweep.

Meanwhile Battlefield 4 is a good example of why AMD wishes to focus on 1440p, despite the fact that Tonga comes up a bit short on overall performance. As we’ve seen time and time again, AMD’s performance hit from resolution increases is smaller than NVIDIA’s, so a loss for the R9 380X at 1080p becomes a win at 1440p. There are a few cases where the R9 380X is fast enough for 1440p, but by and large you’d have to take a quality hit to reach the necessary performance. So unfortunately for AMD, the bulk of our focus on the R9 380X is going to be at 1080p.

As for comparisons with past cards, we’ve gone ahead and thrown in the Radeon HD 7850 and the GeForce GTX 660, 2GB cards that launched at $249 and $229 respectively in 2012. Part of AMD’s marketing focus for the R9 380X will be positioning it as an upgrade for early 28nm cards, where the R9 380X is a significant step up. Between the greater shader/ROP throughput, greater memory bandwidth, and doubled memory, the R9 380X is around 82% faster than the 7850, which is right around the level of improvement at which a lot of gamers traditionally look to upgrade.

Finally, at the other end of the spectrum, it’s worth pointing out just how far ahead of the R9 380X the R9 390 and GTX 970 are. In the introduction we called them spoilers, and this is exactly why. They cost more, but the performance advantage of the next step up is quite significant.


  • SpartyOn - Monday, November 23, 2015 - link

    My 770 is at 1400 MHz core / 7940 MHz memory; trust me, neither the GTX 960 nor this 380X is beating me, and I'm not digging into my wallet until Pascal comes out. It was tough when the GTX 980 Ti was released, but I'm sticking to my guns.

    At 1080p, which is where the 960 and 380X should be competing (because if you buy either of these for 1440p+, you're a moron), if they had gotten a 960 4GB for comparison, there wouldn't be much difference. You can get a 960 4GB, which is a one year old card, for less than $200 and it's essentially just as good at stock. The few frames by which the 380X wins in this review are mostly due to the VRAM limit on the 960 2GB.

    Plus you can overclock a 960 to insane levels, so why spend $229 on the 380X when you can spend $180 on a GTX 960 4GB and overclock it if you want more speed?
  • Sushisamurai - Monday, November 23, 2015 - link

    Errr... Isn't the 960 a rebadge of the 770?
  • Sushisamurai - Monday, November 23, 2015 - link

    Note: rebadge in the sense that the hardware is super similar, minus the Maxwell gen 2 features
  • Sushisamurai - Monday, November 23, 2015 - link

    Oops, I lied. The 770 is not comparable to the 960; I'm assuming it's better. Mind you, the 280X and 770 were comparable back in the day.
  • silverblue - Monday, November 23, 2015 - link

    Yep, as the 770 is essentially a tweaked 680, which traded blows with the 7970/7970GE.
  • CiccioB - Tuesday, November 24, 2015 - link

    The sad thing is how you all make comparisons on this kind of technology. GPUs scale well when made fat, so raw "performance" is a moot point for comparisons. It's like saying the 750 Ti is the same as a GTX 480 because they perform similarly.
    This card (like all the new AMD 300 series cards) is simply a fat, bloated GPU clocked at its limit and sold below cost to compete with the smaller, more efficient architectures created by the competition (which is selling them at premium prices).
    This 380X card is a complete failure at advancing AMD in its fight. Meanwhile the competition has done marvelous things: they came up with a GPU, the GM206, which is half the GK104 in terms of size and power consumption, yet delivers the same performance. That is the progress the competition made while AMD went from GCN 1.0 to GCN 1.2, which adds only a few tricks and hacks but nothing that really brings that already obsolete architecture up to the new level of competition.
    Sorry, but if you are excited by this kind of "evolution" and don't understand where it has brought "your favorite company", you really deserve to stay a generation behind in terms of innovation. And be happy with this Tonga, which will be sold for a few bucks in a few months and be completely forgotten when Pascal annihilates it in its first iteration.
  • britjh22 - Monday, November 23, 2015 - link

    Comparing a 2.5 year old card that cost $450-500 against a $230 card... and complaining as if AMD isn't even trying... your bias is showing, sir. You shouldn't feel the need to upgrade yet, in my opinion; unless, of course, your card is being crippled by NVIDIA's drivers, whoops!
  • tviceman - Monday, November 23, 2015 - link

    GTX 770 launched at $399, not $450. Interestingly, the GTX 770 was a smaller chip and drew less power. So, tossing the consumer economics aside, SpartyOn raises a good point.
  • britjh22 - Monday, November 23, 2015 - link

    The 770 2GB launched at $399, but the 4GB launched at anywhere from $450 to $500 depending on the model.
  • 200380051 - Monday, November 23, 2015 - link

    The power consumption of the 380X under load is lower with FurMark than it is with Crysis 3, while it is the opposite with the GTX 960. Any thoughts on that?
