Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are taken from single player mode, our rule of thumb based on past experience is that multiplayer framerates will dip to roughly half of our single player framerates, which means a card needs to average at least 60fps here if it’s going to hold up in multiplayer.
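To make that rule of thumb explicit, here is a minimal sketch of the arithmetic; the 30fps multiplayer floor is our assumption of a playable target rather than a figure taken from the charts.

```python
# Minimal sketch of the single-player-to-multiplayer rule of thumb described above.
# Assumption: multiplayer framerates dip to roughly half of the single player average,
# and ~30fps is treated as the minimum playable multiplayer target.

def required_singleplayer_average(multiplayer_floor_fps: float = 30.0) -> float:
    """Single player average needed if multiplayer runs at roughly half that rate."""
    return multiplayer_floor_fps * 2.0

print(required_singleplayer_average())  # 60.0 -> the 60fps threshold used in the text
```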

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

As we briefly mentioned in our testing notes, our Battlefield 4 testing has been slightly modified as of this review to accommodate the changes in how AMD is supporting Mantle. This benchmark still defaults to Mantle for GCN 1.0 and GCN 1.1 cards (7970, 290X), but we’re using Direct3D for GCN 1.2 cards like the R9 Fury X. This is due to the lack of Mantle driver optimizations for the newer GPU on AMD’s part, which leaves the R9 Fury X performing worse under Mantle than under Direct3D, especially at 2560x1440 (65.2fps with Direct3D vs. 54.3fps with Mantle).
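For clarity, the per-architecture renderer selection described above can be summarized in a short sketch. This is purely illustrative and not our actual test harness; the mapping simply restates the paragraph above, and the helper function name is our own.

```python
# Illustrative sketch (not our actual benchmark harness): which API each
# GCN generation runs under in our Battlefield 4 testing, per the text above.

RENDERER_BY_GCN = {
    "GCN 1.0": "Mantle",    # e.g. Radeon HD 7970
    "GCN 1.1": "Mantle",    # e.g. Radeon R9 290X
    "GCN 1.2": "Direct3D",  # e.g. R9 Fury X; Mantle lacks Fiji optimizations
}

def pick_renderer(gcn_generation: str) -> str:
    """Return the API a given GCN generation is benchmarked under."""
    # Default to Direct3D for anything not explicitly listed (e.g. NVIDIA cards).
    return RENDERER_BY_GCN.get(gcn_generation, "Direct3D")

print(pick_renderer("GCN 1.2"))  # Direct3D
```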

In any case, regardless of the renderer you pick, our first test does not go especially well for AMD and the R9 Fury X. The R9 Fury X does not take the lead at any resolution, and in fact this is one of the card’s weaker games. At 4K AMD trails by 8-10%, and at 1440p the gap grows to 16%; at that resolution the R9 Fury X is closer to the GTX 980 than it is to the GTX 980 Ti. Even with the R9 Fury X’s significant performance improvement over its predecessor, it’s not enough to catch up to NVIDIA here.

Meanwhile the R9 Fury X’s performance improvement over the R9 290X “Uber” stands at between 23% and 32% depending on the resolution. AMD not only scales better than NVIDIA at higher resolutions, but the R9 Fury X scales better than the R9 290X as well.
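The percentages quoted in this section are simple ratios of the average framerates from the charts. A quick sketch of that calculation, using the two R9 Fury X numbers quoted earlier (65.2fps under Direct3D vs. 54.3fps under Mantle at 2560x1440) as a worked example; the function name is our own.

```python
# How the relative-performance percentages in this section are derived:
# a simple ratio of two average framerates.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster (in percent) fps_a is than fps_b."""
    return (fps_a / fps_b - 1.0) * 100.0

# Worked example with the 2560x1440 R9 Fury X numbers quoted above
# (Direct3D vs. Mantle on the same card).
print(f"{percent_faster(65.2, 54.3):.1f}%")  # ~20.1% faster under Direct3D
```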

458 Comments

  • chizow - Friday, July 3, 2015 - link

    Pretty much, AMD supporters/fans/apologists love to parrot the meme that Intel hasn't innovated since the original i7 or whatever, and while development there has certainly slowed, we have a number of 18-core E5-2699 v3 servers in my data center at work, Broadwell Iris Pro iGPs that handily beat AMD's APUs and approach low-end dGPU perf, and ultrabooks and tablets that run on fanless 5W Core M CPUs. Oh, and I've also managed to find meaningful desktop upgrades every few years for no more than $300 since Core 2 put me back in Intel's camp for the first time in nearly a decade.
  • looncraz - Friday, July 3, 2015 - link

    None of what you stated is innovation, merely minor evolution. The core design is the same, gaining only ~5% or so IPC per generation, same basic layouts, same basic tech. Are you sure you know what "innovation" means?

    Bulldozer modules were an innovative design. A failure, but still very innovative. Pentium Pro and Pentium 4 were both innovative designs, both seeking performance in very different ways.

    Multi-core CPUs were innovative (AMD), HBM is innovative (AMD+Hynix), multi-GPU was innovative (3dfx), SMT was innovative (IBM, Alpha), CPU+GPU was innovative (Cyrix, IIRC)... you get the idea.

    Doing the exact same thing, more or less the exact same way, but slightly better, is not innovation.
  • chizow - Sunday, July 5, 2015 - link

    Huh? So putting Core level performance in a passive design that is as thin as a legal pad and has 10 hours of battery life isn't innovation?

    Increasing iGPU performance to the point that a chip provides not only top-end CPU performance but close-to-dGPU graphics performance, convincingly beating AMD's entire reason for buying ATI (their Fusion APUs), isn't innovation?

    And how about the data center where Intel's *18* core CPUs are using the same TDP and sockets, in the same U rack units as their 4 and 6 core equivalents of just a few years ago?

    Intel is still innovating in different ways that may not directly impact the desktop CPU market, but it would be extremely ignorant to claim they aren't addressing their core growth and risk areas with new and innovative products.

    I've bought more Intel products in recent years than before, strictly because of these new innovations that let me have high-performance computing in different form factors and use cases, beyond being tethered to my desktop PC.
  • looncraz - Friday, July 3, 2015 - link

    Show me Intel CPU innovations since the Pentium 4.

    Mind you, innovations can be failures, they can be great successes, or they can be ho-hum.

    P6->Core->Nehalem->Sandy Bridge->Haswell->Skylake

    The only changes are evolutionary or as a result of process changes (which I don't consider CPU innovations).

    This is not to say that they aren't fantastic products - I'm rocking an i7-2600K for a reason - they just aren't innovative products. Indeed, nVidia's Maxwell is a wonderfully designed and engineered GPU, and products based on it are of the highest quality and performance. That doesn't make them innovative in any way. Nothing technically wrong with that, but I wonder how long it would have taken for someone else to come up with a suitable RAM just for GPUs if AMD hadn't done it.
  • chizow - Sunday, July 5, 2015 - link

    I've listed them above, and despite the slower pace of improvements on the desktop CPU side you are still looking at a 30-45% clock-for-clock improvement between Nehalem and Haswell, along with pretty massive increases in stock clock speeds. Not bad given they've had literally zero pressure from AMD. If anything, Intel dominating in a virtual monopoly has afforded me much cheaper and more consistent CPU upgrades, all of which provided significant improvements over the previous platform:

    E6600 $284
    Q6600 $299
    i7 920 $199!
    i7 4770K $229
    i7 5820K $299

    All cheaper than the $450 AMD wanted for their ENTRY level Athlon 64 when they finally got the lead over Intel, which made it an easy choice to go to Intel for the first time in nearly a decade after AMD got Conroe'd in 2006.
  • silverblue - Monday, July 6, 2015 - link

    I could swear that you've posted this before.

    I think the drop in prices was more of an attempt to strangle AMD than anything else. Intel can afford it, after all.
  • chizow - Monday, July 6, 2015 - link

    Of course I've posted it elsewhere, because it bears repeating: the nonsensical meme AMD fanboys love to parrot, that AMD is necessary for low prices and strong competition, is a farce. I've enjoyed unparalleled stability at a similar or higher level of relative performance in the years that AMD has become UNCOMPETITIVE in the CPU market. There is no reason to expect otherwise in the dGPU market.
  • zoglike@yahoo.com - Monday, July 6, 2015 - link

    Really? Intel hasn't innovated? I really hope you are trolling because if you believe that I fear for you.
  • chizow - Thursday, July 2, 2015 - link

    Let's also not discount the fact that those are just stock comparisons; once you overclock the cards, as many buyers in this $650 bracket are interested in doing, especially given AMD's claims that Fury X is an "Overclocker's Dream", we quickly see the 980 Ti cannot be touched by Fury X, water cooler or not.

    Fury X wouldn't have been the failure it is today if not for AMD setting unrealistic and ultimately unattained expectations. As a 390X WCE at $550-$600 it's a solid alternative. As a new $650 "Premium" brand that doesn't OC at all, has only 4GB, has pump whine issues, and is slower than Nvidia's same-priced $650 980 Ti that launched three weeks before it, it just doesn't get the job done after AMD hyped it from the top brass down.
  • andychow - Thursday, July 2, 2015 - link

    Yeah, "Overclocker's dream", only overclocks by 75 MHz. Just by that statement, AMD has totally lost me.
