Middle Earth: Shadow of Mordor

Our next benchmark is Monolith’s popular open-world action game, Middle Earth: Shadow of Mordor. One of our current-gen console multiplatform titles, Shadow of Mordor is plenty punishing on its own, and at Ultra settings it absolutely devours VRAM, showcasing the knock-on effect that current-gen consoles have on VRAM requirements.

Shadow of Mordor - 3840x2160 - Ultra Quality

Shadow of Mordor - 3840x2160 - Very High Quality

Shadow of Mordor - 2560x1440 - Ultra Quality

With Shadow of Mordor things finally start looking up for AMD, as the R9 Fury X scores its first win. Okay, it’s more of a tie than a win, but it’s further than the R9 Fury X has gotten so far.

At 4K with Ultra settings the R9 Fury X manages an average of 48.3fps, a virtual tie with the GTX 980 Ti and its 47.9fps. Dropping down to Very High quality does see AMD fall back just a bit, but with a difference between the two cards of just 0.7fps, it’s hardly worth worrying about. Even 2560x1440 looks good for AMD here, with the R9 Fury X trailing the GTX 980 Ti by just over 1fps at an average framerate of over 80fps. Overall the R9 Fury X delivers 98% to 101% of the performance of the GTX 980 Ti, more or less tying the direct competitor to AMD’s latest card.
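
For anyone who wants to reproduce the relative-performance figure, the snippet below is a minimal sketch of the calculation using only the 4K Ultra averages quoted above; the full 98% to 101% range comes from applying the same math to the Very High and 2560x1440 results as well.

    # Minimal sketch: R9 Fury X performance as a percentage of the GTX 980 Ti,
    # using the 4K Ultra averages quoted above (48.3fps vs. 47.9fps).
    def relative_performance(fps_card: float, fps_reference: float) -> float:
        """Return the card's average framerate as a percentage of the reference card's."""
        return 100.0 * fps_card / fps_reference

    print(f"{relative_performance(48.3, 47.9):.1f}%")  # ~100.8%, i.e. a virtual tie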

Meanwhile compared to the R9 290X, the R9 Fury X doesn’t see quite the same gains. Performance is a fairly consistent 26-28% ahead of the R9 290X, less than what we’ve seen elsewhere. Earlier we discussed how the R9 Fury X’s performance gains depend on which part of the GPU is being stressed the most; tasks that stress the shaders show the largest gains, while tasks that stress geometry or the ROPs potentially show the smallest gains. In the case of Shadow of Mordor, I believe we’re seeing an at least partially geometry/ROP-bound workload.
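
To put that scaling argument in rough numbers, here is a back-of-the-envelope sketch. The specifications it uses (shader counts, ROP counts, and clocks) come from AMD’s published spec sheets rather than from our benchmark data, so treat the exact percentages as illustrative bounds.

    # Rough scaling bounds for the R9 Fury X over the R9 290X, from published specs:
    # Fury X: 4096 stream processors, 64 ROPs, 1050MHz; 290X: 2816 SPs, 64 ROPs, up to 1000MHz.
    fury_x  = {"shaders": 4096, "rops": 64, "clock_mhz": 1050}
    r9_290x = {"shaders": 2816, "rops": 64, "clock_mhz": 1000}

    shader_gain = (fury_x["shaders"] * fury_x["clock_mhz"]) / (r9_290x["shaders"] * r9_290x["clock_mhz"]) - 1
    rop_gain    = (fury_x["rops"] * fury_x["clock_mhz"]) / (r9_290x["rops"] * r9_290x["clock_mhz"]) - 1

    print(f"shader throughput: +{shader_gain:.0%}")  # ~+53%
    print(f"ROP throughput:    +{rop_gain:.0%}")     # ~+5%
    # The observed 26-28% gain falls between these bounds, consistent with a
    # workload that is partially geometry/ROP-bound rather than purely shader-bound.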

Shadow of Mordor - Min Frame Rate - 3840x2160 - Ultra Quality

Shadow of Mordor - Min Frame Rate - 3840x2160 - Very High Quality

Shadow of Mordor - Min Frame Rate - 2560x1440 - Ultra Quality

Unfortunately for AMD, the minimum framerate situation isn’t quite as good as the averages. These framerates aren’t bad – the R9 Fury X always stays above 30fps – but even accounting for the higher variability of minimum framerates, it’s trailing the GTX 980 Ti by 13-15% at Ultra quality settings. Interestingly, at 4K with Very High quality settings the minimum framerate gap is just 3%, in which case what we are most likely seeing at Ultra is the impact of running those settings with only 4GB of VRAM. The card doesn’t get punished too badly for it, but the R9 Fury X and its 4GB of HBM are beginning to crack under the pressure of what is admittedly one of our more VRAM-demanding games.

Comments

  • TallestJon96 - Sunday, July 5, 2015 - link

    This card and the 980 Ti meet two interesting milestones in my mind. First, this is the first time 1080p isn't even considered. Pretty cool to be at the point where 1080p is considered a bit of a low resolution for high-end cards.

    Second, it's the point where we have single cards that can play games at 4K, with higher graphical settings, and better performance than a PS4. So at this point, if a PS4 is playable, then 4K gaming is playable.

    It's great to see higher and higher resolutions.
  • XtAzY - Sunday, July 5, 2015 - link

    Geez these benchies are making my 580 look ancient.
  • MacGyver85 - Sunday, July 5, 2015 - link

    Idle power does not start things off especially well for the R9 Fury X, though it’s not too poor either. The 82W at the wall is a distinct increase over NVIDIA’s latest cards, and even the R9 290X. On the other hand the R9 Fury X has to run a CLLC rather than simple fans. Further complicating factors is the fact that the card idles at 300MHz for the core, but the memory doesn’t idle at all. HBM is meant to have rather low power consumption under load versus GDDR5, but one wonders just how that compares at idle.

    I'd like to see you guys post power consumption numbers with power to the pump cut at idle, to answer the questions you pose. I'm pretty sure the card is competitive without the pump running (but still with the fan, to keep the comparison equal). If not, it will give us more of an insight into what improvements AMD can make to HBM in the future with regard to power consumption. But I'd be very surprised if they haven't dealt with that during the design phase. After all, power consumption is THE defining limit for graphics performance.
  • Oxford Guy - Sunday, July 5, 2015 - link

    Idle power consumption isn't the defining limit. The article already said that the cooler keeps the temperature low while also keeping noise levels in check. The result of keeping the temperature low is that AMD can more aggressively tune for performance per watt.
  • Oxford Guy - Sunday, July 5, 2015 - link

    This is a gaming card, not a card for casuals who spend most of their time with the GPU idling.
  • Oxford Guy - Sunday, July 5, 2015 - link

    The other point, which wasn't really made in the article, is that the idle noise is higher, but consider how many GPUs exhaust their heat into the case. That means higher case fan noise, which could cancel out the idle noise difference. This card's radiator can be set to exhaust directly out of the case.
  • mdriftmeyer - Sunday, July 5, 2015 - link

    It's an engineering card as much as it is for gaming. It's a great solid modeling card with OpenCL. The way AMD is building its driver foundation will pay off big in the next quarter.
  • Nagorak - Monday, July 6, 2015 - link

    I don't know that I agree with that. Even people who game a lot probably use their computer for other things, and it sucks to be using more watts while idle. That being said, the increase is not a whole lot.
  • Oxford Guy - Thursday, July 9, 2015 - link

    Gaming is a luxury activity. People who are really concerned about power usage would, at the very least, stick with a low-wattage GPU like a 750 Ti or something and turn down the quality settings. Or, if you really want to be green, don't do 3D gaming at all.
  • MacGyver85 - Wednesday, July 15, 2015 - link

    That's not really true. I don't mind my gfx card pulling a lot of power while I'm gaming. But I want it to sip power when it's doing nothing. And since any card spends most of its time idling, idle power is actually very important (if not most important) to overall (yearly) power consumption.

    Btw I never said that idle power consumption is the defining limit, I said power consumption is the defining limit. It's a given that any Watt you save while idling is generally a Watt of extra headroom when running at full power. The lower the baseline load, the more room for actual, functional (graphics) power consumption. And as it turns out I was right in my assumption that the actual graphics card, minus the cooler pump, has idle power consumption competitive with nVidia's.
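
    A back-of-the-envelope sketch of that yearly-consumption argument (all wattages and daily hours below are assumed, illustrative values rather than figures from the review; which term dominates depends entirely on the usage pattern):

        # Yearly energy from idle vs. gaming draw (illustrative assumptions only).
        IDLE_WATTS = 20           # assumed card-only idle draw
        GAMING_WATTS = 275        # assumed card-only gaming draw
        IDLE_HOURS_PER_DAY = 10   # assumed hours the machine sits idle
        GAMING_HOURS_PER_DAY = 2  # assumed gaming hours

        idle_kwh   = IDLE_WATTS * IDLE_HOURS_PER_DAY * 365 / 1000
        gaming_kwh = GAMING_WATTS * GAMING_HOURS_PER_DAY * 365 / 1000
        print(f"idle: {idle_kwh:.0f} kWh/year, gaming: {gaming_kwh:.0f} kWh/year")
        # -> idle: 73 kWh/year, gaming: 201 kWh/year with these assumptions;
        #    longer idle hours or a higher idle draw shift the balance toward idle.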
