Middle-earth: Shadow of Mordor

Our next benchmark is Monolith’s popular open-world action game, Middle-earth: Shadow of Mordor. One of our current-gen console multiplatform titles, Shadow of Mordor is plenty punishing on its own, and at Ultra settings it absolutely devours VRAM, showcasing the knock-on effect that current-gen consoles have on VRAM requirements.

Shadow of Mordor - 3840x2160 - Ultra Quality

Shadow of Mordor - 3840x2160 - Very High Quality

Shadow of Mordor - 2560x1440 - Ultra Quality

Shadow of Mordor ends up being a big win for AMD, with the R9 Fury cards shooting well past the GTX 980. Based on our earlier R9 Fury X review this was not an unexpected result, but at the end of the day with a 20%+ performance advantage, it’s a great situation for AMD to be in.

Meanwhile the R9 Fury’s performance relative to its X-rated sibling is yet again in the 7% range. So far the performance difference between the two cards is surprisingly consistent.

Finally, since AMD’s last two $550 cards were the R9 290X and HD 7970, let’s quickly look at those comparisons. At 1440p the R9 Fury holds only a 17% lead over the R9 290X “Uber”, which against a card almost two years old is a surprisingly small gap. The R9 Fury brings more efficient front-ends and back-ends along with significant advantages in shader throughput and memory bandwidth, and yet the gains over the 290X remain fairly modest. On the other hand, HD 7970 owners looking to upgrade to another Radeon should like what they’re seeing, as the R9 Fury’s 79% performance advantage is approaching upgrade territory.
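
For reference, the percentage leads quoted here are simple relative frame rate ratios. Below is a minimal sketch of that math in Python; the frame rate values are hypothetical placeholders, not the measured results from our charts.

```python
# Minimal sketch of the relative-performance math quoted above.
# The frame rates below are hypothetical placeholders, NOT the measured
# values from our charts; only the formula is the point here.

def lead_pct(card_fps: float, baseline_fps: float) -> float:
    """Percentage lead of card_fps over baseline_fps."""
    return (card_fps / baseline_fps - 1.0) * 100.0

# Hypothetical 1440p averages, chosen purely for illustration:
fury_fps = 70.0
r9_290x_fps = 60.0    # placeholder for R9 290X "Uber"
hd_7970_fps = 39.0    # placeholder for HD 7970

print(f"R9 Fury vs. R9 290X: {lead_pct(fury_fps, r9_290x_fps):.0f}% lead")   # ~17%
print(f"R9 Fury vs. HD 7970: {lead_pct(fury_fps, hd_7970_fps):.0f}% lead")   # ~79%
```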

Shadow of Mordor - Min Frame Rate - 3840x2160 - Ultra Quality

Shadow of Mordor - Min Frame Rate - 3840x2160 - Very High Quality

Shadow of Mordor - Min Frame Rate - 2560x1440 - Ultra Quality

Shifting gears to minimum framerates, the situation at 4K is similarly in AMD’s favor. One consequence of going up against the GTX 980 is that the NVIDIA card is just as VRAM-limited as the R9 Fury, so in a VRAM-intensive game like Shadow of Mordor neither card holds a memory capacity advantage. However, it’s quite interesting that once we back off to 1440p, the GTX 980 surges forward.

288 Comments

  • Shadow7037932 - Friday, July 10, 2015 - link

    Yes! Been waiting for this review for a while.
  • Drumsticks - Friday, July 10, 2015 - link

    Indeed! Good that it came out so early too :D

    I'm curious @anandtech in general, given the likely newer state of the Fury/Fury X's drivers, do you think that the performance deltas between each Fury card and its respective NVIDIA competitor will swing further into AMD's favor as they solidify their drivers?
  • Samus - Friday, July 10, 2015 - link

    So basically if you have $500 to spend on a video card, get the Fury; if you have $600, get the 980 Ti. Unless you want something liquid cooled/quiet, in which case the Fury X could be an attractive, albeit slower, option.

    Driver optimizations will only make the Fury better in the long run as well, since the 980Ti (Maxwell 2) drivers are already well optimized as it is a pretty mature architecture.

    I find it astonishing you can hack off 15% of a card's resources and only lose 6% performance. AMD clearly has a very good (but power hungry) architecture here.
  • witeken - Friday, July 10, 2015 - link

    No, not at all. You must look at it the other way around: Fury X has 15% more resources, but is <<15% faster.
  • 0razor1 - Friday, July 10, 2015 - link

    Smart, you :) :D This thing is clearly not balanced. That's all there is to it. I'd say the X for the WC at $100 more makes prime logic.
  • thomascheng - Saturday, July 11, 2015 - link

    Balance is not very conclusive. There are games that take advantage of the higher resources and blow past the 980 Ti, and there are games that don't and are therefore slower. Most likely this is due to developers not having had access to Fury and its resources before. I would say no games use that many shading units yet, and you won't see a benefit until games do. The same goes for HBM.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    What a pathetic excuse, apologists for amd are so sad.

    AMD got it wrong, and the proof is already evident.

    No, NONE OF US can expect anandtech to be honest about that, nor its myriad of AMD fanboys,
    but we can all be absolutely certain that if it were nVidia who had done it, a full 2 pages would be dedicated to their massive mistake.

    I've seen it a dozen times here over ten years.

    When will you excuse-making lie artists ever face reality and stop insulting everyone else with AMD marketing wet dreams coming out of your keyboards?
    Will you ever?
  • redraider89 - Monday, July 20, 2015 - link

    And you are not an nvidia fanboy, are you? Hypocrite.
  • redraider89 - Monday, July 20, 2015 - link

    Typical fanboy, ignore the points and go straight to name calling. No, you are the one people should be sad about, delusional in thinking you are not a fanboy when you are.
  • redraider89 - Monday, July 20, 2015 - link

    Proof that intel and nvidia wackos are the worst type of people, arrogant, snide, insulting, childish. You are the poster boy for an intel/nvidia sophomoric fanboy.
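
As an editorial aside on the resource-count exchange above: the "hack off 15%" and "15% more resources" framings differ only in which card is used as the percentage base. Here is a quick sketch using the published stream processor counts (3584 for the R9 Fury, 4096 for the R9 Fury X); the ~7% performance gap is the figure from this review's results.

```python
# Same hardware gap, two percentage framings: it depends on which card is the base.
FURY_X_SHADERS = 4096   # R9 Fury X stream processors (published spec)
FURY_SHADERS = 3584     # R9 Fury stream processors (published spec)

cut_vs_fury_x = (1 - FURY_SHADERS / FURY_X_SHADERS) * 100   # Fury X as the base
extra_vs_fury = (FURY_X_SHADERS / FURY_SHADERS - 1) * 100   # Fury as the base

print(f"R9 Fury has {cut_vs_fury_x:.1f}% fewer shaders than the Fury X")  # 12.5%
print(f"R9 Fury X has {extra_vs_fury:.1f}% more shaders than the Fury")   # ~14.3%
# Either way, the measured performance gap in this review is only ~7%,
# which is what prompts the "not shader-bound" discussion above.
```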
