Grand Theft Auto V

The final game in our review of the R9 Fury is our most recent addition, Grand Theft Auto V. The latest edition of Rockstar's venerable series of open world action games, Grand Theft Auto V was originally released for the last-gen consoles back in 2013. However, thanks to a rather significant facelift for the current-gen consoles and PCs, along with the ability to greatly turn up rendering distances and add other features like MSAA and more realistic shadows, the end result is a game that is still among the most stressful of our benchmarks when all of its features are turned up. Furthermore, in a move rather uncharacteristic of most open world action games, Grand Theft Auto V also includes a very comprehensive benchmark mode, giving us a great chance to look into the performance of an open world action game.

A quick note on settings: as Grand Theft Auto V doesn't have pre-defined settings tiers, I want to outline what settings we're using. For "Very High" quality we have all of the primary graphics settings turned up to their highest setting, with the exception of grass, which is at its own very high setting. Meanwhile 4x MSAA is enabled for direct views and reflections. This setting also involves turning on some of the advanced rendering features - the game's long shadows, high resolution shadows, and high detail streaming while flying - but not increasing the view distance any further.

Otherwise for "High" quality we take the same basic settings but turn off all MSAA, which significantly reduces the GPU rendering and VRAM requirements.
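
As a quick reference, the two quality tiers described above can be summarized as follows. This is purely an editorial shorthand in Python form, assuming made-up key names; it is not GTA V's actual settings file or naming, just a compact restatement of the options listed above.

    # Editorial shorthand for the two test configurations described above (illustrative only;
    # these key names are our own and do not correspond to GTA V's settings file).
    VERY_HIGH = {
        "primary_settings": "maximum",               # all primary options at their highest value
        "grass": "Very High",                        # grass uses its own scale and stays at Very High
        "msaa": "4x",                                # 4x MSAA for direct views
        "reflection_msaa": "4x",                     # 4x MSAA for reflections
        "long_shadows": True,                        # advanced rendering features enabled
        "high_resolution_shadows": True,
        "high_detail_streaming_while_flying": True,
        "extended_distance_scaling": "default",      # view distance not pushed any further
    }
    # "High" is identical except that all MSAA is disabled.
    HIGH = {**VERY_HIGH, "msaa": "off", "reflection_msaa": "off"}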

Grand Theft Auto V - 3840x2160 - Very High Quality

Grand Theft Auto V - 3840x2160 - High Quality

Grand Theft Auto V - 2560x1440 - Very High Quality

Closing out our gaming benchmarks, the R9 Fury is once again in the lead, besting the GTX 980 by as much as 15%. However, GTA V also serves as a reminder that the R9 Fury doesn’t have quite enough power to game at 4K without compromises. And if we do shift back to 1440p, a more comfortable resolution for this card, AMD’s lead is down to just 5%. At that point the R9 Fury isn’t quite covering its price premium.

Meanwhile compared to the R9 Fury X, we close out roughly where we started. The R9 Fury trails the more powerful R9 Fury X by 5-7% depending on the resolution, a difference that has more to do with GPU clockspeeds than the cut-down CU count; the R9 Fury’s 1000MHz clockspeed is only 5% shy of the R9 Fury X’s 1050MHz, whereas its 56 CUs are a 12.5% cut from the Fury X’s 64. Overall the gap between the two cards has been remarkably consistent and surprisingly narrow.

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - Very High Quality

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - High Quality

Grand Theft Auto V - 99th Percentile Framerate - 2560x1440 - Very High Quality

99th percentile framerates, however, are simply not in AMD’s favor here. Despite AMD’s driver optimizations and the fact that the GTX 980 only has 4GB of VRAM, the R9 Fury X could not pull ahead of the GTX 980, so the R9 Fury understandably fares worse. Even at 1440p the R9 Fury cards can’t quite muster 30fps, though in all fairness even the GTX 980 falls just short of this mark as well.
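
Since 99th percentile figures can be less intuitive than averages, here is a minimal sketch of how such a number can be derived from a log of per-frame render times. The frame time list and the simple nearest-rank percentile used here are our own assumptions for illustration, not a description of the benchmark's internal methodology.

    # Illustrative sketch: derive a 99th percentile framerate from per-frame render times.
    # frame_times_ms is a hypothetical log of frame times in milliseconds.
    def percentile_99_fps(frame_times_ms):
        ordered = sorted(frame_times_ms)                      # slowest frames end up at the back
        index = min(len(ordered) - 1, round(0.99 * (len(ordered) - 1)))
        slowest_one_percent_ms = ordered[index]               # 99% of frames render at least this fast
        return 1000.0 / slowest_one_percent_ms                # convert a frame time (ms) into fps

    # Example: a run that averages roughly 30fps but hitches to 50ms on 2% of its frames
    # reports a 99th percentile framerate of only 20fps.
    sample = [33.3] * 980 + [50.0] * 20
    print(round(percentile_99_fps(sample), 1))                # 20.0

The upshot is that a card can post a respectable average and still spend its worst frames well below 30fps, which is exactly the distinction these charts capture.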

288 Comments

  • Shadow7037932 - Friday, July 10, 2015

    Yes! Been waiting for this review for a while.
  • Drumsticks - Friday, July 10, 2015

    Indeed! Good that it came out so early too :D

    I'm curious @anandtech in general, given the likely newer state of the Fury/X's drivers, do you think that the performance deltas between each fury card and the respective nvidia card will swing further into AMD's favor as they solidify their drivers?
  • Samus - Friday, July 10, 2015

    So basically if you have $500 to spend on a video card, get the Fury; if you have $600, get the 980 Ti. Unless you want something liquid cooled/quiet, in which case the Fury X could be an attractive albeit slower option.

    Driver optimizations will only make the Fury better in the long run as well, since the 980Ti (Maxwell 2) drivers are already well optimized as it is a pretty mature architecture.

    I find it astonishing you can hack off 15% of a card's resources and only lose 6% performance. AMD clearly has a very good (but power hungry) architecture here.
  • witeken - Friday, July 10, 2015

    No, not at all. You must look at it the other way around: Fury X has 15% more resources, but is <<15% faster.
  • 0razor1 - Friday, July 10, 2015

    Smart, you :) :D This thing is clearly not balanced. That's all there is to it. I'd say the X for the WC at $100 more makes prime logic.
  • thomascheng - Saturday, July 11, 2015

    Balance is not very conclusive. There are games that take advantage of the higher resources and blow past the 980Ti, and there are games that don't and are therefore slower. Most likely due to developers not having access to Fury and its resources before. I would say no games use that many shading units, and you won't see a benefit until games do. The same with HBM.
  • FlushedBubblyJock - Wednesday, July 15, 2015

    What a pathetic excuse, apologists for amd are so sad.

    AMD got it wrong, and the proof is already evident.

    No, NONE OF US can expect anandtech to be honest about that, nor its myriad of amd fanboys,
    but we can all be absolutely certain that if it was nVidia who had done it, a full 2 pages would be dedicated to their massive mistake.

    I've seen it a dozen times here over ten years.

    When will you excuse-making lie artists ever face reality and stop insulting everyone else with AMD marketing wet dreams coming out of your keyboards?
    Will you ever?
  • redraider89 - Monday, July 20, 2015

    And you are not an nvidia fanboy, are you? Hypocrite.
  • redraider89 - Monday, July 20, 2015

    Typical fanboy, ignore the points and go straight to name calling. No, you are the one people should be sad about, delusional that they are not a fanboy when they are.
  • redraider89 - Monday, July 20, 2015

    Proof that intel and nvidia wackos are the worst type of people, arrogant, snide, insulting, childish. You are the poster boy for an intel/nvidia sophomoric fanboy.
