Grand Theft Auto V

The final game in our review of the R9 Fury X is our most recent addition, Grand Theft Auto V. The latest edition of Rockstar's venerable series of open world action games, Grand Theft Auto V was originally released on the last-gen consoles back in 2013. However, thanks to a rather significant facelift for the current-gen consoles and PCs, along with the ability to greatly extend rendering distances and add other features like MSAA and more realistic shadows, the end result is a game that is still among the most stressful of our benchmarks when all of its features are turned up. Furthermore, in a move rather uncharacteristic of most open world action games, Grand Theft Auto V also includes a very comprehensive benchmark mode, giving us a great chance to look into the performance of an open world action game.

On the subject of settings, as Grand Theft Auto V doesn't have pre-defined settings tiers, I want to quickly note what settings we're using. For "Very High" quality we have all of the primary graphics settings turned up to their highest values, with the exception of grass, which is at its own Very High setting. Meanwhile 4x MSAA is enabled for both direct views and reflections. This tier also involves turning on some of the advanced rendering features - the game's long shadows, high resolution shadows, and high definition flight streaming - but not increasing the view distance any further.

Otherwise for "High" quality we take the same basic settings but turn off all MSAA, which significantly reduces the GPU rendering and VRAM requirements.

Grand Theft Auto V - 3840x2160 - Very High Quality

Grand Theft Auto V - 3840x2160 - High Quality

Grand Theft Auto V - 2560x1440 - Very High Quality

Our final game sees the R9 Fury X go out on either an average or slightly worse than average note, depending on the settings and resolution we are looking at. At our highest 4K settings the R9 Fury X trails the GTX 980 Ti once again, this time by 10%. Worse, at 1440p it’s now 15%. On the other hand if we run at our lower, more playable 4K settings, then the gap is only 5%, roughly in line with the overall average 4K performance gap between the GTX 980 Ti and R9 Fury X.

In this case it’s probably to AMD’s benefit that our highest 4K settings aren’t actually playable on a single GPU card, as the necessary drop in quality gets them closer to NVIDIA’s performance. On the other hand this does reiterate the fact that right now many games will force a tradeoff between resolution and quality if you wish to pursue 4K gaming.

Finally, the performance gains relative to the R9 290X are pretty good: 29% at 1440p, and 44% at the lower quality, playable 4K setting.

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - Very High Quality

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - High Quality

Grand Theft Auto V - 99th Percentile Framerate - 2560x1440 - Very High Quality

Shifting gears to 99th percentile frametimes however – a much-welcome feature of the game’s built-in benchmark – finds that AMD doesn’t fare nearly as well. At the 99th percentile the R9 Fury X trails the GTX 980 Ti at all times, and significantly so. The deficit is anywhere between 26% at 1440p to 40% at 4K Very High.

What’s happening here is a combination of multiple factors. First and foremost, next to Shadow of Mordor, GTAV is our other VRAM busting game. This, I believe, is why 99th percentile performance dives so hard at 4K Very High for the R9 Fury X, as it only has 4GB of VRAM compared to 6GB on the GTX 980 Ti. But considering where the GTX 980 places – above the R9 Fury X – I also believe there’s more than just VRAM bottlenecking occurring here. The GTX 980 sees at least marginally better framerates with the same size VRAM pool (and a lot less of almost everything else), which leads me to believe that AMD’s drivers may be holding them back here. Certainly the R9 290X comparison lends some possible credit to that, as the 99th percentile gains are under 20%. Regardless, one wouldn’t expect to be VRAM limited at 1440p or 4K without MSAA, especially as this test was not originally designed to bust 4GB cards.
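For readers unfamiliar with the metric, the 99th percentile figures above invert the frame time that 99% of frames manage to beat, so a handful of long stutters drags the number down even when the average framerate looks healthy. Below is a minimal sketch of that calculation; it is purely illustrative (not the game's or our actual benchmarking tooling), and the sample frame times are made up.

```python
def percentile_framerate(frame_times_ms, pct=99.0):
    """Return the framerate (fps) at the given percentile of frame times.

    The pct-th percentile frame time is the render time that pct% of
    frames beat; inverting it gives the percentile framerate. Uses the
    simple nearest-rank method.
    """
    times = sorted(frame_times_ms)
    # Nearest-rank index of the pct-th percentile value.
    idx = min(len(times) - 1, max(0, int(round(pct / 100.0 * len(times))) - 1))
    return 1000.0 / times[idx]

# Example: 97 smooth ~16.7 ms frames (60 fps) plus three 40 ms stutters.
# The average barely moves, but the 99th percentile framerate drops to 25 fps.
frames = [16.7] * 97 + [40.0] * 3
print(round(percentile_framerate(frames), 1))
```

This is why the metric punishes the VRAM-related hitching described above: a card can post a competitive average while its worst 1% of frames tells a very different story.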

458 Comments

  • chizow - Sunday, July 5, 2015 - link

    @piiman - I guess we'll see soon enough, I'm confident it won't make any difference given GPU prices have gone up and up anyways. If anything we may see price stabilization as we've seen in the CPU industry.
  • medi03 - Sunday, July 5, 2015 - link

    Another portion of bullshit from an nVidia troll.

    AMD never had more than 25% of CPU share. Doom for Intel, my ass.
    Even in Prescott times Intel was selling more CPUs, and at higher prices.
  • chizow - Monday, July 6, 2015 - link

    @medi03 AMD was up to 30% a few times, and they certainly had performance leadership at the time of K8, but of course they wanted to charge everyone for the privilege. Higher price? No: $450 for an entry level Athlon 64 was much more than what they charged in the past, and certainly much more than what Intel was charging at the time, going up to $1500 on the high end with their FX chips.
  • Samus - Monday, July 6, 2015 - link

    Best interest? Broken up for scraps? You do realize how important AMD is to people who are Intel/NVidia fans, right?

    Without AMD, Intel and NVidia are unchallenged, and we'll be back to paying $250 for a low-end video card and $300 for a mid-range CPU. There would be no GTX 750's or Pentium G3258's in the <$100 tier.
  • chizow - Monday, July 6, 2015 - link

    @Samus, they're irrelevant in the CPU market and have been for years, and yet amazingly, prices are as low as ever since Intel began dominating AMD in performance when they launched Core 2. Since then I've upgraded 5x and have not paid more than $300 for a high-end Intel CPU. How does this happen without competition from AMD as you claim? Oh right, because Intel is still competing with itself and needs to provide enough improvement in order to entice me to buy another one of their products and "upgrade".

    The exact same thing will happen in the GPU sector, with or without AMD. Not worried at all, in fact I'm looking forward to the day a company with deep pockets buys out AMD and reinvigorates their products, I may actually have a reason to buy AMD (or whatever it is called after being bought out) again!
  • Iketh - Monday, July 6, 2015 - link

    you overestimate the human drive... if another isn't pushing us, we will get lazy, and that's not an argument... what we'll do instead to make people upgrade is release products in steps planned out much further into the future, in even smaller steps than how Intel is releasing now
  • silverblue - Friday, July 3, 2015 - link

    I think this chart shows a better view of who was the underdog and when:

    http://i59.tinypic.com/5uk3e9.jpg

    ATi were ahead for the 9xxx series, and that's it. Moreover, NVIDIA's chipset struggles with Intel were in 2009 and settled in early 2011, something that would've benefitted NVIDIA far more than Intel's settlement with AMD as it would've done far less damage to NVIDIA's financials over a much shorter period of time.

    The lack of higher end APUs hasn't helped, nor has the issue with actually trying to get a GPU onto a CPU die in the first place. Remember that when Intel tried it with Clarkdale/Arrandale, the graphics and IMC were 45nm, sitting alongside everything else which was 32nm.
  • chizow - Friday, July 3, 2015 - link

    I think you have to look at a bigger sample than that, riding on the 9000 series momentum, AMD was competitive for years with a near 50/50 share through the X800/X1900 series. And then G80/R600 happened and they never really recovered. There was a minor blip with Cypress vs. Fermi where AMD got close again but Nvidia quickly righted things with GF106 and GF110 (GTX 570/580).
  • Scali - Tuesday, July 7, 2015 - link

    nVidia wasn't the underdog in terms of technology. nVidia was the choice of gamers. ATi was big because they had been around since the early days of CGA and Hercules, and had lots of OEM contracts.
    In terms of technology and performance, ATi was always struggling to keep up with nVidia, and they didn't reach parity until the Radeon 8500/9700-era, even though nVidia was the newcomer and ATi had been active in the PC market since the mid-80s.
  • Frenetic Pony - Thursday, July 2, 2015 - link

    Well done analysis, though the kick in the head was Bulldozer and its utter failure. Core 2 wasn't really AMD's downfall so much as Core/Sandy Bridge, which came at the exact wrong time for the utter failure of Bulldozer. This, combined with AMD's dismal failure to market its graphics cards, has cost them billions. Even this article calls the 290X problematic, a card that offered the same performance as the original Titan at a fraction of the price. Based on the empirical data, the 290/290X should have sold almost continuously until the introduction of Nvidia's Maxwell architecture.

    Instead people continued to buy the much less performant per dollar Nvidia cards and/or waited for "the good GPU company" to put out their new architecture. AMD's performance in marketing has been utterly appalling at the same time Nvidia's has been extremely tight. Whether that will, or even can, change next year remains to be seen.
