Grand Theft Auto V

The final game in our review of the R9 Fury X is our most recent addition, Grand Theft Auto V. The latest edition of Rockstar’s venerable series of open world action games, Grand Theft Auto V was originally released on the last-gen consoles back in 2013. However, thanks to a rather significant facelift for the current-gen consoles and PCs, along with the ability to greatly turn up rendering distances and add other features like MSAA and more realistic shadows, the end result is a game that is still among the most stressful of our benchmarks when all of its features are turned up. Furthermore, in a move rather uncharacteristic of most open world action games, Grand Theft Auto V also includes a very comprehensive benchmark mode, giving us a great chance to look into the performance of an open world action game.

As Grand Theft Auto V doesn't have pre-defined settings tiers, a quick note on the settings we're using is in order. For "Very High" quality we have all of the primary graphics settings turned up to their highest setting, with the exception of grass, which is at its own Very High setting. Meanwhile 4x MSAA is enabled for direct views and reflections. This setting also involves turning on some of the advanced rendering features - the game's long shadows, high resolution shadows, and high definition flight streaming - but not increasing the view distance any further.

Otherwise for "High" quality we take the same basic settings but turn off all MSAA, which significantly reduces the GPU rendering and VRAM requirements.

Grand Theft Auto V - 3840x2160 - Very High Quality

Grand Theft Auto V - 3840x2160 - High Quality

Grand Theft Auto V - 2560x1440 - Very High Quality

Our final game sees the R9 Fury X go out on either an average or slightly worse than average note, depending on the settings and resolution we are looking at. At our highest 4K settings the R9 Fury X trails the GTX 980 Ti once again, this time by 10%. Worse, at 1440p it’s now 15%. On the other hand if we run at our lower, more playable 4K settings, then the gap is only 5%, roughly in line with the overall average 4K performance gap between the GTX 980 Ti and R9 Fury X.

In this case it’s probably to AMD’s benefit that our highest 4K settings aren’t actually playable on a single GPU card, as the necessary drop in quality gets them closer to NVIDIA’s performance. On the other hand this does reiterate the fact that right now many games will force a tradeoff between resolution and quality if you wish to pursue 4K gaming.

Finally, the performance gains relative to the R9 290X are pretty good: 29% at 1440p, and 44% at the lower quality, playable 4K setting.

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - Very High Quality

Grand Theft Auto V - 99th Percentile Framerate - 3840x2160 - High Quality

Grand Theft Auto V - 99th Percentile Framerate - 2560x1440 - Very High Quality

Shifting gears to 99th percentile frametimes however – a much-welcome feature of the game’s built-in benchmark – finds that AMD doesn’t fare nearly as well. At the 99th percentile the R9 Fury X trails the GTX 980 Ti at all times, and significantly so. The deficit is anywhere from 26% at 1440p to 40% at 4K Very High.

What’s happening here is a combination of multiple factors. First and foremost, next to Shadow of Mordor, GTAV is our other VRAM busting game. This, I believe, is why 99th percentile performance dives so hard at 4K Very High for the R9 Fury X, as it only has 4GB of VRAM compared to 6GB on the GTX 980 Ti. But considering where the GTX 980 places – above the R9 Fury X – I also believe there’s more than just VRAM bottlenecking occurring here. The GTX 980 sees at least marginally better framerates with the same size VRAM pool (and a lot less of almost everything else), which leads me to believe that AMD’s drivers may be holding them back here. Certainly the R9 290X comparison lends some credence to that idea, as the 99th percentile gains are under 20%. Regardless, one wouldn’t expect to be VRAM limited at 1440p or at 4K without MSAA, especially as this test was not originally designed to bust 4GB cards.
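As an aside, the 99th percentile framerate charted above is simply the framerate equivalent of the 99th percentile frametime: the frametime that 99% of frames come in under, converted to FPS. A minimal sketch of that calculation, using hypothetical frametime data rather than our actual benchmark logs:

```python
import numpy as np

# Hypothetical per-frame render times in milliseconds from a benchmark run.
# A few long frames (stutters) dominate the tail of the distribution.
frame_times_ms = np.array([16.7, 17.1, 16.9, 33.4, 17.0,
                           18.2, 16.8, 45.1, 17.3, 16.6])

# 99th percentile frametime: 99% of frames completed at least this quickly
p99_ms = np.percentile(frame_times_ms, 99)

# Convert the frametime to its equivalent instantaneous framerate
p99_fps = 1000.0 / p99_ms

print(f"99th percentile frametime: {p99_ms:.1f} ms ({p99_fps:.1f} fps)")
```

Note how a handful of slow frames drags the 99th percentile framerate far below the average, which is exactly why the metric exposes stutter that average FPS hides.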


458 Comments


  • testbug00 - Sunday, July 5, 2015 - link

    You don't need architecture improvements to use DX12/Vulkan/etc. The APIs merely allow you to implement them over DX11 if you choose to. You can write a DX12 game without optimizing for any GPUs (although, not doing so for GCN given consoles are GCN would be a tad silly).

    If developers are aiming to put low level stuff in whenever they can, then the issue becomes that, due to AMD's "GCN everywhere" approach, developers may just start coding for PS4, porting that code to Xbox DX12, and then porting that to PC with higher textures/better shadows/effects. In which case Nvidia could take massive performance deficits to AMD due to not getting the same amount of extra performance from DX12.

    Don't see that happening in the next 5 years. At least, not with most games that are console+PC and need huge performance. You may see it in a lot of Indie/small studio cross platform games however.
  • RG1975 - Thursday, July 2, 2015 - link

    AMD is getting there, but they still have a little bit to go to bring us a new "9700 Pro". That card devastated all Nvidia cards back then. That's what I'm waiting for to come from AMD before I switch back.
  • Thatguy97 - Thursday, July 2, 2015 - link

    would you say amd is now the "geforce fx 5800"
  • piroroadkill - Thursday, July 2, 2015 - link

    Everyone who bought a Geforce FX card should feel bad, because the AMD offerings were massively better. But now AMD is close to NVIDIA, it's still time to rag on AMD, huh?

    That said, of course if I had $650 to spend, you bet your ass I'd buy a 980 Ti.
  • Thatguy97 - Thursday, July 2, 2015 - link

    oh believe me i remember they felt bad lol but im not ragging on amd but nvidia stole their thunder with the 980 ti
  • KateH - Thursday, July 2, 2015 - link

    C'mon, Fury isn't even close to the Geforce FX level of fail. It's really hard to overstate how bad the FX5800 was, compared to the Radeon 9700 and even the Geforce 4600Ti.

    The Fury X wins some 4K benchmarks, the 980Ti wins some. The 980Ti uses a bit less power but the Fury X is cooler and quieter.

    Geforce FX level of fail would be if the Fury X was released 3 months from now to go up against the 980Ti with 390X levels of performance and an air cooler.
  • Thatguy97 - Thursday, July 2, 2015 - link

    To be fair the 5950 ultra was actually decent
  • Morawka - Thursday, July 2, 2015 - link

    you're understating nvidia's scores.. they won 90% of all benchmarks, not just "some". a full 120W more power under furmark load and they are using HBM!!
  • looncraz - Thursday, July 2, 2015 - link

    Furmark power load means nothing, it is just a good way to stress test and see how much power the GPU is capable of pulling in a worst-case scenario and how it behaves in that scenario.

    While gaming, the difference is miniscule and no one will care one bit.

    Also, they didn't win 90% of the benchmarks at 4K, though they certainly did at 1440. However, the real world isn't that simple. A 10% performance difference in GPUs may as well be zero difference, there are pretty much no game features which only require a 10% higher performance GPU to use... or even 15%.

    As for the value argument, I'd say they are about even. The Fury X will run cooler and quieter, take up less space, and will undoubtedly improve to parity or beyond the 980Ti in performance with driver updates. For a number of reasons, the Fury X should actually age better, as well. But that really only matters for people who keep their cards for three years or more (which most people usually do). The 980Ti has a RAM capacity advantage and an excellent - and known - overclocking capacity and currently performs unnoticeably better.

    I'd also expect two Fury X cards to outperform two 980Ti cards with XFire currently having better scaling than SLI.
  • chizow - Thursday, July 2, 2015 - link

    The differences in minimums aren't miniscule at all, and you also seem to be discounting the fact 980Ti overclocks much better than Fury X. Sure XDMA CF scales better when it works, but AMD has shown time and again, they're completely unreliable for timely CF fixes for popular games to the point CF is clearly a negative for them right now.
