GRID Autosport

For the racing game in our benchmark suite we have Codemasters’ GRID Autosport. Codemasters continues to set the bar for graphical fidelity in racing games, delivering realistic-looking environments layered with additional graphical effects. Based on their in-house EGO engine, GRID Autosport includes a DirectCompute-based advanced lighting system in its highest quality settings, which incurs a significant performance penalty on lower-end cards but does a good job of emulating more realistic lighting within the game world.
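
For readers curious what a DirectCompute pass looks like at the API level, below is a minimal sketch of dispatching a tiled compute-shader lighting pass under Direct3D 11. It is illustrative only: the function, resource names, and 16x16 tile size are our assumptions, not Codemasters' actual EGO engine code.

```cpp
#include <d3d11.h>

// Minimal sketch: run one compute-shader lighting pass over the frame,
// one thread group per 16x16 pixel tile. All names here are hypothetical.
void DispatchTiledLighting(ID3D11DeviceContext* ctx,
                           ID3D11ComputeShader* lightingCS,        // compiled CS (assumed)
                           ID3D11ShaderResourceView* gbufferSRV,   // depth/normals in
                           ID3D11UnorderedAccessView* litUAV,      // lit HDR buffer out
                           UINT width, UINT height)
{
    ctx->CSSetShader(lightingCS, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &gbufferSRV);
    const UINT noCounter = 0;
    ctx->CSSetUnorderedAccessViews(0, 1, &litUAV, &noCounter);

    // Round up so partial tiles along the right/bottom edges are covered.
    const UINT tile = 16;
    ctx->Dispatch((width + tile - 1) / tile, (height + tile - 1) / tile, 1);

    // Unbind the UAV so later passes can read the lit buffer as an SRV.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, &noCounter);
}
```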

GRID Autosport - 3840x2160 - Ultra Quality

GRID Autosport - 2560x1440 - Ultra Quality

Unfortunately for AMD, after a streak of wins and ties, things start going off the rails with GRID. Very off the rails.

At 4K Ultra the R9 Fury X posts its single biggest performance deficit at this resolution, trailing the GTX 980 Ti by 14%. The good news is that in the process the card cracks 60fps, so framerates are solid on an absolute basis, though there will still be some dips below 60fps for racing purists to contend with.

Where things get really bad is at 1440p, in a situation we have never seen before in a high-end AMD video card review. The R9 Fury X gets pummeled here, trailing the GTX 980 Ti by 30% and even falling behind the GTX 980 and GTX 780 Ti. The reason it’s getting pummeled is that the R9 Fury X is CPU bottlenecked: no matter what resolution we pick, the card can’t spit out more than about 82fps at Ultra quality.
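
To see why the cap is resolution-independent, consider a simple frame-time model: each frame costs roughly the larger of the CPU time and the GPU time, so once the driver's CPU cost exceeds the GPU's rendering cost, lowering the resolution no longer helps. The sketch below uses hypothetical GPU frame times; only the ~82.6fps cap is taken from our results.

```cpp
#include <algorithm>
#include <cstdio>

// A frame can't complete faster than the slower of the CPU and GPU.
static double fps(double cpuMs, double gpuMs) {
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    const double cpuMs = 1000.0 / 82.6; // ~12.1 ms CPU/driver floor observed at Ultra
    // GPU frame times below are assumptions for illustration only.
    printf("4K:    %.1f fps\n", fps(cpuMs, 16.0)); // GPU-bound: ~62.5 fps
    printf("1440p: %.1f fps\n", fps(cpuMs,  7.0)); // CPU-bound: caps at ~82.6 fps
    printf("1080p: %.1f fps\n", fps(cpuMs,  4.0)); // still ~82.6 fps
    return 0;
}
```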

With GPU performance outgrowing CPU performance year after year, this is something that was due to happen sooner or later, and it is a big reason that low-level APIs are about to come into the fold. And if it was going to happen anywhere, it would happen with a flagship-level video card. Still, with an overclocked Core i7-4960X driving our testbed, this is also one of the most powerful systems available with respect to CPU performance, so AMD’s drivers are burning an incredible amount of CPU time here.
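
As a back-of-the-envelope illustration of why low-level APIs help here: if every draw call costs the driver a fixed slice of CPU time, that per-draw cost alone sets an fps ceiling, and cutting it raises the ceiling proportionally. All numbers below are assumptions for illustration, not measured driver costs.

```cpp
#include <cstdio>

// fps ceiling imposed purely by per-draw-call CPU/driver overhead.
static double cpuCeiling(int draws, double usPerDraw) {
    return 1e6 / (draws * usPerDraw);
}

int main() {
    const int draws = 4000;       // assumed draw calls per frame
    const double dx11Us = 3.0;    // assumed per-draw driver cost on a DX11 path
    const double lowLvlUs = 0.75; // assumed cost with a low-level API (DX12/Vulkan)
    printf("DX11 ceiling:      %.0f fps\n", cpuCeiling(draws, dx11Us));   // ~83 fps
    printf("Low-level ceiling: %.0f fps\n", cpuCeiling(draws, lowLvlUs)); // ~333 fps
    return 0;
}
```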

Ultimately GRID serves to cement our concerns about AMD’s performance at 1440p, as it’s very possible that this is just the tip of the iceberg. DirectX 11 will go away eventually, but that will take some time, and in the meantime there are a number of 1440p gamers out there, for whom the R9 Fury X is otherwise such a good fit for high frame rate gaming. Perhaps the biggest issue is that this makes it very hard to justify pairing 1440p 144Hz monitors with AMD’s GPUs: although 82.6fps is fine for a 60Hz monitor, these CPU issues make it hard for AMD to deliver framerates more suitable for those high performance monitors.

Comments

  • Scali - Tuesday, July 7, 2015 - link

    Even better, various vendors (including Asus and Gigabyte) sell a short version of the GTX 970, so it can take on the Nano card directly as a good choice for a mini-ITX based HTPC.
    And unlike the Nano, the 970 DOES have HDMI 2.0, so you can get 4K at 60 Hz on your TV.
  • Oxford Guy - Thursday, July 9, 2015 - link

    28 GB/s + XOR contention is fast performance indeed, at half the speed of a midrange card from 2007.
  • Gothmoth - Monday, July 6, 2015 - link

    so in short another BULLDOZER.... :-(

    after all the hype not enough and too late.

    i agree the card is not bad.. but after all the HYPE it IS a disappointment.

    OC results are terrible... and AMD said it would be an overclocker's dream.

    add to that that i read many complaints about the noisy water cooler (yes, for retail versions, not early preview versions).
  • iamserious - Monday, July 6, 2015 - link

    It looks ugly. Lol
  • iamserious - Monday, July 6, 2015 - link

    Also, I understand it's a little early, but I thought this card was supposed to blow the GTX 980 Ti out of the water with its new memory. The performance-to-price ratio is decent but I was expecting a bit larger jump in performance. Perhaps with driver updates things will change.
  • Scali - Tuesday, July 7, 2015 - link

    Hum, unless I missed it, I didn't see any mention of the fact that this card only supports DX12 feature level 12_0, where nVidia's 9xx-series supports 12_1.
    That, combined with the lack of HDMI 2.0 and the 4 GB limit, makes the Fury X a poor choice for the longer term. It is a dated architecture, pumped up to higher performance levels.
  • FMinus - Tuesday, July 7, 2015 - link

    Whilst it's beyond me why they skimped on HDMI 2.0, there are adapters if you really want to run this card on a TV. It's not such a huge drama though; the cards will drive DP monitors in the vast majority of cases, so I'm much more sad about the missing DVI out.
  • Scali - Wednesday, July 8, 2015 - link

    I think the reason why there's no HDMI 2.0 is simple: they re-used their dated architecture, and did not spend time on developing new features, such as HDMI 2.0 or 12_1 support.

    With nVidia already having this technology on the market for more than half a year, AMD is starting to drop behind. They were losing sales to nVidia, and their new offerings don't seem compelling enough to regain their lost marketshare, hence their profits will be limited, hence their investment in R&D for the next generation will be limited. Which is a problem, since they need to invest more just to get where nVidia already is.
    It looks like they may be going down the same downward spiral as their CPU division.
  • sa365 - Tuesday, July 7, 2015 - link

    Well at least AMD aren't cheating by allowing the driver to remove AF regardless of what settings are selected in game, just so they can win benchmarks.
    How about some fair, like-for-like benchmarking so we can see where these cards really stand.
  • FourEyedGeek - Tuesday, July 7, 2015 - link

    As for the consoles having 8 GB of RAM, not only is that shared, but the OS uses 3 GB to 3.5 GB, meaning there is only a max of about 5 GB for games on those consoles. A typical PC being used with this card will have 8 to 16 GB plus the 4 GB on the card, giving a total of 12 GB to 20 GB.

    In all honesty at 4K resolutions, how important is Anti-Aliasing on the eye? I can't imagine it being necessary at all, let alone 4xMSAA.
