GRID Autosport

For the racing game in our benchmark suite we have Codemasters’ GRID Autosport. Codemasters continues to set the bar for graphical fidelity in racing games, delivering realistic looking environments layered with additional graphical effects. Based on their in-house EGO engine, GRID Autosport includes a DirectCompute-based advanced lighting system in its highest quality settings, which incurs a significant performance penalty on lower-end cards but does a good job of emulating more realistic lighting within the game world.
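
We can’t confirm exactly how EGO’s lighting pass is structured, but DirectCompute lighting systems of this era commonly revolve around tiled light culling: the screen is divided into small tiles, a compute shader tests every light against each tile, and the shading pass then only evaluates the lights that can actually reach a given pixel. A minimal CPU-side sketch of the idea in Python follows; the tile size, light representation, and all names here are illustrative assumptions, not EGO’s actual code.

```python
# Hypothetical sketch of tiled light culling, the core idea behind many
# DirectCompute lighting systems. Tile size and data layout are assumptions.

from dataclasses import dataclass

TILE = 16  # pixels per tile side; a GPU runs one compute thread group per tile

@dataclass
class Light:
    x: float       # screen-space center
    y: float
    radius: float  # falloff radius in pixels

def cull_lights(width, height, lights):
    """Return, for each tile, the indices of the lights that can touch it."""
    tiles = {}
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            visible = []
            for i, light in enumerate(lights):
                # Clamp the light's center to the tile rectangle; if the
                # closest point is within the light's radius, keep the light.
                cx = min(max(light.x, tx), tx + TILE)
                cy = min(max(light.y, ty), ty + TILE)
                if (light.x - cx) ** 2 + (light.y - cy) ** 2 <= light.radius ** 2:
                    visible.append(i)
            tiles[(tx // TILE, ty // TILE)] = visible
    return tiles

# The shading pass loops only over each tile's short light list instead of all
# lights, which is why the extra compute pass pays off on big scenes but adds
# a fixed cost that weighs on lower-end cards.
lights = [Light(100, 80, 50), Light(500, 300, 120)]
print(len(cull_lights(1920, 1080, lights)), "tiles culled")
```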

GRID Autosport - 3840x2160 - Ultra Quality

GRID Autosport - 2560x1440 - Ultra Quality

Unfortunately for AMD, after a streak of wins and ties, things start going off the rails with GRID. Very off the rails.

At 4K Ultra we find AMD’s single biggest 4K performance deficit, with the card trailing the GTX 980 Ti by 14%. The good news is that in the process the card cracks 60fps, so framerates are solid on an absolute basis, though there will still be some frames below 60fps for racing purists to contend with.

Where things get really bad is at 1440p, in a situation we have never seen before in a high-end AMD video card review. The R9 Fury X gets pummeled here, trailing the GTX 980 Ti by 30% and even falling behind the GTX 980 and GTX 780 Ti. The reason it’s getting pummeled is that the R9 Fury X is CPU bottlenecked: no matter what resolution we pick, the card can’t spit out more than about 82fps at Ultra quality.
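
To put the bottleneck in concrete terms: if the driver and game need a roughly fixed amount of CPU time to prepare each frame, that time sets a hard fps ceiling that no GPU can lift. A quick back-of-the-envelope sketch in Python; the fps cap is from our results, the rest is simple arithmetic.

```python
# Frame-time arithmetic for a CPU-limited game (illustrative only).

def cpu_frame_time_ms(fps_cap: float) -> float:
    """Per-frame CPU cost implied by an observed fps ceiling."""
    return 1000.0 / fps_cap

fury_x_cap = 82.6  # R9 Fury X's ceiling at Ultra quality, at any resolution

print(f"Implied CPU time per frame: {cpu_frame_time_ms(fury_x_cap):.1f} ms")
# ~12.1 ms of CPU work per frame. Dropping the resolution shrinks the GPU's
# share of each frame but not this CPU share, which is why the cap never moves.
```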

With GPU performance outgrowing CPU performance year after year, this is something that was due to happen sooner or later, and it is a big reason that low-level APIs are about to come into the fold. And if it was going to happen anywhere, it would happen with a flagship-level video card. Still, with an overclocked Core i7-4960X driving our testbed, this is also one of the most powerful systems available as far as CPU performance goes, so AMD’s drivers are burning an incredible amount of CPU time here.

Ultimately GRID serves to cement our concerns about AMD’s performance at 1440p, as it’s very possible that this is just the tip of the iceberg. DirectX 11 will go away eventually, but that will take some time, and in the meantime there are a number of 1440p gamers out there, especially with the R9 Fury X otherwise being such a good fit for high frame rate 1440p gaming. Perhaps the biggest issue is that this makes it very hard to justify pairing 1440p 144Hz monitors with AMD’s GPUs: although 82.6fps is fine for a 60Hz monitor, these CPU issues make it hard for AMD to deliver the higher framerates those monitors are meant for.
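
The 144Hz complaint falls out of the same arithmetic: a monitor refreshing every ~6.9ms needs frames delivered faster than the ~12.1ms CPU floor observed above allows. A small illustration, reusing the 82.6fps figure from our results:

```python
# Frame-time budgets for common refresh rates vs. the observed CPU floor.

cpu_floor_ms = 1000.0 / 82.6  # ~12.1 ms per frame, from the fps ceiling

for hz in (60, 120, 144):
    budget_ms = 1000.0 / hz
    verdict = "achievable" if cpu_floor_ms <= budget_ms else "CPU-bound"
    print(f"{hz:3d} Hz wants {budget_ms:5.2f} ms/frame -> {verdict}")
```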

Comments

  • bennyg - Saturday, July 4, 2015 - link

    Marketing performance. Exactly.

    Except efficiency was not good enough across the generations of 28nm GCN, in an era where efficiency plus thermal/power limits constrain performance; look what Nvidia did over a similar era from Fermi (which was at market when GCN 1.0 was released) to Kepler to Maxwell. Plus efficiency is kind of the ultimate marketing buzzword in all areas of tech, and not having any ability to mention it (plus having generally inferior products) hamstrung their marketing all along
  • xenol - Monday, July 6, 2015 - link

    Efficiency is important because of three things:

    1. If your TDP is through the roof, you'll have issues with your cooling setup. Any time you introduce a bigger cooling setup because your cards run that hot, you're going to be mocked for it and people are going to be wary of it. With 22nm or 20nm nowhere in sight for GPUs, efficiency had to be a priority; otherwise you're going to ship cards that take up three slots or ship with water coolers.

    2. You also can't just play to the desktop market. Laptops are still the preferred computing platform, and even if people are going for a desktop, AIOs are looking much more appealing than a monitor/tower combo. So if you want to have any shot in either market, you have to build an efficient chip. And you have to convince people they "need" this chip, because Intel's iGPUs do what most people want just fine anyway.

    3. Businesses and such with "always on" computers would like it if their computers ate less power. Even if you only save a handful of watts per machine, multiply that by thousands and it adds up to an appreciable amount of savings.
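
    A quick sketch of the arithmetic behind point 3 (the fleet size, duty cycle, and electricity rate here are made-up assumptions, not figures from the comment):

    ```python
    # Back-of-the-envelope "always on" power savings. All inputs are assumptions.

    watts_saved = 10           # watts saved per machine
    machines = 5_000           # hypothetical fleet size
    hours_per_year = 24 * 365  # always-on duty cycle
    rate_per_kwh = 0.12        # assumed USD per kWh

    kwh_saved = watts_saved * machines * hours_per_year / 1000
    print(f"{kwh_saved:,.0f} kWh/year -> ${kwh_saved * rate_per_kwh:,.0f}/year saved")
    ```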
  • xenol - Monday, July 6, 2015 - link

    (Also by "computing platform" I mean the platform people choose when they want a computer)
  • medi03 - Sunday, July 5, 2015 - link

    ATI is the reason both Microsoft and Sony use AMD's APUs to power their consoles.
    It might be the reason why APUs even exist.
  • tipoo - Thursday, July 2, 2015 - link

    That was then, this is now. Now AMD, even together with the acquisition, has a lower market cap than Nvidia.
  • Murloc - Thursday, July 2, 2015 - link

    yeah, no.
  • ddriver - Thursday, July 2, 2015 - link

    ATI wasn't bigger; AMD just paid a preposterous and entirely unrealistic amount of money for it. Soon after the merger, AMD + ATI was worth less than what they paid for the latter, ultimately leading to the loss of its foundries and putting it in an even worse position. Let's face it, AMD was, and historically has always been, betrayed; its sole purpose is to create the illusion of competition so that the big boys don't look bad for running unopposed, even if that is what happens in practice.

    Just when AMD got lucky with Athlon, a mole was sent to make sure AMD stays down.
  • testbug00 - Sunday, July 5, 2015 - link

    The foundries didn't go because AMD bought ATI. That might have accelerated it by a few years, however.

    The foundry issue and its cost to AMD date back to the 1990s and 2000-2001.
  • 5150Joker - Thursday, July 2, 2015 - link

    True, AMD was in a much better position in 2006 vs NVIDIA; they just got owned.
  • 3DVagabond - Friday, July 3, 2015 - link

    When was Intel the underdog? Because that's who's knocked them down. (They aren't out yet.)
