Civilization: Beyond Earth

Shifting gears from action to strategy, we have Civilization: Beyond Earth, the latest entry in the Civilization series of strategy games. Civilization is not quite as GPU-demanding as some of our action games, but at Ultra quality it can still pose a challenge for even high-end video cards. Meanwhile, as the first Mantle-enabled strategy title, Civilization gives us an interesting look into low-level API performance in larger-scale games, along with a look at developer Firaxis’s use of split frame rendering with Mantle to reduce latency rather than to improve framerates.
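
To make the split frame rendering idea concrete, below is a minimal, hypothetical sketch (not Firaxis’s actual Mantle renderer, whose internals aren't published here) of the basic concept: each GPU in a multi-GPU setup owns a slice of the same frame instead of alternating whole frames, which is where the latency reduction comes from.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative only. With split frame rendering (SFR), every GPU works on a
// slice of the *same* frame, so a finished frame is never queued behind a
// whole in-flight frame the way it is with alternate frame rendering (AFR).

struct Slice {
    uint32_t x, y, width, height;  // screen-space region owned by one GPU
};

// Divide the screen into one horizontal band per GPU.
std::vector<Slice> splitFrame(uint32_t screenW, uint32_t screenH, uint32_t gpuCount) {
    std::vector<Slice> slices;
    uint32_t bandH = screenH / gpuCount;
    for (uint32_t i = 0; i < gpuCount; ++i) {
        uint32_t y = i * bandH;
        // The last GPU absorbs any leftover rows from the integer division.
        uint32_t h = (i == gpuCount - 1) ? (screenH - y) : bandH;
        slices.push_back({0, y, screenW, h});
    }
    return slices;
}

int main() {
    const uint32_t gpuCount = 2;  // e.g. a CrossFire pair
    auto slices = splitFrame(3840, 2160, gpuCount);

    // In a real renderer each slice would become a scissor/viewport for the
    // command buffers submitted to that GPU; here we just print the split.
    for (size_t i = 0; i < slices.size(); ++i) {
        std::cout << "GPU " << i << " renders rows " << slices[i].y
                  << ".." << (slices[i].y + slices[i].height - 1) << "\n";
    }
    return 0;
}
```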

Civilization: Beyond Earth - 3840x2160 - Ultra Quality

Civilization: Beyond Earth - 2560x1440 - Ultra Quality

Unlike Battlefield 4, where we needed to switch back to DirectX on the R9 Fury X for performance reasons, AMD’s latest card still holds up rather well under Mantle here, likely because Civilization is a newer game. Though not drawn in this chart, what we find is that AMD loses a frame or two per second by running Mantle, but in return sees far, far better minimums (more on that later).

Overall then, the R9 Fury X looks pretty good at 4K. Even at Ultra quality it delivers a better than 60fps average and comes within 2% of the GTX 980 Ti. On the other hand, AMD struggles a bit more at 1440p, where the absolute framerate is still rather high, but relative to the GTX 980 Ti there is now an 11% performance gap. This being a Mantle game, the fact that AMD falls behind is a bit surprising, as at a high level the card should be enjoying the CPU benefits of the low-level API. We’ll revisit 1440p performance a bit later on, but this is going to be a recurring quirk for AMD, and a detriment for owners of 1440p 144Hz monitors.

Civilization: Beyond Earth - Min. Frame Rate - 3840x2160 - Ultra Quality

Civilization: Beyond Earth - Min. Frame Rate - 2560x1440 - Ultra Quality

The bigger advantage of Mantle is really in the minimum framerates, and here the R9 Fury X soars. At 4K the R9 Fury X delivers a minimum framerate of 50.5fps, some 20% better than the GTX 980 Ti. Both cards do well enough here, but it goes without saying that this is a very distinct difference, and one that is well in AMD’s favor. The only downside for AMD is that the advantage doesn’t hold at 1440p, where the card goes back to trailing the GTX 980 Ti in minimum framerates by 7%.
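
As a quick illustration of why averages and minimums can diverge like this, here is a generic sketch of how the two figures can be derived from a frametime capture. This is not necessarily the exact methodology used for these charts (the review doesn't spell it out), and the frame times below are made-up values:

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds from a benchmark run.
    std::vector<double> frameTimesMs = {15.2, 16.1, 15.8, 33.0, 15.5, 16.4, 15.9, 30.5};

    double totalMs = 0.0;
    for (double t : frameTimesMs) totalMs += t;

    // Average FPS: total frames over total elapsed time.
    double avgFps = 1000.0 * frameTimesMs.size() / totalMs;

    // One common definition of "minimum FPS": the slowest single frame.
    double worstMs = *std::max_element(frameTimesMs.begin(), frameTimesMs.end());
    double minFps = 1000.0 / worstMs;

    std::cout << "Average: " << avgFps << " fps, minimum: " << minFps << " fps\n";
    // A run can post a healthy average while occasional slow frames drag the
    // minimum down -- which is the gap Mantle appears to narrow for AMD here.
    return 0;
}
```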

On that note, I do have one concern with AMD’s support plans for Mantle. Mainly, I’m worried that, as well as the R9 Fury X does here, there is a risk Mantle support may stop working in the future. The GCN 1.2 based R9 285 can’t use the Mantle path at all (it crashes), and the R9 Fury X is not all that different in architecture.

Comments

  • Scali - Tuesday, July 7, 2015 - link

    Even better, various vendors sell a short version of the GTX 970 (including Asus and Gigabyte, for example), so it can take on the Nano card directly as a good choice for a mini-ITX based HTPC.
    And unlike the Nano, the 970 DOES have HDMI 2.0, so you can get 4K 60 Hz on your TV.
  • Oxford Guy - Thursday, July 9, 2015 - link

    28 GB/s + XOR contention is fast performance indeed, at half the speed of a midrange card from 2007.
  • Gothmoth - Monday, July 6, 2015 - link

    so in short another BULLDOZER.... :-(

    after all the hype not enough and too late.

    i agree the card is not bad.. but after all the HYPE it IS a disappointment.

    OC results are terrible... and AMD said it would be an overclocker's dream.

    add to that the many complaints I've read about the noisy water cooler (yes, for retail versions, not early preview versions).
  • iamserious - Monday, July 6, 2015 - link

    It looks ugly. Lol
  • iamserious - Monday, July 6, 2015 - link

    Also, I understand it's a little early, but I thought this card was supposed to blow the GTX 980 Ti out of the water with its new memory. The performance-to-price ratio is decent, but I was expecting a somewhat larger jump in performance. Perhaps with driver updates things will change.
  • Scali - Tuesday, July 7, 2015 - link

    Hum, unless I missed it, I didn't see any mention of the fact that this card only supports DX12 feature level 12_0, whereas nVidia's 9xx series supports 12_1.
    That, combined with the lack of HDMI 2.0 and the 4 GB limit, makes the Fury X a poor choice for the longer term. It is a dated architecture, pumped up to higher performance levels.
  • FMinus - Tuesday, July 7, 2015 - link

    While it's beyond me why they skimped on HDMI 2.0, there are adapters if you really want to run this card on a TV. It's not such a huge drama though; in the vast majority of cases these cards will be driving DP monitors, so I'm much more sad about the missing DVI out.
  • Scali - Wednesday, July 8, 2015 - link

    I think the reason why there's no HDMI 2.0 is simple: they re-used their dated architecture, and did not spend time on developing new features, such as HDMI 2.0 or 12_1 support.

    With nVidia having had this technology on the market for more than half a year already, AMD is starting to fall behind. They were already losing sales to nVidia, and their new offerings don't seem compelling enough to regain the lost market share, so their profits will be limited, and in turn their investment in R&D for the next generation will be limited. That is a problem, since they need to invest more just to get to where nVidia already is.
    It looks like they may be going down the same downward spiral as their CPU division.
  • sa365 - Tuesday, July 7, 2015 - link

    Well, at least AMD aren't cheating by allowing the driver to remove AF regardless of what settings are selected in game, just so they can win benchmarks.
    How about some fair, like-for-like benchmarking so we can see where these cards really stand?
  • FourEyedGeek - Tuesday, July 7, 2015 - link

    As for the consoles having 8 GB of RAM, not only is that shared, but the OS uses 3 GB to 3.5 GB, meaning there is only a maximum of about 5 GB for games on those consoles. A typical PC paired with this card will have 8 to 16 GB of system RAM plus the 4 GB on the card, giving a total of 12 GB to 20 GB.

    In all honesty, at 4K resolutions, how important is anti-aliasing to the eye? I can't imagine it being necessary at all, let alone 4xMSAA.
