Dragon Age: Inquisition

Our RPG of choice for 2015 is Dragon Age: Inquisition, the latest game in the Dragon Age series of ARPGs. With an expansive world that can easily challenge even the best of our video cards, Dragon Age also gives us an alternative take on EA/DICE's Frostbite 3 engine, which powers this game along with Battlefield 4.

Dragon Age: Inquisition - 3840x2160 - Ultra Quality - 0x MSAA

Dragon Age: Inquisition - 3840x2160 - High Quality

Dragon Age: Inquisition - 2560x1440 - Ultra Quality - 0x MSAA

Similar to Battlefield 4, we have swapped out Mantle for DirectX here; the R9 Fury X didn't suffer too much under Mantle, but the API certainly was not in the card's favor.

Perhaps it's a Frostbite thing or maybe AMD just got unlucky here, but Dragon Age is the second-worst showing for the R9 Fury X. The card trails the GTX 980 Ti at all times, by anywhere between 13% and 18%. At this point AMD is straddling the line between the GTX 980 and GTX 980 Ti, and at 1440p the card falls closer to the GTX 980.

Meanwhile I feel this is another good example of why single-GPU cards aren't quite ready yet for no-compromises 4K gaming. Even without MSAA the R9 Fury X can't break out of the 30s; we have to drop to High quality to do that. On the other hand, going to 1440p immediately gets Ultra quality performance over 60fps.

Finally, the R9 Fury X's performance gains over its predecessor are also among its lowest here. The Fiji-based card picks up just 22% at 4K, and less at 1440p. Once again we are likely looking at a bottleneck closer to geometry or ROP performance, which leaves the shaders underutilized.

Comments

  • just4U - Saturday, July 4, 2015 - link

    I thought it was great as well. It had a lot more meat to it than I was expecting. Ryan might have been late to the party but he's getting more feedback than most other sites on his review, so that shows that it was highly anticipated.
  • B3an - Saturday, July 4, 2015 - link

    I don't understand why the Fury X doesn't perform better... Its specs are considerably better than a 290X/390X and its memory bandwidth is far higher than any other card out there... yet it still can't beat the 980 Ti, and should also be faster than it already is compared to the 290X. It just doesn't make sense.
  • just4U - Saturday, July 4, 2015 - link

    Early drivers, and perhaps the changeover to a new form of memory tech has a learning curve that isn't fully realized yet.
  • Oxford Guy - Saturday, July 4, 2015 - link

    Perhaps DX11 is holding it back. As far as I understand it, Maxwell is more optimized for DX11 than AMD's cards are. AMD really should have sponsored a game engine or something so that there would have been a DX12 title available for benchmarkers with this card's launch.
  • dominopourous - Saturday, July 4, 2015 - link

    Great stuff. Can we get a benchmarks with these cards overclocked? I'm thinking the 980 Ti and the Titan X will scale much better with overclocking compared to Fury X.
  • Mark_gb - Saturday, July 4, 2015 - link

    Great review. With 1 exception.

    Once again, the 400 amp number is tossed around as how much power the Fury X can handle. But think about that for one second. Even an EVGA SuperNOVA 1600 G2 power supply is extreme overkill for a system with a single Fury X in it, and its +12V rail only provides 133.3 amps.

    That 400 AMP number is wrong. Very wrong. It should be 400 watts. Push 400 Amps into a Fury X and it most likely would literally explode. I would not want to be anywhere near that event.
  • AngelOfTheAbyss - Saturday, July 4, 2015 - link

    The operating voltage of the Fury chip is probably around 1V, so 400A sounds correct (1V*400A = 400W).
  • meacupla - Saturday, July 4, 2015 - link

    okay, see, it's not 12V * 400A = 4800W. It's 1V (or around 1V) * 400A = 400W
    4800W would trip most 115VAC circuit breakers, as that would be 41A on 115VAC, before you even start accounting for conversion losses.
  • bugsy1339 - Saturday, July 4, 2015 - link

    Anyone hear about Nvidia lowering their graphics quality to get a higher frame rate in reviews vs Fury? Reference is the SemiAccurate forum, 7/3 ("Nvidia reduces IQ to boost performance on 980Ti?")
  • sa365 - Sunday, July 5, 2015 - link

    I too would like to know more re: bugsy1339's comment.

    Have Nvidia been caught out with lower IQ levels forced in the driver?
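The power arithmetic in the comments above is easy to sanity-check: electrical power is simply voltage times current, so the same 400 number can be watts or amps depending on which rail you measure at. A minimal sketch, assuming the roughly 1V core voltage the commenters cite (it is their estimate, not a published spec):

```python
def power_watts(volts, amps):
    """Electrical power in watts: P = V * I."""
    return volts * amps

# ~1 V core rail at 400 A -- consistent with a ~400 W board limit
core_w = power_watts(1.0, 400)    # 400.0 W

# What 400 A would mean if it were drawn on the +12 V rail instead
rail_w = power_watts(12.0, 400)   # 4800.0 W

# Draw on a 115 VAC household circuit, ignoring conversion losses
mains_a = rail_w / 115.0          # ~41.7 A, enough to trip most breakers

print(core_w, rail_w, round(mains_a, 1))
```

This is why both readings of the spec are self-consistent: 400 A at ~1 V on the core is the same ~400 W that the board's power delivery is rated for, while 400 A at 12 V would be an absurd 4.8 kW.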
