Total War: Attila

The second strategy game in our benchmark suite, Total War: Attila is the latest game in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a good system on both ends of the equation to do well here. In this case the game comes with a built-in benchmark that plays out over a large area with a fortress in the middle, making it a good GPU stress test.

Total War: Attila - 3840x2160 - Max Quality + Perf Shadows

Total War: Attila - 3840x2160 - Quality + Perf Shadows

Total War: Attila - 2560x1440 - Max Quality + Perf Shadows

In creating Attila, the developers at Creative Assembly sought to push the limits of current-generation video cards, and nowhere is this more evident than at 4K Max Quality. At 23.5fps even the GTX Titan X is foiled here, never mind the GTX 980 and GK110 cards. To get single-card performance above 30fps we have to drop a notch to the “Quality” setting, which gets the GTX Titan X up to 44.9fps. In any case, at these settings the GTX Titan X makes easy work of the single-GPU competition, beating everything else by 30-66%.

Alternatively we can drop from 4K to 1440p and still run Max Quality, in which case the GTX Titan X delivers a very similar 47.1fps.
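For readers curious how a "beats everything else by 30-66%" figure falls out of the raw numbers, the lead is simply the ratio of the two average framerates, minus one. A minimal sketch follows; the competitor framerate used here is hypothetical, chosen only to illustrate the arithmetic (the 44.9fps Titan X result is from the charts above):

```python
def percent_lead(fps_a: float, fps_b: float) -> float:
    """Return card A's percentage performance lead over card B."""
    return (fps_a / fps_b - 1.0) * 100.0

titan_x = 44.9      # GTX Titan X at 4K "Quality" (from the benchmark above)
competitor = 34.5   # hypothetical competing single-GPU card, for illustration

print(f"{percent_lead(titan_x, competitor):.1f}%")  # → 30.1%
```

The same formula applies to any pair of bars in the charts; note that a 30% lead in framerate is not the same as a 30% reduction in frame time.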

276 Comments

  • cmoney408 - Tuesday, March 24, 2015 - link

    Can you please post the settings you used for the 295X2? Not the in-game settings, but what you used in Catalyst.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    " and the Radeon R9 295X2, the latter of which is down to ~$699 these days and "

    I knew it wouldn't be $699 when I clicked the link...

    It's frikkin $838, $1,176, $990, $978...

    Yep, that's the real AMD card price, not the fantasy one.
  • gianluca - Sunday, April 5, 2015 - link

    Hi!
    Just a question: do you suggest I buy the R9 295X2?
    Thx
  • Kyururin - Wednesday, April 8, 2015 - link

    Umm, I find it pointless to compare the AMD R9 290X with the GTX 980. The R9 290X was built to be competitive with Nvidia's stock 780, not the 780 Ti, and sure as hell not the GTX 980. It's dumb; it's like asking a grandma (R9 290X) to compete with a supermodel (GTX 980) in a beauty pageant. Of course Nvidia is going to win, but it's not like the winning gap is spectacular or something to be astonished about. Last but not least, the GTX 980's lead over the grandma is largest below 2K. Let's not forget that both the GTX 980 and the grandma are built to handle 4K, so given the time Nvidia had to prepare the GTX 980, it should have obliterated the grandma at 4K, but the performance gap is not that big or worth being wowed by, especially in Far Cry 4. Fanboys always bash AMD for their terrible drivers, but it's not like the drivers are being ignored, you dimwit; they are slowly improving them. Did AMD ever say "We are going to pretend that our drivers don't suck and so we are not going to fix them"?
  • alexreffand - Monday, May 18, 2015 - link

    Why is the GTX 580 in the tests? Why not the Titan Z or even the 970?
  • ajboysen - Monday, July 25, 2016 - link

    I'm not sure if the specs have changed since this post, but they list the boost clock speed as 1531 MHz, not 1002.
