Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest game in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so doing well here takes a good system on both ends of the equation. The game comes with a built-in benchmark that plays out over a forested area with a large number of units, which stresses the GPU in particular.

For this game in particular we've also turned the shadows down to medium. Rome's shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from becoming CPU bottlenecked nearly as easily.

Total War: Rome 2 - 3840x2160 - Very High Quality + Med. Shadows

Total War: Rome 2 - 2560x1440 - Extreme Quality + Med. Shadows

For the moment we are including Total War: Rome II as a "freebie" in this review, as neither AMD nor NVIDIA is able to properly render this game. A recent patch made the game AFR friendly, unlocking multi-GPU scaling that hadn't been available for the several months prior. However, due to what's presumably an outstanding bug in the game, when using CF/SLI we're seeing different rendering artifacts on AMD and NVIDIA cards alike.

Given the nature of the artifacting we suspect that performance will remain roughly the same once the problem is resolved, in which case the 295X2 will hold a small but significant lead, but there is no way to know for sure until the rendering issue is corrected. In the meantime this is progress for all multi-GPU cards, even if the game’s developers don’t have it perfected quite yet.

131 Comments

  • eotheod - Tuesday, April 8, 2014 - link

    Same performance as crossfire 290X? Might be time to do a Mini-ITX build. Half the price of Titan Z also makes it a winner.
  • Torrijos - Tuesday, April 8, 2014 - link

    A lot of compute benchmarks see no improvement over a single 290X...
    What is happening?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Most of these compute benchmarks do not scale with multiple GPUs. We include them for completeness, if only to not so subtly point out that not everything scales well.
  • CiccioB - Tuesday, April 8, 2014 - link

    Why not add more real-life computing tests like iRay that run on both CUDA and OpenCL?
    Synthetic tests are really meaningless as they depend more on the particular instructions used to do... ermm.. nothing?
  • fourzeronine - Tuesday, April 8, 2014 - link

    iRay runs on CUDA only. LuxRender should be used for GPU raytrace benchmarking. http://www.luxrender.net/wiki/LuxMark

    Although the best renderers that support OpenCL are hybrid systems that only solve some of the problems on GPU and a card like this would never be fully utilized.

    The best OpenCL benchmark to have would be an Agisoft PhotoScan dense point cloud generation.
  • Musaab - Wednesday, April 9, 2014 - link

    I have one question: why didn't you use two R9 290Xs with water cooling, or two GTX 780 Tis with water cooling? I hate this marketing mumbo jumbo. If I'm going to pay this kind of money I will choose two of the cards above with water cooling, and with some OC work they will leave this card in the dust. And for the same money I can buy two R9 290s or two GTX 780s.
  • Musaab - Wednesday, April 9, 2014 - link

    Sorry, I meant three R9 290s or three GTX 780s.
  • spartaman64 - Sunday, June 1, 2014 - link

    I doubt you can afford three of them plus water cooling, and three of them would have a very high TDP. Many people would also run into space constraints, and the R9 295X2 outperforms two 780 Tis in SLI.
  • krutou - Tuesday, April 22, 2014 - link

    Because water blocks and radiators don't grow on trees. Reviewers only test what they're given, all of which are stock.
  • patrickjp93 - Friday, May 2, 2014 - link

    They pretty much do grow on trees. You can get even a moderately good liquid cooling loop for 80 bucks.
