Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest game in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a good system on both ends of the equation to do well here. In this case the game comes with a built-in benchmark that plays out over a forested area with a large number of units, which definitely stresses the GPU in particular.

For this game in particular we’ve also gone and turned the shadows down to Medium. Rome’s shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from becoming CPU bottlenecked nearly as easily.

With Rome 2 no one is getting 60fps at 2560, but then again as a strategy game it’s hardly necessary. Here the 290X once again beats the GTX 780, this time by a smaller than average 6%, essentially sitting in the middle of the gap between the GTX 780 and GTX Titan.

Meanwhile at 4K we can actually get some relatively strong results out of even our single-card configurations, but we have to drop our settings down by 2 notches to Very High to do so. Like all of our 4K game tests, this one turns out well for AMD, with the 290X’s lead growing to 13%.

AFR performance is a completely different matter though. It’s not unusual for strategy games to scale poorly or not at all, but Rome 2 is a different case yet again. The GTX 780 SLI consistently doesn’t scale at all, while the 290X CF sees anything from massive negative scaling at 2560 to a small performance gain at 4K. Given the nature of the game we weren’t expecting anything here at all, and though getting any scaling is a nice turn of events, negative scaling like this is a bit embarrassing for AMD. At least NVIDIA can claim to be more consistent here.
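
To be concrete about what "scaling" means here, the short Python sketch below shows how a multi-GPU scaling percentage is typically derived from single-card and dual-card framerates: 100% would be perfect doubling, 0% is no benefit, and anything negative means the AFR pair is actually slower than a single card. The framerates in the example are purely illustrative and are not our benchmark numbers.

    def afr_scaling(single_gpu_fps, dual_gpu_fps):
        """Return multi-GPU (SLI/CrossFire) scaling as a percentage.

        100% -> perfect doubling of performance
          0% -> the second GPU adds nothing
         <0% -> negative scaling: the AFR pair is slower than one card
        """
        return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

    # Purely illustrative numbers, not captured benchmark data:
    print(afr_scaling(50.0, 98.0))   # ~96%  -> near-perfect scaling
    print(afr_scaling(50.0, 51.0))   # ~2%   -> effectively no scaling
    print(afr_scaling(50.0, 35.0))   # ~-30% -> negative scaling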

Without working AFR scaling, our deltas are limited to single-GPU configurations and as a result are unremarkable. Sub-3% for everyone, everywhere, which is a solid result for any single-GPU setup.
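
For reference, the deltas in question are a frame time consistency measure. The sketch below is a minimal illustration of one common way to formulate such a metric (an assumption for illustration, not our exact methodology): average the frame-to-frame change in frame times and express it as a percentage of the average frame time, so lower means smoother pacing.

    def delta_percentage(frame_times_ms):
        """Average frame-to-frame change in frame time, expressed as a
        percentage of the average frame time. Lower is smoother."""
        deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
        avg_delta = sum(deltas) / len(deltas)
        avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
        return avg_delta / avg_frame_time * 100.0

    # Hypothetical frame times in milliseconds, not captured benchmark data:
    consistent = [16.6, 16.8, 16.5, 16.9, 16.7]
    print(round(delta_percentage(consistent), 2))  # ~1.65 -> very consistent pacing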

Comments

  • itchyartist - Thursday, October 24, 2013 - link

    Incredible performance and value from AMD!

    The fastest single chip video card in the world. Overall it is faster than the nvidia Titan and only $549! Almost half the price!

    Truly great to see the best performance around at a cost that is not bending you over. Battlefield 4 with AMD Mantle is just around the corner. These new 290X GPUs are going to be uncontested Kings of the Hill for the Battlefield 4 game. Free Battlefield game with the 290X too. Must buy.

    Incredible!
  • Berzerker7 - Thursday, October 24, 2013 - link

    ...really? The card is $600. You reek of AMD PR.
  • Novulux - Thursday, October 24, 2013 - link

    It says $549 in this very review?
  • Berzerker7 - Thursday, October 24, 2013 - link

    It does indeed. His article still smells like pre-written script.
  • siliconwizard - Thursday, October 24, 2013 - link

    Like all the reviews state, the GTX Titan is now irrelevant. The 290X took the crown and saved the wallet.
  • siliconwizard - Thursday, October 24, 2013 - link

    Thinking that sphere toucher's comment is accurate. Bit of salt here over amd taking over the high end slot and ridiculing the titan card. Only going to get worse once the Mantle-enabled games are released. Nvidia is finished for battlefield 4. Crushed by amd, 290x and mantle.
  • MousE007 - Thursday, October 24, 2013 - link

    Mantle.....lol, nvidia G-Sync just killed AMD
  • ninjaquick - Thursday, October 24, 2013 - link

    lol? A G-Sync type solution is a good candidate for being integrated into a VESA standard and made part of the display's information that is exchanged through DP/HDMI/DVI, so all AMD would need to do is make sure their drivers are aware that they can send frames to the screen as soon as they are finished. The best part would be that, with the whole Mantle deal, AMD would probably expose this to the developer, allowing them to determine when frames are 'G-Sync'd' and when they are not.
  • MousE007 - Thursday, October 24, 2013 - link

    No, there is a "handshake" between the GPU and the monitor or TV, so it will not be supported with any other brand.
  • inighthawki - Thursday, October 24, 2013 - link

    You do realize that it can still be put into the VESA standard, right? Then only GPUs supporting the standard can take advantage of it. Also ANYONE who believes that GSync OR Mantle is going to "kill the other" is just an idiot.
