Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest entry in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a strong system on both ends of the equation to do well here. The game ships with a built-in benchmark that plays out over a forested area with a large number of units, which stresses the GPU in particular.

For this game in particular we have also turned shadows down to medium. Rome's shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from becoming CPU bottlenecked nearly as easily.

With Rome 2 no card reaches 60fps at 2560, but as a strategy game it hardly needs to. Here the 290X once again beats the GTX 780, this time by a smaller than average 6%, essentially sitting in the middle of the gap between the GTX 780 and GTX Titan.

Meanwhile at 4K we can get relatively strong results even out of our single-card configurations, though we have to drop our settings down two notches to Very High to do so. Like the rest of our 4K game tests this one turns out well for AMD, with the 290X's lead growing to 13%.

AFR performance is a completely different matter, though. It's not unusual for strategy games to scale poorly or not at all, but Rome 2 is stranger still. GTX 780 SLI consistently doesn't scale at all, while 290X CF ranges from massive negative scaling at 2560 to a small performance gain at 4K. Given the nature of the game we weren't expecting any scaling here, and while getting some at 4K is a nice turn of events, negative scaling like this at 2560 is a bit embarrassing for AMD. At least NVIDIA can claim to be more consistent here.
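To make the scaling comparison concrete, here is a minimal sketch of how a multi-GPU scaling factor can be computed from average frame rates. The helper function and the sample numbers are purely illustrative and are not our measured results; a factor above 1.0x indicates a gain from the second card, while a factor below 1.0x is negative scaling of the kind the 290X CF exhibits at 2560.

```python
def afr_scaling(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Return the AFR scaling factor relative to a single card.

    2.0x is perfect scaling, 1.0x is no scaling, and anything
    below 1.0x means the second card actually hurt performance.
    """
    if single_gpu_fps <= 0:
        raise ValueError("single-GPU frame rate must be positive")
    return dual_gpu_fps / single_gpu_fps


# Hypothetical numbers for illustration only:
print(f"{afr_scaling(45.0, 32.0):.2f}x")  # ~0.71x -> negative scaling
print(f"{afr_scaling(45.0, 48.0):.2f}x")  # ~1.07x -> a small gain
```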

Without working AFR scaling, our frame time deltas are limited to the single-GPU configurations and are consequently unremarkable: sub-3% for everyone, everywhere, which is a solid result for any single-GPU setup.
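As a point of reference for how a frame time delta metric like this can be derived from a frame time log, the sketch below computes the mean absolute frame-to-frame variation as a percentage of the average frame time. This is an assumption about the general shape of such a metric rather than our exact tooling, and the sample log is made up for illustration.

```python
def delta_percentage(frame_times_ms: list[float]) -> float:
    """Mean absolute frame-to-frame variation, expressed as a
    percentage of the average frame time. Lower is smoother."""
    if len(frame_times_ms) < 2:
        raise ValueError("need at least two frame times")
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * (sum(deltas) / len(deltas)) / avg_frame_time


# Hypothetical frame time log in milliseconds, for illustration only:
log = [16.6, 16.9, 16.5, 17.1, 16.7, 16.8]
print(f"{delta_percentage(log):.2f}%")  # well under 3% -> a very consistent run
```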

Comments

  • mr_tawan - Tuesday, November 5, 2013

    The AMD card may suffer from its loud cooler. Let's just hope that the OEM versions ship with quieter coolers.
  • 1Angelreloaded - Monday, November 11, 2013

    I have to be honest here: it is a beast. In fact, the only thing in my mind holding it back is its lack of features compared to NVIDIA, namely PhysX. To me that is a bit of a deal breaker: for $150 more the 780 Ti gives me that, with a lower TDP and sound profile, since we can only pull so much from a single 120V breaker without tripping it, and modification is a deal breaker for some people because of where they live. Honestly, what I really need to see from a site is 4K gaming at max settings, plus 1600p/1200p/1080p benchmarks with single cards as well as SLI/CrossFire, to see how they scale against each other.

    To be clear, I'd also like a benchmark using Skyrim modded to the gills with high-resolution textures, to fully see how VRAM might affect these cards in future games from this next-gen era, where the consoles can natively manage higher texture resolutions; ultimately that will affect PC performance as the standard 1-2K textures double to 4K, or in a select few cases even 8K. With a native 64-bit architecture you will also be able to draw more system RAM into the equation (Skyrim can use a max of 3.5GB before it dies). With Maxwell coming out, with a shared memory pool and a single-core microprocessor on the die itself, plus G-Sync for smoothness, we might see an over-engineered GPU capable of much, much more than we thought. ATI has their own ideas as well, which will progress. I have a strong feeling Hawaii is actually a reject of sorts, because they have to compete with Maxwell and engineer more into the cards themselves.
  • marceloviana - Monday, November 25, 2013

    I'm just wondering why this card comes with 32Gb of GDDR5 yet only shows 4GB. The PCB shows 16 Elpida EDW2032BBBG chips (2Gb each). This amount of memory would help a lot in large scenes with V-Ray RT.
  • Mat3 - Thursday, March 13, 2014

    I don't get it. It's supposed to have 11 compute units per shader engine, making 44 on the entire chip. But the 2nd picture says each shader engine can only have up to 9 compute units....?
  • Mat3 - Thursday, March 13, 2014

    The 2nd picture on page three, I mean.
  • sanaris - Monday, April 14, 2014

    Who cares? This card was never meant to compute anything.

    It's supposed to be "cheap but decent".
    Initially they set a ridiculous price, but now it goes for around $200-350 on eBay.
    At $200 it's worth its price, because it can only be used to play games.
    Anyone who wants to play games at medium quality (not future ones) may prefer it.
