Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest game in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a good system on both ends of the equation to do well here. In this case the game comes with a built-in benchmark that plays out over a forested area with a large number of units, stressing the GPU in particular.


For this game in particular we’ve also gone and turned down the shadows to medium. Rome’s shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from CPU bottlenecking nearly as easily.

Total War: Rome 2 - 3840x2160 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 3840x2160 - Very High Quality + Med. Shadows

Total War: Rome 2 - 2560x1440 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 1920x1080 - Extreme Quality + Med. Shadows

Of all of our games, there is no better set of benchmarks for the GTX 980 than Total War: Rome 2. Against both AMD's and NVIDIA's last-generation cards, it never wins by as much as it wins here.

Compared to the GTX 780 Ti, the GTX 980 is a consistent 16-17% ahead at all resolutions. Meanwhile against the R9 290XU this is an 18% lead at 1080p and 1440p; the R9 290XU only begins to catch up at 4K Very High quality, where the GTX 980 still leads by a respectable 8%.

This is also a very strong showing compared to the GTX 680. The overall lead is 80-95% depending on the resolution. The GTX 980 was not necessarily meant to double the GTX 680’s performance, but it comes very close to doing so here at 1440p.
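The percentage leads quoted here are simple frame-rate ratios; a minimal sketch of the arithmetic (the fps values below are hypothetical placeholders, not measured results from this review):

```python
# Sketch: how a "X% ahead" figure is derived from two average frame rates.
def percent_lead(new_fps: float, old_fps: float) -> float:
    """Return how far new_fps is ahead of old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# e.g. a card averaging 58.5 fps against a predecessor at 50 fps:
print(round(percent_lead(58.5, 50.0), 1))   # -> 17.0
# and a card that exactly doubles its predecessor:
print(round(percent_lead(100.0, 50.0), 1))  # -> 100.0
```

So a "16-17% lead" means the newer card renders roughly 1.16-1.17x as many frames in the same time, while "80-95%" approaches a full doubling.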

Given what happens to the GK104 cards in this game, I suspect we're looking at the result of the ROP advantage, CUDA core occupancy improvements, or both. The fact that the lead over the GTX 780 Ti is so consistent across all resolutions does point to the CUDA core theory, but we can't really rule out the ROPs with the information we have.

As for results on an absolute basis, not even the mighty GTX 980 is going to crack 30fps at 4K with Extreme settings. Short of that, Very High quality comes off quite well at 49fps, and we're just shy of hitting 60fps at 1440p with Extreme.

274 Comments

  • Viewgamer - Friday, September 19, 2014 - link

    To Ryan Smith: How can the GTX 980 possibly have a 165W TDP when it actually consumes 8 watts more than the 195W TDP GTX 680!? Please explain. Did NVIDIA just play games with the figures to make them look more impressive?
  • ArmedandDangerous - Friday, September 19, 2014 - link

    TDP =/= Power consumption although they are related. TDP is the amount of heat it will output.
  • Carrier - Friday, September 19, 2014 - link

    You're right, power consumption and heat output are related. That's because they're one and the same! What else could that electricity be converted to? Light? A massive magnetic field? Mechanical energy? (The fan, slightly, but the transistors aren't going anywhere.)
  • Laststop311 - Friday, September 19, 2014 - link

    no they aren't the same. Not all the electricity used is converted to heat. This is where the word EFFICIENCY comes into play. Yes, they're related in a way, but Maxwell is more efficient with the electricity it draws, using more of it and losing less of it as heat output. It's all in its design.
  • bernstein - Friday, September 19, 2014 - link

    bullshit. since a gpu doesn't do chemical or mechanical transformations, all the energy used is converted to heat (by way of moving electrons around). efficiency in a gpu means how much energy is used for a fixed set of calculations (for example: flops)
  • Senpuu - Friday, September 19, 2014 - link

    It's okay to be ignorant, but not ignorant and belligerent.
  • bebimbap - Friday, September 19, 2014 - link

    there is "work" being done, as transistors have to "flip" by use of electrons. Even if you don't believe that "input energy != output heat", think of it this way:
    a 100w incandescent bulb produces X amount of useful light
    an 18w fluorescent bulb also produces X amount of useful light

    in this sense the fluorescent bulb is much more efficient, as it uses only 18w to produce the same light as the 100w incandescent. so if we say they produce the same amount of heat, then a
    100w fluorescent would produce ~5x the light of a 100w incandescent.
  • Laststop311 - Saturday, September 20, 2014 - link

    ur so smart bro
  • Guspaz - Friday, September 19, 2014 - link

    The power draw figures in this article are overall system power draw, not GPU power draw. Since the 980 offers significantly more performance than the 680, it's cranking out more frames, which causes the CPU to work harder to keep up. As a result, the CPU power draw increases, counteracting the benefits of lower GPU power draw.
  • Carrier - Friday, September 19, 2014 - link

    I don't think that can explain the whole difference. It performs similarly to a 780 Ti in Crysis 3, so the difference in power consumption can only come from the card. The 980 is rated 85W less in TDP but consumes only 68W less at the wall. The discrepancy gets worse when you add losses in the power supply.

    My guess is the TDP is rated at nominal clock rate, which is cheating a little because the card consistently runs much higher than nominal because of the boost.
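The arithmetic in this comment can be made concrete; a minimal sketch, assuming a hypothetical 90% PSU efficiency (the review does not state the actual figure):

```python
# Sketch of the comment's reasoning: a 68 W difference measured at the wall
# shrinks on the DC side of the power supply, widening the gap against the
# 85 W TDP delta. The 0.90 efficiency value is an assumption for
# illustration, not a measured number.
TDP_DELTA_W = 250 - 165   # GTX 780 Ti TDP minus GTX 980 TDP = 85 W
WALL_DELTA_W = 68         # measured difference at the wall outlet
PSU_EFFICIENCY = 0.90     # hypothetical PSU efficiency

dc_delta_w = WALL_DELTA_W * PSU_EFFICIENCY  # difference at the cards
shortfall_w = TDP_DELTA_W - dc_delta_w      # gap left unexplained by TDP
print(round(dc_delta_w, 1), round(shortfall_w, 1))  # -> 61.2 23.8
```

In other words, after PSU losses the cards themselves differ by only ~61 W, leaving roughly 24 W of the 85 W TDP gap unaccounted for, which is what the commenter means by the discrepancy getting worse.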
