Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest game in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a good system on both ends of the equation to do well here. In this case the game comes with a built-in benchmark that plays out over a forested area with a large number of units, definitely stressing the GPU in particular.


For this game in particular we’ve also gone and turned down the shadows to medium. Rome’s shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from CPU bottlenecking nearly as easily.

Total War: Rome 2 - 3840x2160 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 3840x2160 - Very High Quality + Med. Shadows

Total War: Rome 2 - 2560x1440 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 1920x1080 - Extreme Quality + Med. Shadows

Of all of our games, none showcases the GTX 980 better than Total War: Rome 2. Against both AMD's and NVIDIA's last-generation cards, its margin of victory here is the largest in our suite.

Compared to the GTX 780 Ti, the GTX 980 is a consistent 16-17% ahead at all resolutions. Meanwhile against the R9 290X this is an 18% lead at 1080p and 1440p. The R9 290X only begins to catch up at 4K Very High quality, where the GTX 980 still leads by a respectable 8%.

This is also a very strong showing compared to the GTX 680. The overall lead is 80-95% depending on the resolution. The GTX 980 was not necessarily meant to double the GTX 680’s performance, but it comes very close to doing so here at 1440p.
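The percentage leads quoted above are simple ratios of average framerates. A minimal sketch of that arithmetic (the framerate values below are illustrative placeholders, not the review's measured data):

```python
def percent_lead(new_fps: float, old_fps: float) -> float:
    """Percentage by which new_fps exceeds old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Illustrative only: a card averaging 58 fps vs one averaging
# 50 fps holds a 16% lead, the kind of margin cited above.
print(round(percent_lead(58.0, 50.0)))  # 16

# A ~95% lead is what "very close to doubling" performance means:
print(round(percent_lead(97.5, 50.0)))  # 95
```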

Given what happens to the GK104 cards in this game, I suspect we’re looking at the result of either the ROP advantage, improved CUDA core occupancy, or both. The fact that the lead over the GTX 780 Ti is so consistent across all resolutions points to the CUDA core theory, but we can’t rule out the ROPs with the information we have.

As for results on an absolute basis, not even the mighty GTX 980 is going to crack 30fps at 4K with Extreme settings. Short of that, Very High quality comes off quite well at 49fps, and we’re just shy of hitting 60fps at 1440p with Extreme.


274 Comments


  • garadante - Thursday, September 25, 2014 - link

    Yeah. To be honest, nobody except ardent Nvidia fanboys would've believed Nvidia would release cards as performance- and price-competitive as they did, especially the 970. The 980 is honestly a little overpriced compared to a few generations ago, as they'll slap a $200 premium on it for Big Maxwell, but $330 MSRP for the 970 (if I remember correctly) wasn't bad at all for generally, what, 290/780/290X performance?
  • tuxRoller - Friday, September 26, 2014 - link

    It's not too surprising as we saw what the 750ti was like.
    What is disappointing, though, is that I thought nvidia had made some fundamental breakthrough in their designs where, instead, it looks as though they "simply" enabled a better governor.
  • garadante - Friday, September 26, 2014 - link

    It'll be interesting to see how the efficiency suffers once Nvidia releases a proper compute die with area dedicated to double-precision FP. I have to keep in mind that when factoring in the stripped-down die compared to AMD's 290/290X cards, the results aren't as competition-blowing as they first seem. But if AMD can't counter these cards with their own stripped-down, gaming-only cards, then Nvidia took the win this generation.
  • tuxRoller - Friday, September 26, 2014 - link

    That's an excellent point. I take it you already read the Tom's Hardware review? Their compute performance/W is still good, but not as unbelievable as their gaming performance, and I'm not sure that's because this is a gaming-only card. Regardless, AMD needs to offer something better than what's currently available. Unfortunately, I don't think they will be able to do it. There was a lot of driver work that went into making these Maxwell cards hum.
  • garadante - Friday, September 26, 2014 - link

    One thing that really bothers me though is how Anandtech keeps testing the 290/290X with reference cards. Those cards run at 95 C due to the fan control profile in the BIOS, and I remember seeing that when people ran those cards with decent nonreference cooling in the 70 C range, power consumption was 15-20+ watts lower. So an AMD die that sacrifices FP64 performance to focus on FP32 (gaming, some compute) performance, as well as decreasing die size due to the lack of FP64 resources, seems like it could be a lot more competitive with Maxwell than people are making it out to be. I have this feeling that the people saying how badly Maxwell trounces AMD's efficiency, and that AMD can't possibly hope to catch up, are too biased in their thinking.
  • tuxRoller - Saturday, September 27, 2014 - link

    Do you have a link to those reviews that show non-reference fans making GPUs more efficient? I don't know how that could be possible. Given the temps we're looking at, the effects on the conductors should be very, very small.
    Regarding the reduction in fp performance and gaming efficiency, that's a good point. That may indeed be part of the reason why nvidia has the gaming/compute split (aside from the prices they can charge).
  • garadante - Sunday, September 28, 2014 - link

    Here's an example of a card with liquid cooling. Factor in the overclock that the nonreference card has and that it draws something like 20 watts less in Furmark and the same in 3Dmark. I could be mistaken on the improved power usage but I do recall seeing shortly after the 290X launch that nonreference coolers helped immensely, and power usage dropped as well. Sadly I don't believe Anandtech ever reviewed a nonreference 290X... which is mind boggling to consider, considering how much nonreference cooling helped that card, even outside of any potential power usage decreases.
  • garadante - Sunday, September 28, 2014 - link

    http://www.tomshardware.com/reviews/lcs-axr9-290x-... Whoops, forgot the link.
  • jman9295 - Friday, September 26, 2014 - link

    I wonder why they still give these cards these boring numbered names like GTX 980. Except for the Titan, these names kinda suck. Why not at least name it the Maxwell 980, or for AMD's R9 290 series the Hawaii 290? That sounds a lot cooler than GTX or R9. Also, for the last several generations, AMD and Nvidia's numbering systems seemed similar, up until AMD ended that with the R9/R7 200 series. Before that, they had the GTX 700 and HD 7000 series, the GTX 600 and HD 6000 series, and so on. Then, as soon as AMD changed it up, Nvidia decided to skip the GTX 800s for retail desktop GPUs and jump right up to the 900 series. Maybe they will come up with a fancier name for their next-gen cards besides the GTX 1000s.
  • AnnonymousCoward - Saturday, September 27, 2014 - link

    Naw, names are much harder to keep track of than numbers that inherently describe relative performance.
