Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest entry in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a strong system on both ends of the equation to do well here. In this case the game comes with a built-in benchmark that plays out over a forested area with a large number of units, which stresses the GPU in particular.

For this game we’ve also gone and turned the shadows down to medium. Rome’s shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from becoming CPU bottlenecked nearly as easily.

Total War: Rome 2 - 3840x2160 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 3840x2160 - Very High Quality + Med. Shadows

Total War: Rome 2 - 2560x1440 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 1920x1080 - Extreme Quality + Med. Shadows

Of all of our games, there is no better set of benchmarks for the GTX 980 than Total War: Rome 2. It never beats AMD’s and NVIDIA’s last-generation cards by as much as it beats them here.

Compared to the GTX 780 Ti, the GTX 980 is a consistent 16-17% ahead at all resolutions. Meanwhile against the R9 290XU this is an 18% lead at 1080p and 1440p. The R9 290XU only begins to catch up at 4K Very High quality, where the GTX 980 still leads by a respectable 8%.

This is also a very strong showing compared to the GTX 680. The overall lead is 80-95% depending on the resolution. The GTX 980 was not necessarily meant to double the GTX 680’s performance, but it comes very close to doing so here at 1440p.

Given what happens to the GK104 cards in this game, I suspect we’re looking at the results of either the ROP advantage and/or a very good case of CUDA core occupancy improvements. The fact that the lead over the GTX 780 Ti is so consistent across all resolutions points to the CUDA core theory, but we can’t really rule out the ROPs with the information we have.

As for results on an absolute basis, not even the mighty GTX 980 is going to crack 30fps at 4K with Extreme settings. Short of that, Very High quality comes off quite well at 49fps, and we’re just shy of hitting 60fps at 1440p with Extreme.


  • bernstein - Friday, September 19, 2014 - link

    it's nice having one article with a full review, & it's nice to have early partial results... so in the future if publishing with missing content PLZ put in a big fat bold disclaimer:
    xyz content missing, update coming on 2.2.2222
  • chizow - Friday, September 19, 2014 - link

    @Ryan, thanks for the update, sorry I just scanned through and didn't see the subtext mentioning your issues with the 970. Looking forward to updated results once you get some good samples.
  • nevertell - Friday, September 19, 2014 - link

    You can't read through the article in one sitting yet you complain about the article being rushed ?
  • chizow - Sunday, September 21, 2014 - link

    @nevertell, not sure if that comment was directed at me, but I never read through the entire article in the first sitting, especially in this case where I was actually in the market to buy one of these cards and might need to make a quick buying decision. I generally look at results and jump around a bit before going back to read the entire article, and I did not see any subtext on why the 970 wasn't included on this page about "Launching Today":

    http://www.anandtech.com/show/8526/nvidia-geforce-...

    I expected to see something about why the 970 wasn't launching today, staggered launch, didn't get review sample etc but did not see anything, so I asked bc I saw Ryan was attending the comments here and might get a quick response.
  • boot318 - Thursday, September 18, 2014 - link

    Bye, AMD!

    Amazing card(s) Nvidia brought to market! I've already seen a couple of reviews showing this monster overclocking to 1450+. Just think about when Nvidia drops a big die version........ :)
  • dragonsqrrl - Thursday, September 18, 2014 - link

    AMD is by no means out of it. They're still very competitive in terms of performance, however they're far behind in terms of efficiency, which means that to compete with the 980 they'll likely have to launch a far higher TDP card that requires more exotic cooling and will almost certainly be more expensive to manufacture. Even when you take the 285 into consideration, which offers 280-level performance at greatly reduced TDP, it's still at a higher TDP than the 980, which now outperforms the 290X by ~15%. And this isn't even taking noise, build quality, or features into consideration... Not a good position for AMD, in fact it's somewhat reminiscent of their processors (minus the competitive performance part).

    "Just think about when Nvidia drops a big die version........ :)"
    Fortunately for AMD that's just not going to happen on 28nm, otherwise I might be inclined to agree with you. They still have a very real competitive chance with their upcoming cards.
  • arbit3r - Thursday, September 18, 2014 - link

    Oh god, really? The 285 has greatly reduced TDP? Um, the 280 had a 200 watt TDP and the 285 is 190; 10 watts less, which I wouldn't call greatly reduced. Before you say the 280 had a 250 watt TDP, no, that's the 280X.
  • dragonsqrrl - Friday, September 19, 2014 - link

    I haven't done much searching around, but according to Anandtech's review of the 285, the 280 has a 250W TDP.

    http://www.anandtech.com/show/8460/amd-radeon-r9-2...
  • arbit3r - Friday, September 19, 2014 - link

    Plenty of sites I know of say it's 200, so if there's that much misinfo then it's likely AMD's fault. A lot of reviews put real-world power usage at around a 20 watt difference.
  • Ryan Smith - Friday, September 19, 2014 - link

    For the record, 250W for R9 280 comes directly from AMD's reviewer's guide for that product.
