Total War: Shogun 2

One of our goals with this iteration of our benchmark suite was to throw in some additional non-FPS games, so we're broadening our horizons a bit by adding Total War: Shogun 2. Shogun 2 is the latest installment of the long-running Total War series of turn-based strategy games, and alongside Civilization V it is notable for just how many units it can put on screen at once. As it turns out, it's also the single most punishing game in our benchmark suite.

[Benchmark charts: Total War: Shogun 2 at 2560, 1920, and 1680]

Deciding on what settings to use with Shogun required a bit more creativity on our part. 2560 is a true moonshot: everything is turned on, and it takes a minimum of 1.5GB of VRAM to run the game at this resolution. Accordingly, performance is rather dismal, though as this is a TBS, 30fps isn't quite as critical as it is in other games. In any case the 7970 comes the closest to hitting 30fps, coming in just shy at 28.2fps, which is 29% ahead of the GTX 580 and 48% ahead of the 6970.
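For readers who want to sanity-check the percentage figures quoted here and in the following paragraphs, below is a minimal sketch of the arithmetic in Python. Only the 7970's 28.2fps comes from our results; the GTX 580 and 6970 framerates are back-calculated from the quoted leads and are illustrative assumptions, not measured numbers.

    # Minimal sketch: how a "percentage lead" relates two cards' framerates.
    def lead(card_fps: float, baseline_fps: float) -> float:
        """Percentage lead of card_fps over baseline_fps."""
        return (card_fps / baseline_fps - 1.0) * 100.0

    fps_7970 = 28.2                # measured at 2560 (from the article)
    fps_gtx580 = fps_7970 / 1.29   # implied by the quoted 29% lead: ~21.9fps (assumption)
    fps_6970 = fps_7970 / 1.48     # implied by the quoted 48% lead: ~19.1fps (assumption)

    print(f"7970 vs GTX 580: {lead(fps_7970, fps_gtx580):.0f}% ahead")
    print(f"7970 vs 6970:    {lead(fps_7970, fps_6970):.0f}% ahead")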

Meanwhile for 1920 we turned Shogun's settings down to Very High, and we still had to disable MSAA to make it work with 1GB cards (did we mention that Shogun loves VRAM?). At these lower settings performance rockets up, and at the same time so does the 7970's lead over the GTX 580: here it's 36% ahead, which will be the greatest lead among all of our gaming benchmarks. As for the 7970 compared to the 6970, it's once again 48% ahead, showing just how consistent the gap between these two cards can be.

Finally, at 1680 overall performance goes up yet again, but the 7970's lead holds steady: it's 30% ahead of the GTX 580 and 48% ahead of the 6970.

Comments

  • Esbornia - Thursday, December 22, 2011 - link

    Fan boy much?
  • CeriseCogburn - Thursday, March 8, 2012 - link

    Finally, piroroadkill, Esbornia - the gentleman ericore merely stated what all the articles here have done as analysis, while the radeonite fans repeated it ad infinitum, screaming that nvidia's giant core count doesn't give the percentage increase it should considering the transistor increase.
    Now, when it's amd's turn, we get ericore under 3 attacks in a row...
    So you three all take it back concerning Fermi?
  • maverickuw - Thursday, December 22, 2011 - link

    I want to know when the 7950 will come out and hopefully it'll come out at $400
  • duploxxx - Thursday, December 22, 2011 - link

    The fact that ATI is able to bring a new architecture to a new process and deliver such a performance increase at that power consumption makes it a clear winner.

    Looking at the past, with Fermi's first launch and even Cayman VLIW4, they had many more issues to start with.

    Nice job. While the nv680 will probably perform better, it will take them at least a while to release that product, and it will also need to be huge in size.
  • ecuador - Thursday, December 22, 2011 - link

    Nice review, although I really think testing 1680x1050 for a $550 card is a big waste of time, which could have gone to perhaps multi-monitor testing etc.
  • Esbornia - Thursday, December 22, 2011 - link

    It's Anand, you should expect this kind of shiet.
  • Ryan Smith - Thursday, December 22, 2011 - link

    In this case the purpose of 1680 is to allow us to draw comparisons to low-end cards and older cards, which is something we consider to be important. The 8800GT and 3870 in particular do not offer meaningful performance at 1920.
  • poohbear - Thursday, December 22, 2011 - link

    Why do you benchmark @ 1920x1200 resolution? According to the Steam December survey only 8% of gamers have that resolution, whereas 24% have 1920x1080 and 18% use 1680x1050 (the 2 most popular). Also, minimum FPS would be nice to know in your benchmarks, that is really useful for us! Just a heads up for next time you benchmark a video card! Otherwise nice review! Lotsa good info at the beginning! :)
  • Galcobar - Thursday, December 22, 2011 - link

    Page 4, comments section.
  • Esbornia - Thursday, December 22, 2011 - link

    They don't want to show the improvements in min FPS 'cause they hate AMD, you should know that already.
