Total War: Shogun 2

Total War: Shogun 2 is the latest installment in the long-running Total War series of turn-based strategy games, and alongside Civilization V it is notable for just how many units it can put on screen at once. As it turns out, it is also the single most punishing game in our benchmark suite (on higher-end hardware, at least).

Total War: Shogun 2 - 2560x1600 - Ultra Quality + 4xAA/16xAF

Total War: Shogun 2 - 1920x1200 - Very High Quality + 16xAF

Total War: Shogun 2 - 1680x1050 - High Quality + 16xAF

With Shogun 2 the GTX 680 at last sees its first decisive win. At the all-punishing resolution of 2560 the GTX 680 not only becomes the first single-GPU card to crack 30fps, but it also takes a 16% lead over the 7970. Even at the more practical resolution and settings of 1920 the GTX 680 still leads by 15%. Meanwhile the GTX 580 fares far worse, with the GTX 680 leading it by 51% at 2560 and a whopping 63% at 1920. Even the GTX 590 can only barely beat the GTX 680 at 2560, only to lose at 1920.

At this point we’re not sure what it is about the GTX 680 that lets it improve on the GTX 580 by so much. Shogun 2 does use a lot of VRAM, and while the GTX 680’s larger memory pool alone wouldn’t seem to explain this, the fact that most of that memory is consumed by textures just might. We may be seeing the benefit of the GTX 680’s much greater number of texture units.


  • will54 - Thursday, March 22, 2012 - link

    I noticed in the review they said this was based on the GF114 not the GF110, but then they mention that this is the flagship card for Nvidia. Does this mean that this will be the top Nvidia card until the GTX 780, or are they going to bring out a more powerful card in the next couple of months based off the GF110, such as a GTX 685?
  • von Krupp - Friday, March 23, 2012 - link

    That depends entirely on how AMD responds. If AMD were to respond with a single GPU solution that convincingly trumps the GTX 680 (this is extremely improbable), then yes, you could expect GK110.

    However, I expect Nvidia to hold on to GK110 and instead answer the dual-GPU HD 7990 with a dual-GK104 GTX 690.
  • Sq7 - Thursday, March 22, 2012 - link

    ...my 6950 still plays everything smooth as ice at ultra settings :o Eye candy check. Tessellation check. No worries check. To be honest, I'm not that interested in the current generation of gfx cards. When UE4 comes out I think it will be an optimal time to upgrade.

    But mostly in the end $500 is just too much for a graphics card. And I don't care if the Vatican made it. When I need to upgrade there will always be a sweet little card with my name on it at $300 - $400 be it blue or green. And this launch has just not left me drooling enough to even consider going out of my price range. If Diablo 3 really blows on my current card... Maybe. But somehow I doubt it.
  • ShieTar - Friday, March 23, 2012 - link

    That just means you need a bigger monitor. Or newer games ;-)

    Seriously though, good for you.

    I have two crossfired, overclocked 6950s feeding my 30'', and I still find myself playing MMOs like SWTOR or Rift with Shadows and AA switched off, so that I have a chance to stay above 40 FPS even in scenes with large groups of characters and effects on screen at once. The same is true for most offline RPGs, like DA2 and The Witcher 2.

    I don't think I have played any games that hit 60 FPS @ 2560x1600 @ "Ultra Settings" except for games that are 5-10 years old.

    Of course, I won't be paying the $500 any more than you will (or 500€ in my case), because stepping up just one generation of GPUs never makes much sense. Even if it is a solid step up, as with this generation, you still pay the full price for only a 20% to 25% performance increase. That's why I usually skip at least one generation, like going from 2x260 to 2x6950 last summer. That's when you really get your money's worth.
  • von Krupp - Friday, March 23, 2012 - link

    Precisely.

    I jumped up from a single GeForce 7800 GT (paired with an Athlon 64 3200+) to dual HD 7970s (paired with an i7-3820). At present, there's nothing I can't crank all the way up at 2560x1440, though I don't foresee being able to continue that within two years. I got 7 years of use out of the previous rig (2005-2012) using a 17" 1280x1024 monitor, and I expect to get at least four years out of this one at 1920x1080 on my U2711.

    Long story short, consoles make it easy to not have to worry about frequent graphics upgrades so that when you finally do upgrade, you can get your money's worth.
  • cmdrdredd - Thursday, March 22, 2012 - link

    Why is Anandtech using Crysis Warhead still and not Crysis 2 with the High Resolution textures and DX11 modification?
  • Malih - Thursday, March 22, 2012 - link

    Pricing is better, but the 7970 is not much worse than the 680, contrary to what some had claimed (well, leaks).

    With similar pricing, AMD is not that far off, although it remains to be seen whether AMD will lower the price.

    As for me, I'm a mainstream guy, so I'll see how the mainstream parts perform, and whether AMD will lower the price on their current mainstream parts (78x0). I was thinking about getting a 7870, but AMD's pricing is too high for me; it gets them money in some markets, but not from my pocket.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    AMD is $120 too high. That's not chump change. That's the kind of breathe-down-your-throat, game-changing gap it would be called 1000% of the time at any other moment on AnandTech!
  • nyran125 - Friday, March 23, 2012 - link

    It wins some games and loses others, but it's a pretty damn awesome card regardless.
  • asrey1975 - Friday, March 23, 2012 - link

    You're better off with an AMD card.

    Personally, I'm still thinking about buying 2x 6870s to replace my 5870, which runs BF3 no problem on my 27" 1920x1200 Dell monitor.

    They will cost me $165 each, so at $330 all up it's still cheaper than any $500 card (insert brand/model) and will totally kick ass over the 680 or 7970!
