Total War: Rome 2

The second strategy game in our benchmark suite, Total War: Rome 2 is the latest entry in the Total War franchise. Total War games have traditionally been a mix of CPU and GPU bottlenecks, so it takes a good system on both fronts to do well here. The game comes with a built-in benchmark that plays out over a forested area with a large number of units, stressing the GPU in particular.
For this game in particular we’ve also turned the shadows down to medium. Rome’s shadows are extremely CPU intensive (as opposed to GPU intensive), so this keeps us from becoming CPU bottlenecked nearly as easily.

Total War: Rome 2 - 1920x1080 - Extreme Quality + Med. Shadows

Total War: Rome 2 - 1920x1080 - Very High Quality + Med. Shadows

Total War: Rome 2 - 1920x1080 - High Quality + Med. Shadows

Total War: Rome 2 is a solid showing for AMD’s new cards, but it’s not a blow-out in either direction. Here we find the R7 260 generally tied with the GTX 650 Ti, while the R7 265 trails the GTX 660 by around 5% at Very High quality settings. Meanwhile the R7 265 also just cracks 30fps at Extreme quality settings, though it’s close enough that I suspect most players will want to err on the side of caution and turn down the quality settings for a framerate that will have a much easier time staying above 30fps.


  • AlucardX - Thursday, February 13, 2014 - link

    Since the 7850 was a great overclocker, I wonder how this rebadged product does. Any plans to overclock it with an increased Vcore?
  • Ryan Smith - Thursday, February 13, 2014 - link

    Not with this one. I only had 2 days to put this review together, so unfortunately there wasn't time for overclocking.
  • krumme - Thursday, February 13, 2014 - link

    Intel Core i7-4960X at 4.2GHz with a 260 playing BF4 single player.
    Perhaps not the most realistic scenario in this world.
  • MrSpadge - Thursday, February 13, 2014 - link

    It shows you what the card can do. If you're concerned about your CPU limiting you in BF4 multi player.. well, better read a CPU review.
  • krumme - Friday, February 14, 2014 - link

    A user playing with the 260 will typically have a dual-core i3. That's reality. Try that with or without Mantle in 64-man multiplayer on the big maps. It's the difference between playable and not playable. Probably more than a 50% difference in favor of Mantle. Instead we get this useless talk.
  • Rebel1080 - Friday, February 14, 2014 - link

    What you're getting here is the equivalent of an Xbox One for $119 or a PS4 for $149. It took Nvidia and ATI about 12-18 months just to release a video card of equal or better performance for under $199 after the seventh generation's (Xbox 360/PS3) debut. The fact that it only took 3 months to get to this level for under $150 during this generation only shows just how much $ony and M$FT lowballed their customers on specs.
  • silverblue - Friday, February 14, 2014 - link

    Except you then have to factor in the rest of the hardware to that price. Think about it - CPU, cooling, motherboard, memory (I don't think 8GB of GDDR5 is cheap), storage, case, power supply, software and the all-important input devices. Add in the fact that developers will get more out of the console GPUs than with the PC and I think you're ragging on them a bit too much.
  • Antronman - Wednesday, February 19, 2014 - link

    You mean 4GB of DDR3. And likely high CAS latency too. Low-watt GPUs. Software only costs how much you pay the employees. Input ports are part of the mobo. Devs do not get more out of the console GPUs. They are actually underclocked so that you don't need desktop-grade cooling. Consoles will never be serious gaming machines. People who buy consoles either won't spend the money on a good PC, can't spend the money, or would rather spend the money on dozens of games that they'll only play a couple of hours of and then just stick to one game.
  • golemite - Saturday, February 15, 2014 - link

    The 270's inflated prices are a direct result of cryptocoin mining, as the card has been found to offer an advantageous kilohash-per-watt ratio. It would be interesting and helpful to many out there if AnandTech started publishing KH/s and KH/W metrics in its reviews for Scrypt mining.
  • Will Robinson - Monday, February 17, 2014 - link

    Nice addition to the AMD lineup.... and a pretty convincing demolition of NVDA's competing cards.
    Thanx for the review!
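On the mining-efficiency metric raised in the comments: KH/W is simply the Scrypt hashrate divided by power draw. A minimal sketch of how such a table could be computed, using hypothetical hashrate and board-power figures (not measurements from this review):

```python
# Hypothetical Scrypt mining figures for illustration only; the
# hashrates and wattages below are assumptions, not review data.
cards = {
    "R9 270": {"khs": 440.0, "watts": 150.0},
    "R7 265": {"khs": 390.0, "watts": 130.0},
}

for name, c in cards.items():
    # Efficiency metric: kilohashes per second per watt of board power
    kh_per_watt = c["khs"] / c["watts"]
    print(f"{name}: {c['khs']:.0f} KH/s, {kh_per_watt:.2f} KH/W")
```

The ratio makes cards with very different power envelopes directly comparable, which is why miners favor it over raw hashrate.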
