Total War: Attila

The Total War franchise moves on to Attila, another Creative Assembly development. It is a standalone strategy title set in 395 AD, where the main storyline puts the player in control of the leader of the Huns in a quest to conquer parts of the world. Graphically, the game can render hundreds or even thousands of units on screen at once, each with its own animations, and it can put some of the bigger graphics cards to task.

For low-end graphics, we test at 720p with the Performance settings, recording the average frame rate. For mid-range and high-end graphics, we test at 1080p with the Quality settings. In both cases, unlimited video memory is enabled and the in-game scripted benchmark is used.
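The figures in the graphs below are the averages reported by that scripted benchmark. As a rough illustration of what "recording the average frame rate" means, here is a minimal Python sketch that derives an average FPS from a per-frame frametime log; the filename and one-frametime-per-line format are assumptions for the example, not the tooling actually used for these results.

    # Minimal sketch (not our actual tooling): derive the average frame rate
    # from a frametime log containing one frametime, in milliseconds, per line.
    # The filename below is a placeholder for illustration only.

    def average_fps(frametimes_ms):
        """Average FPS over a run = total frames / total elapsed seconds."""
        total_seconds = sum(frametimes_ms) / 1000.0
        return len(frametimes_ms) / total_seconds

    with open("attila_720p_frametimes.log") as f:
        frametimes = [float(line) for line in f if line.strip()]

    print(f"Average frame rate: {average_fps(frametimes):.1f} FPS")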

Benchmark graphs: Total War: Attila, average frame rate, tested on:

  • ASUS GTX 980 Strix 4GB ($560)
  • MSI R9 290X Gaming LE 4GB ($380)
  • MSI GTX 770 Lightning 2GB ($245)
  • MSI R9 285 Gaming 2GB ($240)
  • ASUS R7 240 DDR3 2GB ($70)
  • Integrated Graphics

70 Comments

  • Magichands8 - Tuesday, January 3, 2017 - link

    This Optane looks completely useless. But Optane DRAM sounds like it could be interesting, depending upon how much slower it is.
  • lopri - Wednesday, January 4, 2017 - link

    I immediately thought of "AMD Memory" which AMD launched after the Bulldozer flop. But then again Intel have been going after this storage caching scheme for years now and I do not think it has taken them anywhere.
  • smilingcrow - Tuesday, January 3, 2017 - link

    Game frame rates to two decimal places add nothing except making the numbers harder to scan quickly. Enough already.
  • 1_rick - Tuesday, January 3, 2017 - link

    One place where the extra threads of an i7 are useful is if you're using VMs (maybe to run a database server or something.) I've found that an i5 with 8GB can get bogged down pretty drastically just from running a VM.
  • Meteor2 - Tuesday, January 3, 2017 - link

    I kinda think that if you're running db VMs on an i7, you're doing it wrong.
  • t.s - Wednesday, January 4, 2017 - link

    Or if you're an Android developer using Android Studio.
  • lopri - Wednesday, January 4, 2017 - link

    That is true, but as others have implied, you can get 6- or 8-core Xeons (real cores) for less than these new Kaby Lake chips if VMs are what you need the CPU performance for.
  • ddhelmet - Tuesday, January 3, 2017 - link

    Well, I am more glad now that I got a Skylake. Even if I had waited for this, the performance increase would not be worth it.
  • Kaihekoa - Tuesday, January 3, 2017 - link

    Y'all need to revise your game benchmarking analysis. At least use some current generation GPUs, post the minimum framerates, and test at 1440p. The rest of your review is exceptional, but the gaming part needs some modernization, please.
  • lopri - Wednesday, January 4, 2017 - link

    I thank the author for a clear yet thorough review. A lot of ground is covered, and the big picture of the chip's performance and features is well communicated. I agree with the author's recommendation at the end as well. I have not felt that I am missing out on anything compared to an i7 while running a 2500K in my gaming system, and unless you know for certain that you can take advantage of Hyper-Threading, spending the difference toward an SSD or a graphics card is a wiser expenditure that will provide a better computing experience.

    Having said that, I am still not compelled to upgrade my 2500K, which has been running at 4.8 GHz for years. (It does 5.0 GHz no problem, but I run it at 4.8 to leave some "headroom".) While I think the 7600K is finally a barely worthy upgrade in its own right, the added cost cannot be overlooked. A new motherboard, new memory, and potentially a new heatsink will quickly add to the budget, and I am not sure it is going to be worth all the expenses that will follow.

    Of course, all that could be worthwhile if overclocking were still fun, but Intel has pretty much killed overclocking and the overclocking community. Intentionally, if I might add. Today overclocking does not give one a sense of discovery or accomplishment. Competition between friendly enthusiasts or rival motherboard/memory vendors has disappeared. Naturally there is no accumulation or exchange of knowledge in the community, and conversations have become frustrating and vain due to the lack of overclocking expertise. Only brute-force overclocking with dedicated cooling has some following, and the occasional "overclocking" topics in the forums are really braggadocio in disguise, where the competition underneath is about who spent the most on their rig with the latest blingy stuff. Needless to say, those are not as exciting or illuminating as the real overclocking of yore, and in my opinion there are better ways to spend money on that kind of self-gratification without the complications that often accompany overclocking, which in the end fails to impress.

    Intel might be having second thoughts about its overclocking policies now, but just like many things Intel has done in recent years, it is too little, too late. And their chips have no headroom anyway. My work system is due for an upgrade, and I will probably pick up a couple of Xeon E5-2670s, which will give me 16 real cores for less than $200. Why bother with the new stuff when the IPC gain is meager and there is no fun in overclocking? And contribute to Intel's revenue? Thank you, but no thank you.

    P.S. Sorry I meant to commend the author for the excellent (albeit redundant) review but ended up ranting about something else. Oh well, carry on..
