Gaming: Shadow of War

Next up is Middle-earth: Shadow of War, the sequel to Shadow of Mordor. Developed by Monolith, whose last hit was arguably F.E.A.R., Shadow of Mordor returned the studio to the spotlight with the Nemesis System, an innovative system for generating NPC rivals and their interactions, along with a storyline based on J.R.R. Tolkien's legendarium, all of it running on a heavily modified version of the engine that originally powered F.E.A.R. in 2005.

Using the newer LithTech Firebird engine, Shadow of War improves on its predecessor's detail and complexity, and with free add-on high-resolution texture packs it offers a good example of getting the most graphics out of an engine that may not be bleeding edge. Shadow of War also supports HDR (HDR10).

AnandTech CPU Gaming 2019 Game List
Game            Genre          Release    API    IGP          Low           Med       High
Shadow of War   Action / RPG   Sep 2017   DX11   720p Ultra   1080p Ultra   4K High   8K High

All of our benchmark results can also be found in our benchmark engine, Bench.

[Graphs: Shadow of War average FPS at the IGP, Low, Medium, and High test settings]

Shadow of War is another game where it's hard to tease out CPU limitations under reasonable game settings. Even 1080p Ultra amounts to a bunch of Intel CPUs seeing who can tip-toe over 100 FPS, with AMD right on their tails. The less reasonable 720p Ultra spreads the field slightly – the CPUs with the weakest per-thread performance start to fall behind – but it's still a tight pack for all of the Coffee Lake CPUs. With the highest frequencies, and tied for the most cores among the desktop processors here, the 9900K is clearly going to be the strongest contender. But this isn't a game that can benefit from that performance right now.
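To visualize that bottleneck logic, here is a toy model (illustrative only; the per-frame costs are made-up numbers, not our benchmark data). A frame cannot finish faster than its slowest stage, so dropping the resolution shrinks the GPU's per-frame cost until the CPU becomes the limit, and only then do differences between CPUs show up:

```c
/* Toy model with hypothetical per-frame costs: a frame can't finish faster
 * than its slowest stage, so FPS ~= 1000 / max(cpu_ms, gpu_ms). Shrinking the
 * GPU cost (lower resolution) is what exposes per-thread CPU differences. */
#include <stdio.h>

static double fps(double cpu_ms, double gpu_ms) {
    double frame_ms = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;  /* bottleneck stage */
    return 1000.0 / frame_ms;
}

int main(void) {
    double cpu_fast = 6.0, cpu_slow = 9.0;   /* hypothetical CPU ms per frame */
    double gpu_720p = 4.0, gpu_4k = 28.0;    /* hypothetical GPU ms per frame */

    /* 720p: the GPU is cheap, so the CPUs separate (~167 vs ~111 FPS). */
    printf("720p: %.0f vs %.0f FPS\n", fps(cpu_fast, gpu_720p), fps(cpu_slow, gpu_720p));
    /* 4K: the GPU dominates, and both CPUs land on the same ~36 FPS. */
    printf("4K:   %.0f vs %.0f FPS\n", fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k));
    return 0;
}
```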

Comments

  • 3dGfx - Friday, October 19, 2018 - link

    game developers like to build and test on the same machine
  • mr_tawan - Saturday, October 20, 2018 - link

    > game developers like to build and test on the same machine

    Oh, I thought they used remote debugging.
  • 12345 - Wednesday, March 27, 2019 - link

    The only gaming use I can think of for those would be passing through a GPU to each of several VMs.
  • close - Saturday, October 20, 2018 - link

    @Ryan, "There’s no way around it, in almost every scenario it was either top or within variance of being the best processor in every test (except Ashes at 4K). Intel has built the world’s best gaming processor (again)."

    Am I reading the iGPU page wrong? The occasional 100+% handicap does not seem to be "within variance".
  • daxpax - Saturday, October 20, 2018 - link

    If you noticed, the 2700X is faster in half of the gaming benchmarks, but they didn't include it.
  • nathanddrews - Friday, October 19, 2018 - link

    That wasn't a negative critique of the review, just the opposite in fact: from the selection of benchmarks you provided, it is EASY to see that, given more GPU power, the new Intel chips will clearly outperform AMD most of the time – generally in average frame rates, but especially in minimums. From where I'm sitting (3570K + 1080Ti), I think I could save a lot of money by getting a 2600X/2700X OC setup and not miss out on too many fpses.
  • philehidiot - Friday, October 19, 2018 - link

    I think anyone with any sense (and the constraints of a budget / missus) would be stupid to buy this CPU for gaming. The sensible thing to do is to buy the AMD chip that provides 99% of the gaming performance for half the price (even better value when you factor in the mobo) and then plough that money into a better GPU, more RAM, and/or a better SSD. The savings from the CPU alone will allow you to invest a useful amount more into ALL of those areas. There are people who do need a chip like this, but they are not gamers. Intel are pushing hard on both the limitations of their tech (see: stupid temperatures) and the marketing BS (see: outright lies) because they know they're currently being held by the short and curlies. My 4-year-old i5 may well score within 90% of these chips in gaming benchmarks, because the limitation in gaming these days is the GPU. Sorry, Intel, wrong market to aim at.
  • imaheadcase - Saturday, October 20, 2018 - link

    I like how you said "limitations of their tech" and pointed to temps, as if any gamer cares about that. Every gamer wants raw performance, and the fact remains that Intel systems are still the easier way to get it. The reason is simple: most gamers will upgrade from another Intel system and reuse lots of parts from it that work with current-generation stuff.

    It's like the whole G-Sync vs non-G-Sync debate. It's a stupid argument; it's not a tax on G-Sync when you are buying the best monitor anyway.
  • philehidiot - Saturday, October 20, 2018 - link

    Those limitations affect overclocking, and therefore the available performance, which ends up hardly different from much cheaper chips. You're right about upgrading, though.
  • emn13 - Saturday, October 20, 2018 - link

    The AVX-512 numbers look suspicious. Both common sense and other examples online suggest that AVX-512 should improve performance by much less than a factor of 2. Additionally, AVX-512 causes varying amounts of frequency throttling, so you're not going to get the full factor of 2.

    This suggests to me that your baseline is somehow misleading. Are you comparing AVX-512 to ancient SSE? To no vectorization at all? Something's not right there.
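On that factor-of-two point: AVX-512 doubles the vector width over AVX2 (16 vs 8 single-precision lanes), so 2x is the theoretical ceiling before frequency throttling, and a speedup at or above 2x usually points to a baseline that wasn't AVX2 at all. A minimal sketch of the same reduction at both widths (hypothetical code for illustration, not the benchmark used in the review):

```c
/* Hypothetical illustration: the same float reduction with AVX2 (8 lanes) and
 * AVX-512 (16 lanes). Lane count is the only difference, so the ideal speedup
 * is 2x; frequency throttling eats into that, and anything well above 2x
 * suggests the baseline was scalar or old SSE code rather than AVX2.
 * Build (GCC/Clang, AVX-512-capable CPU): cc -O2 -mavx2 -mavx512f demo.c */
#include <immintrin.h>
#include <stddef.h>
#include <stdio.h>

static float sum_avx2(const float *a, size_t n) {       /* 8 floats per step */
    __m256 acc = _mm256_setzero_ps();
    for (size_t i = 0; i + 8 <= n; i += 8)              /* tail handling omitted */
        acc = _mm256_add_ps(acc, _mm256_loadu_ps(a + i));
    __m128 s = _mm_add_ps(_mm256_castps256_ps128(acc),
                          _mm256_extractf128_ps(acc, 1));
    s = _mm_hadd_ps(s, s);                              /* horizontal sum */
    s = _mm_hadd_ps(s, s);
    return _mm_cvtss_f32(s);
}

static float sum_avx512(const float *a, size_t n) {     /* 16 floats per step */
    __m512 acc = _mm512_setzero_ps();
    for (size_t i = 0; i + 16 <= n; i += 16)            /* tail handling omitted */
        acc = _mm512_add_ps(acc, _mm512_loadu_ps(a + i));
    return _mm512_reduce_add_ps(acc);                   /* horizontal sum */
}

int main(void) {
    float a[1024];
    for (int i = 0; i < 1024; i++) a[i] = 1.0f;
    /* Both should print 1024; only the throughput differs. */
    printf("AVX2: %.0f  AVX-512: %.0f\n", sum_avx2(a, 1024), sum_avx512(a, 1024));
    return 0;
}
```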
