Gaming: Shadow of War

Next up is Middle-earth: Shadow of War, the sequel to Shadow of Mordor. Developed by Monolith, whose last major hit was arguably F.E.A.R., Shadow of Mordor returned the studio to the spotlight with the Nemesis System, an innovative mechanic for generating and interacting with NPC rivals, along with a storyline based on J.R.R. Tolkien's legendarium, all running on a heavily modified version of the engine that originally powered F.E.A.R. in 2005.

Using the new LithTech Firebird engine, Shadow of War improves on its predecessor's detail and complexity, and with free add-on high-resolution texture packs it serves as a good example of extracting the most graphics from an engine that may not be bleeding edge. Shadow of War also supports HDR (HDR10).

AnandTech CPU Gaming 2019 Game List
Game          | Genre        | Release  | API  | IGP        | Low         | Med     | High
Shadow of War | Action / RPG | Sep 2017 | DX11 | 720p Ultra | 1080p Ultra | 4K High | 8K High

All of our benchmark results can also be found in our benchmark engine, Bench.

[Graph: Shadow of War — Average FPS at IGP, Low, Medium, and High settings]

Shadow of War is another game where it’s hard to tease out CPU limitations under reasonable game settings. Even 1080p Ultra is a bunch of Intel CPUs seeing who can tip-toe over 100fps, with AMD right on their tail. The less reasonable 720p Ultra pushes this back slightly – the CPUs with the weakest per-thread performance start to fall behind – but it’s still a tight pack for all of the Coffee Lake CPUs. With the highest frequencies and tied for the most cores among the desktop processors here, it’s clear that the 9900K is going to be the strongest contender. But this isn’t a game that can benefit from that performance right now.
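The bunching described above follows from a simple model: a frame is delivered only when both the CPU and GPU have finished their per-frame work, so the frame rate is bounded by the slower of the two. The sketch below (a hypothetical illustration with made-up frame times, not AnandTech's methodology) shows why faster CPUs converge at GPU-bound settings like 1080p Ultra, while 720p exposes per-thread CPU differences:

```python
def delivered_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Effective frame rate: limited by whichever of CPU or GPU is slower."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound (e.g. 1080p Ultra): the GPU takes longer than either CPU,
# so a faster CPU delivers no extra frames and the pack bunches up.
print(delivered_fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=9.5))  # ~105 fps
print(delivered_fps(cpu_ms_per_frame=6.5, gpu_ms_per_frame=9.5))  # ~105 fps

# CPU-bound (e.g. 720p Ultra): GPU work shrinks, CPU speed now matters.
print(delivered_fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=5.0))  # 125 fps
print(delivered_fps(cpu_ms_per_frame=6.5, gpu_ms_per_frame=5.0))  # ~154 fps
```

In this toy model the 1.5 ms-per-frame CPU advantage is invisible at the GPU-bound setting but worth nearly 30 fps at the CPU-bound one, which is the pattern in the results above.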


  • evernessince - Saturday, October 20, 2018 - link

    I'm sure for him money is a fixed resource, he is just really bad at managing it. You'd have to be crazy to blow money on the 9900K when the 8700K is $200 cheaper and the 2700X is half the price.
  • Dug - Monday, October 22, 2018 - link

    Relative to how much you make or have. $200 isn't some life-threatening amount that makes them crazy because they spent it on a product they will enjoy. We spend more than that going out for a weekend (and usually don't have anything to show for it). If an extra $200 is threatening to your livelihood, you shouldn't be shopping for new CPUs anyway.
  • close - Saturday, October 20, 2018 - link

    @ekidhardt: "I think far too much emphasis has been placed on 'value'. I simply want the fastest, most powerful CPU that isn't priced absurdly high."

    That, my good man, is the very definition of value. It happens automatically when you decide to take price into consideration. I also don't care about value, I just want a CPU with a good performance-to-price ratio. See what I did there? :)
  • evernessince - Saturday, October 20, 2018 - link

    A little bit extra? It's $200 more than the 8700K, that's not a little.
  • mapesdhs - Sunday, October 21, 2018 - link


    The key point being, for gaming, use the difference to buy a better GPU, whether one gets an 8700K or 2700X (or indeed any one of a plethora of options really, right back to an old 4930K). It's only at 1080p and high refresh rates where strong CPU performance stands out, something DX12 should help more with as time goes by (the obsession with high refresh rates is amusing given NVIDIA's focus shift back to sub-60Hz being touted once more as ok). For gaming at 1440p or higher, one can get a faster system by choosing a cheaper CPU and better GPU.

    There are two exceptions: those for whom money is literally no object, and certain production workloads that still favour frequency/IPC and are not yet well optimised for more than 6 cores (Premiere is probably the best example). Someone mentioned pro tasks being irrelevant because ECC is not supported, but many solo pros can't afford Xeon-class hardware (I mean the proper dual-socket setups) even if the initial higher outlay would eventually pay for itself.

    What we're going to see with the 9900K for gaming is a small minority of people taking Intel's mantra of "the best" and running with it. Technically, they're correct, but most normal people have budgets and other expenses to consider, including wives/gfs with their own cost tolerance limits. :D

    If someone can genuinely afford it then who cares, in the end it's their money, but as a choice for gaming it really only makes sense via the same rationale if they've also bought a 2080 Ti to go with it, though even there one could retort that two used 1080 Tis would be cheaper & faster (at least for those titles where SLI is functional).

    If anything good has come from this and the RTX launch, it's the move away from the supposed social benefit of having "the best"; the street cred is gone, now it just makes one look like a fool who was easily parted from his money.
  • Spunjji - Monday, October 22, 2018 - link

    Word.
  • Total Meltdowner - Sunday, October 21, 2018 - link

    This comment reads like shilling so hard. So hard. Please try harder to not be so obvious.
  • Spunjji - Monday, October 22, 2018 - link

    I think they placed just the right amount of emphasis on "value". Your post basically explains why it's not relevant for you in terms of you being an Intel fanboy with cash to burn. I'll elaborate.

    The MSRP is in the realm of irrational spending for a huge number of people. "Rational" here meaning "do I get out anything like what I put in", wherein the answer in all metrics is an obvious no.

    Following that, there are a HUGE number of reasons not to pre-order a high-end CPU, especially before proper results are out. Pre-ordering *anything* computer related is a dubious prospect, doubly so when the company selling it paid good money to paint a deceptive picture of their product's performance.

    Your assertion that Intel have never launched a bad CPU is false, reflecting either ignorance or wilful bias on your part. They have launched a whole bunch of terrible CPUs, from the P3 1.2GHz that never worked, through the P4 Emergency Edition and the early "dual-core" P4 processors, all the way through to this i9 9900K, which is the thirstiest "95W" CPU I've ever seen. Their notebook CPUs are now segregated in such a way that you have to read a review to find out how they will perform, because so much is left on the table in terms of achievable turbo boost limits.

    Sorry, I know I replied just to disagree, which may seem argumentative, but you posted a bunch of nonsense and half-truths passed off as common sense and/or logic. It's just bias; none of it does any harm, but you could at least be up-front that you prefer Intel. That in itself (I like Intel and am happy to spend top dollar) is a perfectly legitimate reason for everything you did. Just be open and don't actively mislead people who know less than you do.
  • chris.london - Friday, October 19, 2018 - link

    Hey Ryan. Thanks for the review.

    Would it be possible to check power consumption in a test in which the 2700x and 9900k perform similarly (maybe in a game)? POV-Ray seems like a good way to test for maximum power draw but it makes the 9900k look extremely inefficient (especially compared to the 9600k). It would be lovely to have another reference point.
  • 0ldman79 - Friday, October 19, 2018 - link

    I'm legitimately surprised.

    The 9900k is starving for bandwidth, needs more cache or something. I never expected it to *not* win the CPU benchmarks vs the 9700k. I honestly expected the 9700k to be the odd one out, more expensive than the i5 and slower than the 9900k. This isn't the case. Apparently SMT isn't enabling 100% usage of the CPU's resources; it is allowing a bottleneck due to fighting over resources. I'd love to see the 9900K run against its brethren with HT disabled.
