Middle-earth: Shadow of War (DX11)

Next up is Middle-earth: Shadow of War, the sequel to Shadow of Mordor. Developed by Monolith, whose last major hit was arguably 2005's F.E.A.R., Shadow of Mordor returned the studio to the spotlight with its innovative NPC rival generation and interaction system, the Nemesis System, along with a storyline based on J.R.R. Tolkien's legendarium, all running on a heavily modified version of the engine that originally powered F.E.A.R.

Using the new LithTech Firebird engine, Shadow of War improves on its predecessor's detail and complexity, and with its free add-on high-resolution texture packs, it serves as a good example of getting the most graphics out of an engine that may not be bleeding edge. Shadow of War also supports HDR (HDR10).

We've updated some of the benchmark automation and data processing steps, so 1080p results may vary slightly compared to previously published data.

Shadow of War - 2560x1440 - Ultra Quality

Shadow of War - 1920x1080 - Ultra Quality

Shadow of War is known to be a bit of a video memory hog, and the GTX 960 suffers for it. As in Final Fantasy XV, the GTX 1660 Ti again delivers roughly triple the GTX 960's performance. The GTX 1660 Ti also opens a healthy lead over the GTX 1060 6GB; if framebuffer capacity is indeed a significant factor here, it's worth noting that the GTX 1660 Ti brings substantially more memory bandwidth to the table as well.

Comments

  • PeachNCream - Friday, February 22, 2019 - link

    This article reads a little like that infamous Steve Ballmer developers thing except it's not "developers, developers, developers, etc" but "traditional, traditional, traditionally, etc." instead. Please explore alternate expressions. The word in question implies long history which is something the computing industry lacks and the even shorter time periods referenced (a GPU generation or two) most certainly lack so the overuse stands out like a sore thumb in many of Anandtech's publications.
  • Oxford Guy - Saturday, February 23, 2019 - link

    How about the utterly asinine use of the word "kit" to describe a set of RAM sticks that simply snap into a motherboard?

    The Altair 8800 was a kit. The Heathkit H8 was a kit. Two sticks of RAM that snap into a board doth not a kit maketh.
  • futurepastnow - Friday, February 22, 2019 - link

    A triple-slot card? Really, EVGA?
  • PeachNCream - Friday, February 22, 2019 - link

    Yup, for 120W TDP of all things. But it's in the charts as a 2.75 slot width card so EVGA is probably hoping that no one understands how expansion slots actually would not permit the remaining .25 slot width to support anything.
  • darckhart - Friday, February 22, 2019 - link

    lol this was my first thought upon seeing the photo as well.
  • GreenReaper - Saturday, February 23, 2019 - link

    I suspect it was the cheapest way to get that level of cooling. A more compact heatsink-fan combo could have cost more.

    130W (which is the TDP here) is not a *trivial* amount to dissipate, and it's quite tightly packed.
  • Oxford Guy - Saturday, February 23, 2019 - link

    I think all performance GPUs should be triple slot. In fact, I think the GPU form factor is ridiculously obsolete.
  • Oxford Guy - Monday, February 25, 2019 - link

    Judging by techpowerup's reviews, though, the EVGA card's cooling is inefficient.
  • eastcoast_pete - Friday, February 22, 2019 - link

    @Ryan and Nate: What generation of HDMI and DP does the EVGA card have/support? Apologize if you had it listed and I missed it.
  • Ryan Smith - Friday, February 22, 2019 - link

    HDMI 2.0b, DisplayPort 1.4.
