Total War: Warhammer II (DX11)

Last in our 2018 game suite is Total War: Warhammer II, built on the same engine as Total War: Warhammer. While there is a more recent Total War title, Total War Saga: Thrones of Britannia, that game was built on the 32-bit version of the engine. The first TW: Warhammer was a DX11 game but was developed to some extent with DX12 in mind, with preview builds showcasing DX12 performance. In Warhammer II, however, the matter appears to have been dropped: DX12 mode is still marked as beta, and it brings performance regressions for both vendors.

It's unfortunate because Creative Assembly themselves have acknowledged the CPU-bound nature of their games, and with engines being re-used for spin-off titles, DX12 optimization would have continued to provide benefits, especially if the future of graphics in RTS-type games leans towards low-level APIs.

There are now three benchmarks with varying graphics and processor loads; we've opted for the Battle benchmark, which appears to be the most graphics-bound.

Total War: Warhammer II - 2560x1440 - Ultra Quality

Total War: Warhammer II - 1920x1080 - Ultra Quality

Rounding out our look at game performance is Total War: Warhammer II. Here, the GTX 1660 neatly splits the difference between the GTX 1060 6GB/RX 590 and the GTX 1660 Ti.

77 Comments


  • The_Assimilator - Friday, March 15, 2019 - link

    1660 Ti power usage: more for GPU, less for GDDR6. 1660: less for GPU (due to 2 fewer SMs), but more for GDDR5. Hence why overall power usage for both is the same. What I still don't understand is why all of these cards, despite being rated to draw under 150W, come with 8-pin power connectors; 6-pin would make far more sense and would make them compatible with many older systems.
  • Alistair - Friday, March 15, 2019 - link

    They are still holding back. This would have been an incredible 7nm card. That's still what I want. Not interested.
  • backpackbrady - Saturday, March 16, 2019 - link

    Amazing post, Ryan/Nate! Hoping you could answer a question beyond my knowledge. Would the 1660's hardware-based encoder (NVENC) be at a disadvantage given the TU116 and GDDR5 changes? I'm not sure what affects the encoder's performance. Thank you very much for your time and knowledge. Brady
  • Hrel - Tuesday, March 19, 2019 - link

    Suddenly Nvidia's pricing seems completely fair.
  • Supercell99 - Thursday, March 28, 2019 - link

    The Chinese are done dumping aftermarket GFX cards. The used market is drying up.
  • Hrel - Saturday, March 30, 2019 - link

    This is looking like one hell of a good card for the money and the market. Faster than the RX 580 and RX 590, priced like a cheap 590 or average 580, less power draw, runs cooler, and includes Nvidia's (frankly) superior software and drivers. So right now it's either a used GTX 1070 or a new GTX 1660; the 1070 should be about the same price even used. The only cheaper ones I found were crypto mining cards, and F that noise.

    There are some technology differences but idk, you guys don't seem to go into great detail about the differences between GTX 1070 and GTX 1660 excluding game performance. Are there any notable DX features included in the newer card or is it just straight performance improvement?
  • Hrel - Saturday, March 30, 2019 - link

    I think a year or so from now I'll pick one of these up, either the 1660 or the Ti; it will depend on the pricing at the time.
