Total War: Warhammer II (DX11)

Last in our 2018 game suite is Total War: Warhammer II, built on the same engine as Total War: Warhammer. While there is a more recent Total War title, Total War Saga: Thrones of Britannia, that game was built on the 32-bit version of the engine. The first TW: Warhammer was a DX11 game that was, to some extent, developed with DX12 in mind, with preview builds showcasing DX12 performance. In Warhammer II, however, the matter appears to have been dropped: DX12 mode is still marked as beta, and it brings performance regressions for both vendors.

It's unfortunate because Creative Assembly themselves have acknowledged the CPU-bound nature of their games, and with the re-use of game engines for spin-offs, DX12 optimization would have continued to provide benefits, especially if the future of graphics in RTS-type games leans towards low-level APIs.

There are now three benchmarks with varying graphics and processor loads; we've opted for the Battle benchmark, which appears to be the most graphics-bound.

[Charts: Total War: Warhammer II at 3840x2160, 2560x1440, and 1920x1080 - Ultra Quality]

Again, the RTX 2070 slots into a familiar position: faster than the GTX 1070 by 30 to 40%, yet only around 10% or so ahead of the GTX 1080. The RTX 2080, for its part, would still need to make up ground to catch the GTX 1080 Ti.

At 1080p, the cards quickly run into a CPU bottleneck, which is to be expected with top-tier video cards and the CPU-intensive nature of RTSes. Even so, the Founders Edition specs, which don't materially change the RTX 2070's positioning, are helpful for the RTX 2080 in pipping the GTX 1080 Ti.

Comments

  • Arbie - Tuesday, October 16, 2018 - link

    Thanks for including Ashes Escalation in the results. I hope you will continue to do so. This is a unique game with great features.
  • abufrejoval - Tuesday, October 16, 2018 - link

    I find a lot of the discussions around here odd: Lots of people trying to convince each other that only their choice makes any sense… Please, let’s just enjoy that there are a lot more choices, even if that can be difficult.

    For me, compute pays the rent; gaming is a side benefit. So I aimed for maximum GPU memory and lowest noise, because training neural networks can take a long time and I don't have an extra room to spare. It was a GTX 1070 from Zotac, 150 Watt TDP, compact, low noise at high loads, not exactly a top performer in games, OK at 1080p but slightly overwhelmed here and there with my Oculus Rift CV1, although quite OK with the DK2. I added a GTX 1050ti to another box mostly because it would do video conversion just as fast, but run extremely quietly and at next to zero power on that 24x7 machine.

    Then I made a 'mistake': I bought a 43” 4k monitor to replace a threesome of 24” 1080 screens.

    Naturally, games now wouldn't focus on one of those, but on the big screen, which has 4x the number of pixels. With a screen so big and so close, I cannot really take in all the pixels at all times, but when I swivel my head I will notice whether the pixels in my focus are sharp or blurred, so cutting down on resolution or quality won't really do.

    I replaced the 1070 with the top gun available at the time, a GTX 1080ti.

    Actually, it wasn't really the top gun: I got a Zotac Mini, which again was nicely compact and low noise, does perfectly fine for GPU compute, but will settle at 180 Watts for anything long-term. It's very hard to achieve better than 70% utilization on GPU machine learning jobs, so all of these GPUs (except a mobile 1070) tend to stay very quiet.

    A desperate friend took the 1050ti off my hands, because he needed something that wouldn't require extra power connectors, so I chipped in some extra dosh and got a GTX 1060 (6GB) to replace it. Again I went for a model recommended for low noise, from MSI, but was shocked when I unpacked it to see that it was vastly bigger than the 1080ti in every direction. It was, however, very quiet even at top gaming loads, a hard squeeze to fit inside the chassis but a perfect fit for 'noise', and surprisingly adequate for 1080p gaming at 120 Watts.

    The reason I keep quoting those Watts is my observation that wattage is perhaps a better indicator of effective GPU power than the chip itself, as long as generation and process size are the same: there is remarkably little difference between the high-clocked 1060 at 120 Watts, the average-clocked 1070 at 150 Watts, and the low-clocked 1080ti at 180 Watts. Yes, the 1080ti will go to 250 Watts for bursts and deliver accordingly. But physics soon weighs in on that 1080ti, and increasing fan speed does nothing but add noise, because surface area, much like displacement in an engine, is hard to replace.

    I got an RTX 2080ti last week because I want to explore INT8 and INT4 for machine-learning inference vs. FP16 or FP32 training: a V100 only gives me FP16 plus some extra cores and bandwidth while it costs 4x as much, even among friends. That makes the Turing-based consumer product an extremely good deal for my use case: I don't care about FP64, ECC or GPU virtualization enough to pay the Tesla/Quadro premiums. (A rough sketch of the kind of precision probe I have in mind is at the end of this post.)

    And while the ML stuff will take weeks if not months to figure out and measure, I am glad to report that the Palit RTX 2080ti (the only one available around here) turned out to be nicely quiet and finally potent enough to run ARK: Survival Evolved at full quality at 4k without lag. Physically it's a monster, but that also means it sustains 250 Watts throughout. That's exactly how much a GTX 980ti and an R290X gulped from the mains inside that very same 18-core Xeon box, but with performance increases harking back to the best times of Gordon Moore's prediction.

    IMHO discussions about the 2xxx delivering 15% more speed at 40% higher prices vs. 1xxx GPUs are meaningless: 15 FPS vs. 9 FPS or 250 FPS vs. 180 FPS are academic. The GTX 1080ti failed at 4k; I had to either compromise quality or go down to 3k, and I liked neither. The RTX 2080ti won't deliver 100 FPS at 4k: I couldn't care less! But it never drops below 25 FPS either, and that makes it worth all the money to a gamer, while INT8 and INT4 compute will actually pay the bill for me.

    I can't imagine buying an RTX 1070 for myself, because I have enough systems and choices. But even I can imagine how someone would want the ability to explore ray-tracing or machine learning on a budget that offers a choice between a GTX 1080ti and an RTX 1070: not an easy compromise to make, but a perfectly valid choice made millions of times.

    Don't waste breath or keystrokes on being 'religious' about GPU choices: enjoy a new generation of compute and a bit of quality gaming on the side!
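
    As promised above, here is the kind of quick probe I have in mind, purely as a sketch: timing large matrix multiplies in FP32 vs. FP16 on the GPU with PyTorch (assuming a CUDA-enabled PyTorch build; the matrix size and iteration count are arbitrary). The INT8/INT4 inference paths I actually want to explore typically go through TensorRT or CUTLASS rather than plain PyTorch, so this only covers the floating-point side.

        import time
        import torch

        def matmul_tflops(dtype, n=4096, iters=50):
            # Time repeated n x n matrix multiplies on the GPU and report effective TFLOPS.
            a = torch.randn(n, n, device="cuda", dtype=dtype)
            b = torch.randn(n, n, device="cuda", dtype=dtype)
            for _ in range(5):                      # warm-up: exclude CUDA/cuBLAS initialization
                torch.matmul(a, b)
            torch.cuda.synchronize()
            start = time.time()
            for _ in range(iters):
                torch.matmul(a, b)
            torch.cuda.synchronize()
            elapsed = time.time() - start
            return 2 * n ** 3 * iters / elapsed / 1e12   # ~2*n^3 FLOPs per matmul

        print("FP32: %.1f TFLOPS" % matmul_tflops(torch.float32))
        print("FP16: %.1f TFLOPS" % matmul_tflops(torch.float16))

    The FP16 path is where the Tensor Cores should show up; the gap between the two numbers is a rough indicator of how much of that extra throughput a given card actually sustains.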
  • abufrejoval - Tuesday, October 16, 2018 - link

    s/RTX 1070/RTX 2070 above: want edit! It's this RTX 2070 which may not make a lot of sense to pure-blooded gamers, unless they are sure that they will continue to run at 1920x1080 over the next couple of years (where a GTX 1080ti is overkill) *and* want to try next-generation graphics.
  • Flunk - Tuesday, October 16, 2018 - link

    So Nvidia has decided to push all their card numbers down one because AMD isn't competitive at the moment. The 2060 is now the 2070, the 2070 is the 2080, and the 2080 is the 2080 Ti. This sort of hubris is just calling out for a competitor to arrive and sell a more competitively priced product.

    As for ray tracing, I'll eat my hat if the 2070 can handle ray-tracing in real games at reasonable frame-rates and real resolutions when they arrive.
  • Kakti - Tuesday, October 16, 2018 - link

    TBH... who gives a crap? With the advent of usable integrated GPUs from Intel and AMD, dGPU vendors are basically no longer making x20, x30 or x40 cards. So maybe they're just pushing up the product stack: instead of "enthusiasts" buying x60, x70 and x80 cards, we'll now be buying x50, x60, x70 and halo x80 products. I couldn't care less what the badge number on my card is; what I care about is performance vs. price.

    That said, I don't think I'll ever buy a dGPU for more than $400. The highest I've ever paid was, I think, ~$350 for my 970 or 670. As long as there's a reasonably competitive card in the $300-$400 USD range, I don't care what they call it - it could be an RTX 2110 and I'll snap it up. Given the products NVidia has released so far under the RTX line, I'm going to wait and see what develops. Either I'll grab a cheap used 1080/1080ti or wait for smaller and cheaper 2100 cards. NV can ask whatever they want for a card, but at the end of the day most consumers have a price ceiling above which they won't purchase anything. Seems like a lot of people are in the $350-500 range, so either prices will have to come down or cheaper products will come out. I'm curious whether NV will make any more GTX cards, since Tensor cores not only aren't that usable right now, but dramatically increase the fab cost given their size and complexity.
  • Yojimbo - Wednesday, October 17, 2018 - link

    Nahh, look at the die sizes. The 2080 is bigger than the 1080 Ti. The 2070 is bigger than the 1080. The price/performance changes are not because NVIDIA is pushing the cards down one; it's entirely because of the resources spent on ray tracing capabilities. As for the 2070's ability to handle ray tracing, we won't really know for a few more months.

    As for competitors, if AMD had a competitive product now they might be cleaning up. But since they don't, by the time they or some other competitor (Intel) does arrive they will probably need real time ray tracing to compete.

    No one is forcing you to buy an RTX. If you're not interested in real time ray tracing you probably shouldn't be buying an RTX, and the introduction of RTX has forced the 10 series (and probably soon the Vega and Polaris series) prices down.
  • Voodoo2-SLI - Tuesday, October 16, 2018 - link

    WQHD Performance Index for AnandTech's GeForce RTX 2070 Launch Review

    165.1% ... GeForce RTX 2080 Ti FE
    137.5% ... GeForce RTX 2080 FE
    115.3% ... GeForce RTX 2070 FE
    110.6% ... GeForce RTX 2070 Reference
    126.8% ... GeForce GTX 1080 Ti FE
    100% ..... GeForce GTX 1080 FE
    81.7% .... GeForce GTX 1070 FE
    99.2% .... Radeon RX Vega 64 Reference

    Index compiled from 15 other launch reviews, with an overall performance index for the GeForce RTX 2070 launch, here:
    https://www.3dcenter.org/news/geforce-rtx-2070-lau...
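
    For anyone wondering how a relative index like this is put together: the usual approach is a mean (often geometric) of per-game FPS ratios, normalized so the baseline card equals 100%. A minimal sketch with made-up FPS numbers, purely to illustrate the arithmetic (these are placeholders, not the review data or 3DCenter's exact weighting):

        from math import prod

        # Hypothetical per-game WQHD FPS results -- placeholder values, not review data.
        baseline_fps  = {"Game A": 80.0, "Game B": 60.0, "Game C": 45.0}   # e.g. GTX 1080 FE
        candidate_fps = {"Game A": 95.0, "Game B": 68.0, "Game C": 51.0}   # card under test

        def perf_index(card, baseline):
            # Geometric mean of per-game FPS ratios, scaled so the baseline card = 100%.
            ratios = [card[g] / baseline[g] for g in baseline]
            return 100.0 * prod(ratios) ** (1.0 / len(ratios))

        print("%.1f%%" % perf_index(candidate_fps, baseline_fps))   # ~115% vs. the baseline card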
  • risa2000 - Wednesday, October 17, 2018 - link

    What exactly is the "RTX 2080" that is bouncing around the tables? I did not find any reference to it in the test description chapter. I assumed it could be a "stock clocked" RTX 2080 FE, but these cards do not always perform in the expected order (sometimes the 2080 beats the 2080 FE).

    Also, in the temp and noise section there are two cards, the 2080 and the "2080 (baseline)", which again give quite different thermal and noise results.
  • Achaios - Wednesday, October 17, 2018 - link

    Too much blabbering in the comments section. Way I see it:

    The RTX 2070 offers the same performance as a GTX 1080, is significantly more expensive than the GTX 1080, and is less power efficient and hotter at the same time.

    /Thread
  • milkod2001 - Wednesday, October 17, 2018 - link

    Well and accurately said.
    +1
