Total War: Warhammer

The last game in our lineup is Total War: Warhammer, a DX11 game that received official DX12 support a couple of months after launch. While the DX12 render path is still marked as beta, Warhammer was developed to some extent with DX12 in mind, with preview builds showcasing DX12 performance.

The built-in benchmark was used with Ultra settings without alterations.

While the DX12 render path was used for the AMD cards, there appear to be some oddities with 1080p performance. As mentioned earlier, we'd like to use the best-performing API for a given card; in this case, while DX12 improved performance at higher resolutions, we noticed a potential regression at 1080p. Unfortunately, due to time constraints, we weren't able to investigate further; as with Dawn of War III, it's possible that Warhammer at 1080p was CPU-bound as well.

[Charts: Total War: Warhammer average frame rates at 3840x2160, 2560x1440, and 1920x1080, Ultra quality]

[Charts: Total War: Warhammer 99th percentile frame rates at 3840x2160, 2560x1440, and 1920x1080, Ultra quality]

Comments

  • Ryan Smith - Tuesday, August 15, 2017 - link

    3 CUs per array is a maximum, not a fixed amount. Each Hawaii shader engine had a 4/4/3 configuration, for example.

    http://images.anandtech.com/doci/7457/HawaiiDiagra...

    So in the case of Vega 10, it should be a 3/3/3/3/2/2 configuration. (A quick arithmetic check follows this comment.)
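    As a quick sanity check on those figures, here's a minimal sketch in Python. The per-array splits come from the comment above; the four-shader-engine count for both chips is the standard configuration for Hawaii and Vega 10:

        # CU counts per shader-engine array, per the comment above.
        hawaii_engine = [4, 4, 3]           # Hawaii allowed up to 4 CUs per array
        vega10_engine = [3, 3, 3, 3, 2, 2]  # Vega caps each array at 3 CUs

        # Both GPUs use 4 shader engines.
        print(4 * sum(hawaii_engine))   # 44 -> Hawaii's full CU count
        print(4 * sum(vega10_engine))   # 64 -> Vega 10's full CU count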
  • watzupken - Tuesday, August 15, 2017 - link

    I think the performance is in line with recent rumors and my expectations. The fact that AMD beat around the bush in releasing Vega was a telltale sign. Unlike with Ryzen, where they marketed how well it ran in the likes of Cinebench and banged the gong, AMD revealed nothing on benchmarks for Vega throughout the year, just as they did when they first released Polaris.
    The hardware is no doubt forward-looking, but where it matters most, I feel AMD may have fallen short. It seems like the only way around this is probably to design a new GPU from scratch.
  • Yojimbo - Wednesday, August 16, 2017 - link

    "It seems like the way around is probably to design a new GPU from scratch. "

    Well, perhaps, but I do think with more money they could be doing better with what they've got. They made the decision to focus on reviving their CPU business with their resources, however.

    They probably have been laying the groundwork for an entirely new architecture for some time, though. My belief is that APUs were of primary concern when GCN was originally designed. They were hoping to enable heterogeneous computing, but it didn't work out. If that strategy did tie them down somewhat, their next-gen architecture should free them from those tethers.
  • Glock24 - Tuesday, August 15, 2017 - link

    Nice review, I'll say the outcome was expected given the Vega FE reviews.

    Other reviews state that the Vega 64 has a switch that sets the power limits, giving you "power saving", "normal", and "turbo" modes. From what I've read, the difference between the lowest and highest power limit is as much as 100W for about 8% more performance. (See the sketch after this comment.)

    It seems AMD did not reach the expected performance levels so they just boosted the clocks and voltage. Vega is like Skylake-X in that sense :P

    As others have mentioned, it would be great to have a comparison of Vega using Ryzen CPUs vs. Intel's CPUs.
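    To put that power-versus-performance trade-off in perspective, here's a minimal perf-per-watt sketch. The ~100W delta and ~8% gain come from the comment above, while the 220W power-save baseline and the performance index are purely illustrative assumptions:

        # Illustrative only: efficiency cost of the highest power limit.
        base_power, base_perf = 220.0, 100.0  # assumed power-save mode (W, perf index)
        turbo_power = base_power + 100.0      # ~100W higher limit, per other reviews
        turbo_perf = base_perf * 1.08         # ~8% more performance

        print(base_perf / base_power)    # ~0.45 perf/W in power-save mode
        print(turbo_perf / turbo_power)  # ~0.34 perf/W in turbo, ~26% worse efficiency

    Under these assumptions, the last 8% of performance costs roughly a quarter of the card's efficiency, which matches the general shape of the complaint.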
  • Vertexgaming - Wednesday, August 16, 2017 - link

    It sucks so much that price drops on GPUs aren't a thing anymore because of miners. I have been upgrading my GPU every year and getting an awesome deal on the newest-generation GPU, but now the situation has changed so much that I will have to skip a generation to justify a $600-$800 (higher-than-MSRP) price tag for a new graphics card. :-(
  • prateekprakash - Wednesday, August 16, 2017 - link

    In my opinion, it would have been great if Vega 64 had a 16GB VRAM version at $100 more... That would be $599 apiece for the air-cooled version... That would future-proof it to run future 4K games (CF would benefit too)...

    It's too bad we still don't have 16GB consumer gaming cards, the Vega pro line not being strictly for gamers...
  • Dosi - Wednesday, August 16, 2017 - link

    So the system consumes 91W more with the Vega 64; I can't imagine with the LC V64... could it be 140W more? What you save on the GPU (V64 instead of a 1080) you end up spending on the electricity bill... (Rough numbers follow this comment.)
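    That electricity claim is easy to put rough numbers on. A minimal sketch, where the 91W delta comes from the comment and the usage pattern and rate (4 hours of gaming a day at $0.12/kWh) are assumptions:

        # Rough annual cost of a 91W power delta; usage and rate are assumed.
        extra_watts = 91.0
        hours_per_day = 4.0   # assumed daily gaming time
        rate_per_kwh = 0.12   # assumed electricity rate, $/kWh

        kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365.0
        print(kwh_per_year * rate_per_kwh)  # ~$16/year at these assumptions

    At those assumptions it would take several years for the power delta to eat a $100 price gap, so the break-even point depends heavily on usage and local rates.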
  • versesuvius - Wednesday, August 16, 2017 - link

    NVIDIA obviously knows how to break GPU tasks down into chunks, process those chunks, and send them out the door better than AMD does. And more ROPs could certainly help AMD's cards a lot.
  • peevee - Thursday, August 17, 2017 - link

    "as electrons can only move so far on a single (ever shortening) clock cycle"

    Seriously? Electrons? You think that how far electrons move matters? Sheesh.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    Are you being serious or sarcastic? If serious, then you are ignorant. (The rough numbers below show why the distance point is real.)
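    For what it's worth, the distance-per-clock argument in the quoted article text is about signal propagation, and it's straightforward to estimate. A minimal sketch, where the ~1.5 GHz clock and the ~0.5c effective on-die propagation speed are illustrative assumptions:

        # How far a signal can propagate in one clock cycle; numbers illustrative.
        c = 3.0e8             # speed of light, m/s
        clock_hz = 1.5e9      # assumed Vega-class clock speed
        prop_speed = 0.5 * c  # assumed effective on-die signal speed

        cycle_s = 1.0 / clock_hz
        print(prop_speed * cycle_s * 100)  # ~10 cm per cycle, before gate delays

    Individual electrons do drift slowly, but the signal itself propagates at a sizable fraction of c, so at GPU clocks a signal can only cross on the order of centimeters per cycle; that is why physical distance on a large die matters.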
