Gaming: Ashes Classic (DX12)

Seen as the holy child of DirectX 12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DirectX 12 features as it possibly could. Oxide Games, the developer behind the Nitrous engine which powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide-open shots and concentrated battles. With DirectX 12 at the helm, the ability to issue more draw calls per second allows the engine to handle substantial unit depth and effects that other RTS titles could only achieve by batching units into combined draw calls, which makes such combined unit structures ultimately very rigid.
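To illustrate the trade-off, here is a minimal sketch of per-unit versus batched draw submission. None of these names come from the Nitrous engine or any real graphics API; they are hypothetical, and only show why batching makes unit groups rigid:

```python
# Hypothetical sketch: per-unit vs. batched draw submission.
# All class and function names here are illustrative, not a real API.

class Renderer:
    def __init__(self):
        self.draw_calls = 0

    def draw(self, geometry, transform):
        # Each submission costs one draw call.
        self.draw_calls += 1

def draw_units_individually(renderer, units):
    # DX12-style: draw calls are cheap, so every unit can be
    # transformed and animated independently.
    for mesh, transform in units:
        renderer.draw(mesh, transform)

def draw_units_batched(renderer, units):
    # DX11-style workaround: merge all unit geometry into one
    # combined mesh and submit it once. Far fewer draw calls, but
    # the group becomes rigid -- no unit can move independently
    # without rebuilding the whole batch.
    combined = [mesh for mesh, _ in units]
    renderer.draw(combined, transform=None)

units = [("tank_mesh", (float(x), 0.0)) for x in range(1000)]
r1, r2 = Renderer(), Renderer()
draw_units_individually(r1, units)   # 1000 draw calls
draw_units_batched(r2, units)        # 1 draw call
```

The batched path is why older RTS engines often animated whole squads as a single block: the per-call overhead under DX11 made individual submission too expensive.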

The developers clearly understood the importance of an in-game benchmark, ensuring that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important. The in-game benchmark performs a four-minute, fixed-seed battle with a variety of camera shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. This version is easier to automate, as it has no splash screen, but still offers strong visual fidelity to test.

AnandTech CPU Gaming 2019 Game List
Game            Genre   Release Date   API    IGP             Low              Med              High
Ashes: Classic  RTS     Mar 2016       DX12   720p Standard   1080p Standard   1440p Standard   4K Standard

Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, from Very Low to Extreme; we run our benchmarks at the above settings and take the frame-time output for our average and percentile numbers.
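The average and 95th percentile figures can be derived from a frame-time log along these lines (a generic sketch; the benchmark's actual output format and the frame-time values below are illustrative assumptions, not real results):

```python
# Derive average FPS and a 95th-percentile frame rate from a list
# of frame times in milliseconds (illustrative values only).

def average_fps(frame_times_ms):
    # Total frames divided by total time, converted to seconds.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_fps(frame_times_ms, pct=95):
    # The 95th-percentile frame *time* (a slow frame) maps to a low
    # frame *rate*, which is how "95th percentile" FPS is reported.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(pct / 100.0 * len(ordered))))
    return 1000.0 / ordered[idx]

# 95 frames at ~60 FPS (16.7 ms) plus a few slow 33.3 ms frames.
frames = [16.7] * 95 + [33.3] * 5
print(round(average_fps(frames), 1))     # → 57.0
print(round(percentile_fps(frames), 1))  # → 30.0
```

The percentile number penalizes stutter that a plain average hides: a handful of slow frames barely moves the average FPS but drags the 95th percentile figure down sharply.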

All of our benchmark results can also be found in our benchmark engine, Bench.

[Graphs: Average FPS and 95th Percentile results at the IGP, Low, Medium, and High settings]

At the lowest resolutions, the 2500X has the high ground, but cedes it to the 8350K as the resolution ramps up.


65 Comments


  • Le Québécois - Monday, February 11, 2019 - link

    Ian, any reason why more often than not, you seem to "skip" 1440 in your benchmarks? It's only present for a few games.

    Considering the GTX 1080, your best card, is always the bottleneck at 4K, as your numbers show, wouldn't it make more sense to focus more on 1440 instead?

    Especially considering it's the "best" resolution on the market if you are looking for high pixel density yet still want to run your games at playable levels of fps.
  • Ian Cutress - Monday, February 11, 2019 - link

    Some benchmarks are run at 1440p. Some go up to 8K. It's a mix. There's what, 10 games there? Not all of them have to conform to the same testing settings.
  • Le Québécois - Tuesday, February 12, 2019 - link

    Sorry for the confusion. I can clearly see we've got very different settings in that mix. I guess a more direct question would be: why do it this way and not with a more standardized series of tests?

    A followup question would also be: why 8K? You are already GPU limited at 4K, so your 8K results are not going to give any relevant information about those CPUs.

    Sorry, I don't mean to criticize, I simply wish to understand your thought process.
  • MrSpadge - Monday, February 11, 2019 - link

    What exactly do you want to see there that you can't see at 1080p? Differences between CPUs are going to be muddied due to approaching the GPU limit, and that's it.
  • Le Québécois - Tuesday, February 12, 2019 - link

    Well, at 1080p you can definitely see the difference between them, and exactly as you said, at 4K it's all the same because of the GPU limitation. 1440p seems more relevant than 4K considering this. This is, after all, a CPU review, and most of the 4K results could be summed up by "they all perform within a few %".
  • neblogai - Monday, February 11, 2019 - link

    End of page 19: R5 2600 is really 65W TDP, not 95W.
  • Ian Cutress - Monday, February 11, 2019 - link

    Doh, a typo in all my graphs too. Should be updated.
  • imaheadcase - Monday, February 11, 2019 - link

    I'm on my phone on AT and can truly see how terrible the ads are now. AT is straight up letting scam ads be served because it's desperate for revenue. 😂
  • PeachNCream - Monday, February 11, 2019 - link

    Is there a point in even mentioning that, given how little control they now have over advertising? Just fire up the ad blocker or visit another site and let the new owners figure it out the hard way.
  • StevoLincolnite - Tuesday, February 12, 2019 - link

    Anandtech had malware/viruses infect its userbase years ago via crappy adverts.

    That was the moment I got Ad-Block. And that is the moment where I will never turn it off again.
