Gaming: Ashes Classic (DX12)

Seen as the poster child of DirectX 12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DirectX 12 features as it possibly could. Stardock and Oxide, the teams behind the Nitrous engine that powers the game, have ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide open shots and concentrated battles. With DirectX 12 at the helm, the engine can issue many more draw calls per second, allowing it to render substantial unit depth and effects where other RTS titles had to rely on combined draw calls, which ultimately made some grouped unit structures very rigid.
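A toy sketch of the draw-call arithmetic behind that point (the unit and batch counts here are hypothetical, not figures from the game): under older APIs an engine batches many identical units into one combined draw call, so the whole group shares state and animates rigidly, whereas DX12's higher call throughput makes per-unit submission feasible.

```python
# Hypothetical illustration: combined batching vs. per-unit draw calls.
# Numbers are invented for the sketch, not measured from Ashes.

UNITS = 8000            # units on screen in a large battle
BATCH_SIZE = 100        # identical units merged into one combined call

# DX11-era approach: one draw call per batch; cheap, but every unit in
# a batch must share pose/state, so formations move rigidly.
combined_calls = UNITS // BATCH_SIZE

# DX12-era approach: one draw call per unit; each unit can be posed and
# shaded independently, at the cost of far more calls per frame.
per_unit_calls = UNITS

print(combined_calls, per_unit_calls)
```

The trade-off is exactly what the paragraph describes: 80 rigid batches versus 8,000 independent submissions, with DX12's lower per-call overhead making the latter practical.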

Stardock clearly understands the importance of an in-game benchmark, and made such a tool available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark runs a four-minute, fixed-seed battle with a variety of camera shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. This version is easier to automate, having no splash screen, but still offers strong visual fidelity to test.


Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, from Very Low to Extreme; we run our benchmarks at four settings levels (IGP, Low, Medium, and High) and take the frame-time output for our average and percentile numbers.
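Deriving the reported numbers from a frame-time log can be sketched as below. This is a minimal, generic illustration of how an average FPS and a 95th-percentile figure fall out of per-frame render times; it assumes a simple list of milliseconds as input, not Ashes' actual output format.

```python
# Sketch: average FPS and 95th-percentile FPS from a list of frame
# times in milliseconds (generic method, not Ashes' exact pipeline).

def summarize(frame_times_ms):
    """Return (average FPS, 95th-percentile FPS) for a frame-time log."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    # 95th-percentile frame time: 95% of frames rendered at least this
    # fast; slower outliers (stutter) dominate this metric.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(0.95 * len(ordered)))
    p95_ms = ordered[idx]
    p95_fps = 1000.0 / p95_ms  # reported as an FPS-equivalent
    return avg_fps, p95_fps

# Nine smooth ~60 FPS frames plus one 33.3 ms stutter frame:
avg, p95 = summarize([16.7] * 9 + [33.3])
```

Note how a single slow frame barely moves the average but sets the 95th-percentile figure, which is why percentile numbers are reported alongside averages.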


All of our benchmark results can also be found in our benchmark engine, Bench.

[Benchmark graphs: Average FPS and 95th Percentile at IGP, Low, Medium, and High settings]
220 Comments

  • arashi - Sunday, May 24, 2020 - link

    Replacing Stewart with xx does not a clone account make.

    Try again.
  • Spunjji - Tuesday, May 26, 2020 - link

    Good catch XD
  • Spunjji - Tuesday, May 26, 2020 - link

    You're talking past yourself.

    Sure, it's impressive what Intel's disaster management engineers managed to pull out of the wreckage of their failure at 10nm. Their failure at 10nm was an engineering failure too, though, and they still haven't managed to backport their 10nm-planned architecture to 14nm.

    In other words, those engineering failures are the only reason they had to build this crazy nonsense - of which you express such admiration - in the first place.
  • extide - Wednesday, May 20, 2020 - link

    This is not HEDT
  • Spunjji - Tuesday, May 26, 2020 - link

    He's still reading from the 2016 Intel playbook :D
  • Icehawk - Saturday, May 23, 2020 - link

    I care because I like silent machines and use fanless PSUs. I can’t afford to blow 250-300W of the power budget on the CPU when I am limited to 450W, the small difference in real world gaming isn’t worth popping for a higher power PSU that brings with it fan noise. I should be able to run my 3900X with a nV 3070 with what I have, I don’t think I could with this i9.

    If power budget isn’t a concern then it’s down to brand preference, usage mix, etc to me. I have an intel 8700 as well, at the time I felt that was the best CPU choice, when I needed another new machine a few months ago the 3900 was - I still feel it would be today for me.

    YMMV
  • Spunjji - Tuesday, May 26, 2020 - link

    Cool, another person who thinks their personal views on a topic outweigh all others and is psychologically projecting that onto the reviewer. This is how 90% of disinformation works now...
  • prophet001 - Wednesday, May 20, 2020 - link

    I'm curious as to why this only has 16 pcie lanes into the CPU. How much does running your high performance SSD through the PCH or running your GPU in x8 mode affect performance?
  • GreenReaper - Wednesday, May 20, 2020 - link

    Conveniently, there is an article (almost) about that: https://www.anandtech.com/show/15720/intel-ghost-c...
  • azfacea - Wednesday, May 20, 2020 - link

    with intel DIY PC marketshare being well below 50% and 10th gen itself having to compete with 9th, 8th, 7th, with supply shortage and everything I doubt these new LGA1200 motherboards can reach 10% of DIY PC which means the

    " ... 44+ entrants ranging from $150 all the way up to $1200 ..."

    are all massive cash-burning operations that would never make sense in a million years without Intel "development funding". They are literally squandering billions of dollars that they took from ripping off customers. Intel is so stupid, gouging its customers like this and then squandering the money for what? LGA 1200 has the option to have PCIe 4 by the time it's irrelevant? My god, WTF is going on there.
