Gaming: Ashes Classic (DX12)

Seen as the poster child for DirectX 12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DirectX 12 features as it possibly could. Stardock, the developer behind the Nitrous engine that powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, both in wide open shots and in concentrated battles. With DirectX 12 at the helm, the ability to issue more draw calls per second allows the engine to handle substantial unit depth and effects that other RTS titles have had to achieve by combining draw calls, which made some of those combined unit structures ultimately very rigid.
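To illustrate the difference, here is a rough sketch only, not Nitrous engine code: the Unit struct, root signature layout, and function names are hypothetical. The idea is that a DirectX 12 renderer can afford to record one draw call per unit, whereas under older APIs the per-call CPU overhead pushed engines toward pre-combining units into a single batched draw.

```cpp
// Hypothetical sketch of per-unit vs. batched draw calls on a D3D12 command list.
// Not Nitrous engine code; 'Unit' and the constant buffer layout are assumptions.
#include <d3d12.h>
#include <vector>

struct Unit {
    D3D12_GPU_VIRTUAL_ADDRESS perUnitConstants; // world transform, team colour, etc.
    UINT indexCount;                            // indices for this unit's mesh
};

// With DX12's lower per-call CPU cost, each unit can keep its own draw call,
// so individual units remain independently animated and destructible.
void RecordUnitDraws(ID3D12GraphicsCommandList* cmdList,
                     const std::vector<Unit>& units)
{
    for (const Unit& u : units) {
        cmdList->SetGraphicsRootConstantBufferView(0, u.perUnitConstants);
        cmdList->DrawIndexedInstanced(u.indexCount, 1, 0, 0, 0);
    }
}

// Under older APIs the per-call overhead encouraged merging many units into one
// pre-combined mesh and issuing a single draw: cheaper on the CPU, but the merged
// formation becomes rigid.
void RecordBatchedDraw(ID3D12GraphicsCommandList* cmdList,
                       D3D12_GPU_VIRTUAL_ADDRESS batchConstants,
                       UINT combinedIndexCount)
{
    cmdList->SetGraphicsRootConstantBufferView(0, batchConstants);
    cmdList->DrawIndexedInstanced(combinedIndexCount, 1, 0, 0, 0);
}
```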

Stardock clearly understands the importance of an in-game benchmark, and made such a tool available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark runs a four-minute, fixed-seed battle with a variety of camera shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. It is easier to automate, having no splash screen, but still offers strong visual fidelity to test.

 

Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, from Very Low to Extreme: we run our benchmarks at the settings noted above, and take the frame-time output for our average and percentile numbers.
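For reference, the average and percentile figures below can be derived from a list of per-frame times roughly like this. This is a minimal sketch: the hard-coded sample values are purely illustrative, and the real numbers come from the benchmark's own frame-time log.

```cpp
// Minimal sketch: derive average FPS and a 95th-percentile figure from a list
// of per-frame times in milliseconds. The sample values are illustrative only;
// the actual layout of the Ashes benchmark output is not reproduced here.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <numeric>
#include <vector>

int main()
{
    // Frame times in ms would normally be parsed from the benchmark's log.
    std::vector<double> frameTimesMs = {16.4, 17.1, 15.9, 22.8, 16.7, 18.3};

    const double totalMs =
        std::accumulate(frameTimesMs.begin(), frameTimesMs.end(), 0.0);
    const double averageFps = 1000.0 * frameTimesMs.size() / totalMs;

    // 95th-percentile frame time: 95% of frames complete at least this fast.
    std::vector<double> sorted = frameTimesMs;
    std::sort(sorted.begin(), sorted.end());
    const size_t idx =
        static_cast<size_t>(std::ceil(0.95 * sorted.size())) - 1;
    const double p95FrameTimeMs = sorted[idx];

    std::cout << "Average FPS:         " << averageFps << "\n"
              << "95th pct frame time: " << p95FrameTimeMs << " ms ("
              << 1000.0 / p95FrameTimeMs << " FPS)\n";
    return 0;
}
```

The 95th-percentile figure captures the slower frames that the average hides, which is why it is reported alongside average FPS as an indicator of consistency.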

 

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Ashes Classic (DX12) average FPS and 95th percentile results at IGP, Low, Medium, and High settings]
220 Comments

  • yeeeeman - Wednesday, May 20, 2020 - link

    The CPU won't consume anywhere near 250W during gaming. 250W is valid only for short all-core scenarios. Otherwise it will stay within its 130W TDP. Go and read other reviews and you will see I am right.
  • yankeeDDL - Thursday, May 21, 2020 - link

    According to this (https://images.anandtech.com/doci/15785/10900K%20y... it stays at 230W for almost 4min.
    In any case, you can read my sentence again and use 130W instead of 250W, and it doesn't change anything.
  • arashi - Saturday, May 23, 2020 - link

    You can't blame him, he's on Intel payroll and has to act the idiot.
  • dirkdigles - Wednesday, May 20, 2020 - link

    Ian, I think the pricing on the charts is a bit misleading. The $488 price for the 10900K is the 1000-unit bulk pricing, and the $499 price on the 3900X hasn't been seen since January 2020... it's currently $409 on Amazon. This skews the reader's ability to make a comparison.

    I know MSRP is a good metric, but street price is more important. What can I buy these chips for, today? If I'm a consumer, I likely can't get that $488 bulk per chip price for the 10900K, and the 3900X is not going to cost me anywhere near $409. Please update.
  • dirkdigles - Wednesday, May 20, 2020 - link

    *anywhere near $499. Typo.
  • WaltC - Wednesday, May 20, 2020 - link

    Yes, I paid ~$409 for my 3900X, and on top of that AMZN offered me 6-months, same-as-cash, which I was more than happy to accept...;) Good times!
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Exactly, the 3900X is over $100 cheaper and is nowhere near "around the same price".
  • yeeeeman - Wednesday, May 20, 2020 - link

    Well, Intel has the 10900F at $400. Locked, with no iGPU, and almost the same frequencies. That is a better buy than the 10900K.
  • Spunjji - Tuesday, May 26, 2020 - link

    Right - the 10900F is likely a better deal, but the comparison was with the 10900K.
  • Irata - Wednesday, May 20, 2020 - link

    Waiting for comments on how the two small fans on the mainboard make this an unacceptable option. If I remember correctly, that applied to X570 boards.
