Gaming: Ashes Classic (DX12)

Seen as the holy child of DirectX12, Ashes of the Singularity (AoTS, or just Ashes) has been the first title to actively explore as many of the DirectX12 features as it possibly can. Stardock, the developer behind the Nitrous engine which powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide-open shots and concentrated battles. With DirectX12 at the helm, the ability to issue more draw calls per second allows the engine to handle substantial unit depth and effects that other RTS titles had to achieve through combined draw calls, which ultimately made some combined unit structures very rigid.
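As a toy illustration (not engine code), the trade-off can be sketched as follows: a draw-call-limited engine merges units into fixed-size groups, paying a full call even for a partially filled group, while per-unit submission spends one call per unit but keeps every unit independent.

```python
# Toy model of draw-call budgeting, not actual Nitrous engine code.

def draw_calls_batched(num_units: int, batch_size: int) -> int:
    """Batched renderer: units are merged into rigid groups of batch_size,
    so a partially filled group still costs a full draw call."""
    return -(-num_units // batch_size)  # ceiling division

def draw_calls_per_unit(num_units: int) -> int:
    """Per-unit submission (feasible with DX12's higher draw-call
    throughput): one call per unit, each unit fully independent."""
    return num_units
```

With 1,000 units and a batch size of 64, the batched path needs only 16 calls but forces units into rigid formations; the per-unit path needs 1,000 calls, which is exactly the kind of load DX12's draw-call headroom is meant to absorb.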

Stardock clearly understands the importance of an in-game benchmark, ensuring that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark performs a four-minute fixed-seed battle with a variety of camera shots, and outputs a vast amount of data to analyze.
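The fixed seed is what makes run-to-run comparisons valid. A minimal sketch (hypothetical values, not Ashes' actual implementation) of how seeding a random generator reproduces the same battle on every run:

```python
import random

def simulate_battle(seed: int, num_events: int = 5) -> list:
    """Toy stand-in for a scripted battle: with a fixed seed, the same
    'random' unit behaviour is generated on every run, so any change in
    frame times comes from the hardware, not the scene."""
    rng = random.Random(seed)  # private generator, deterministic per seed
    return [rng.randrange(100) for _ in range(num_events)]

# Two runs with the same seed produce an identical workload.
assert simulate_battle(seed=42) == simulate_battle(seed=42)
```

Changing the seed would change the battle, which is why benchmark modes pin it: otherwise a "faster" result might simply be an easier scene.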

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. This version is easier to automate, as it lacks a splash screen, but still offers strong visual fidelity to test.

 

Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, ranging from Very Low to Extreme: we run our benchmarks at the above settings, and take the frame-time output for our average and percentile numbers.
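For reference, a minimal sketch of how average FPS and a 95th-percentile figure can be derived from per-frame times (an assumed methodology, not our exact pipeline):

```python
# Derive summary numbers from a list of per-frame render times in ms.

def average_fps(frame_times_ms: list) -> float:
    """Average FPS = frames rendered / total seconds elapsed."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_fps(frame_times_ms: list, pct: float = 95.0) -> float:
    """FPS at the pct-th percentile frame time (higher time = slower
    frame), using a simple nearest-rank pick on sorted frame times."""
    ordered = sorted(frame_times_ms)
    rank = max(0, min(len(ordered) - 1, round(pct / 100.0 * len(ordered)) - 1))
    return 1000.0 / ordered[rank]
```

Note that the average is computed from total elapsed time rather than by averaging per-frame FPS values, which would over-weight fast frames.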

 

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Average FPS and 95th Percentile, with columns for the IGP, Low, Medium, and High settings.]
Comments

  • Spunjji - Tuesday, May 26, 2020 - link

    Complaining at the reviewer for failing to test something that doesn't really get used is... a thing.
  • Datawhite - Thursday, May 21, 2020 - link

    Bring on Zen 3, AMD, then Intel can R.I.P. ......
    Still waiting for RDNA 2!
  • Samus - Thursday, May 21, 2020 - link

    No quad core under $100 basically just gave AMD the entire budget segment.

    Overall, this pricing is ridiculous but at least the 6C parts are somewhat competitive.
  • ph1nn - Thursday, May 21, 2020 - link

    Does Intel realize global climate change is a thing? This power consumption is an embarrassment; this company used to have the most efficient CPUs, now they draw 200W?!
  • Gastec - Friday, May 22, 2020 - link

    I don't understand what the climate change has to do with a 200W CPU power consumption. I would have understood something like "does Intel realize we have limited or non-existent incomes, given the current Pandemic situation?"
  • Beaver M. - Friday, May 22, 2020 - link

    I hope you buy a new PC only every 10 years.
  • pegnose - Friday, May 22, 2020 - link

    It looks to me like a simple re-ordering of the core-to-core latency chart for the 10900K removes the apparent 3-4 ns jump. You already mentioned that the core "names" do not necessarily represent hardware positions, Ian.

    Btw, I am curious why it seems that a higher core/thread index comes with higher latency. Adjacent cores should have low core-to-core latency. But 16-to-18 takes longer than 4-to-6. Is this due to address-checking in the ring-bus communication taking longer for higher indices?
  • Shaquille_Oatmeal - Friday, May 22, 2020 - link

    X570 chipset AMD boards can't be found in stock almost anywhere. This isn't news. But even today, days after Intel's 10th gen LGA1200 CPUs launched, and the arguably subjective reviews are finally made public, there's an endless supply of Z490 boards. PC enthusiasts do want the fastest CPUs, for sure, but we also consider the cost and [overall efficiency]. We are not 12 year old kids wanting the colorful RGB lights for our COD rig. No. The RGB lighting is a nice feature, but we're not idiots. These Intel CPUs are garbage based on even Intel's standards over the years; yet they are being marketed like they are the best CPUs. Intel, we can see the truth. And the truth is we won't touch these CPUs; perhaps if you dropped the price on the 10700K to $250 we can have a serious convo. Hopefully Intel gets their game together. I'm sure their OEM buyers are thinking the same.
  • Gastec - Friday, May 22, 2020 - link

    The way this is going I'm looking forward to that 32-core Intel consumer CPU, with 1000 W power draw, that will definitely give us those much needed 1000 fps @ 1080p
  • boozed - Saturday, May 23, 2020 - link

    Got a question about the game benchmarks. The table has an "IGP" column but the charts in that column have "GTX 1080" written on them. So which is it?
