Gaming: Ashes Classic (DX12)

Seen as the poster child of DirectX12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DirectX12 features as it possibly could. Oxide Games, the developer behind the Nitrous engine that powers the game, and publisher Stardock have ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide-open shots and concentrated battles. With DirectX12 at the helm, the engine can issue far more draw calls per second, giving it substantial unit depth and effects rendered per unit; other RTS titles had to rely on combined draw calls to reach similar scale, which made some combined unit structures ultimately very rigid.
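A minimal back-of-the-envelope sketch of this trade-off, in Python, is below. The per-draw CPU costs are invented for illustration and are not measurements of the Nitrous engine or of either API; the point is only that cheaper draw call submission lets an engine draw each unit individually rather than batching units into rigid combined structures.

```python
# Illustrative only: invented per-draw CPU costs, not measurements.
# A frame at 60 fps has a ~16.7 ms budget; the number of draw calls an
# engine can afford per frame scales inversely with the CPU cost of
# submitting one.

FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

# Hypothetical submission costs (ms of CPU time per draw call).
COST_PER_DRAW = {
    "DX11-style (higher driver overhead)": 0.02,
    "DX12-style (lower submission cost)": 0.002,
}

UNITS_ON_SCREEN = 5000  # a large RTS battle

for api, cost_ms in COST_PER_DRAW.items():
    max_draws = int(FRAME_BUDGET_MS / cost_ms)
    # If each unit gets its own draw call, do all units fit in one frame?
    per_unit_ok = max_draws >= UNITS_ON_SCREEN
    print(f"{api}: ~{max_draws} draws/frame -> "
          f"{'per-unit draws fit' if per_unit_ok else 'must batch units'} "
          f"for {UNITS_ON_SCREEN} units")
```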

Stardock clearly understood the importance of an in-game benchmark, ensuring that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark performs a four-minute, fixed-seed battle with a variety of camera shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. This version is easier to automate, as it has no splash screen, but still offers strong visual fidelity to test.


Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, and Textures, along with separate options for the terrain. There are several presets, from Very Low to Extreme; we run our benchmarks at fixed settings and take the frame-time output for our average and percentile numbers.
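As a sketch of how frame-time output reduces to the numbers we report, the snippet below assumes a plain list of per-frame times in milliseconds; the actual benchmark export format may differ. A common convention, assumed here, is that average FPS is total frames divided by total time, and the 95th percentile figure is the FPS equivalent of the 95th-percentile frame time.

```python
# Sketch: reduce a list of frame times (ms) to average FPS and a
# 95th-percentile FPS figure. The input format is assumed, not taken
# from the actual benchmark export.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    total_ms = sum(frame_times_ms)
    avg_fps = 1000.0 * n / total_ms  # total frames over total time

    # 95th-percentile frame time: ~95% of frames completed at least this fast.
    boundary_ms = sorted(frame_times_ms)[int(0.95 * (n - 1))]
    p95_fps = 1000.0 / boundary_ms
    return avg_fps, p95_fps

# Example with made-up frame times (ms).
times = [16.7] * 90 + [33.3] * 10  # mostly 60 fps, with some 30 fps hitches
avg, p95 = summarize(times)
print(f"Average: {avg:.1f} fps, 95th percentile: {p95:.1f} fps")
```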


All of our benchmark results can also be found in our benchmark engine, Bench.

[Benchmark charts: Ashes Classic Average FPS and 95th Percentile, with results at IGP, Low, Medium, and High settings]
Comments

  • DrKlahn - Wednesday, May 20, 2020 - link

    My biggest issue with gaming is that these reviews rarely show anything other than low-resolution scenarios. I realize a sizable slice of the gaming community uses 1080p and that some of them are trying to hit very high frame rates. But there are also a lot of us with 1440p+ or ultrawides, and I think it gets overlooked that Intel's gaming "lead" largely evaporates for anyone not trying to hit very high frame rates at 1080p.
  • ElvenLemming - Wednesday, May 20, 2020 - link

    Honestly, I think it's ignored because it's well understood that at 1440p+ the CPU just doesn't matter very much. There's not much value in anything above 1080p for a CPU review, since the vast majority of games are going to be GPU limited. That said, plenty of other outlets include them in their reviews if you want to see a bunch of charts where the top is all within 1% of each other.
  • DrKlahn - Wednesday, May 20, 2020 - link

    I do agree with you that a lot of us understand that as resolution and detail increase, CPUs become almost irrelevant to gaming performance. However, you do see a fair few posters parroting "Intel is better for gaming" when in reality, for their use case, it isn't any better. That's why I feel these reviews (here and elsewhere) should spotlight where this difference matters. If you are a competitive CS:GO player who wants 1080p or lower with the most frames you can get, then Intel is undoubtedly better. But a person who isn't as tech savvy, who games and does some productivity tasks on a 1440p+ monitor, is only spending more money for a less efficient architecture that won't benefit them if they simply see "Intel better for gaming" and believe it applies to them.
  • shing3232 - Thursday, May 21, 2020 - link

    A 3900X or 3800X can beat the Intel 9900KF in CS:GO with PBO on, if I remember correctly.
  • silencer12 - Saturday, May 23, 2020 - link

    CS:GO is not a demanding game.
  • vanilla_gorilla - Monday, June 15, 2020 - link

    >If you are a competitive CS:GO player that wants 1080p or lower with the most frames you can get, then Intel is undoubtedly better.

    It's actually more complicated than that. Even a midrange Zen 2 CPU can hit well over 200 fps in CS:GO, so unless you have a 240 Hz monitor, it won't make any difference whether you buy Intel or AMD in that case.
  • Irata - Wednesday, May 20, 2020 - link

    Techspot shows a seven-game average, and there the avg fps / 1% min difference versus the Ryzen 3 3300X is less than 10% using a 2080 Ti.
  • CrimsonKnight - Thursday, May 21, 2020 - link

    This review's benchmarks go up to 4K/8K resolution. You have to click the thumbnails under the graphs.
  • Meteor2 - Wednesday, July 15, 2020 - link

    To be clear: Anandtech tests at low resolutions so the bottleneck is the CPU, not the GPU. A Ryzen 5 won’t bottleneck a 2080 Ti at 4K.
  • kmmatney - Wednesday, May 20, 2020 - link

    Those of us who live near a Microcenter can get the 3900X for $389, along with a $20 discount on a motherboard (and a serviceable heatsink). The Ryzen 5 (what I bought) is $159, also with a $20 motherboard discount and a decent cooler. So my effective motherboard cost was $79, and the total cost was $240 + tax, with a motherboard that can (most likely) be upgraded to Zen 3.
