Gaming: Ashes Classic (DX12)

Seen as the holy child of DirectX12, Ashes of the Singularity (AoTS, or just Ashes) has been the first title to actively explore as many of the DirectX12 features as it possibly can. Stardock, the developer behind the Nitrous engine which powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide open shots and concentrated battles. With DirectX12 at the helm, the ability to issue more draw calls per second allows the engine to handle substantial unit depth and effects that other RTS titles had to achieve by combining draw calls, an approach that ultimately made some combined unit structures very rigid.
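As a rough sketch of what that difference means at the API level (illustrative code only, not the Nitrous engine's actual renderer; the UnitDrawArgs struct and function names are our own), a DX12 renderer can afford to record one draw per unit where a DX11-era engine would collapse whole groups into a single combined draw:

```cpp
// Illustrative sketch, not engine code: per-unit draws vs. one combined draw.
#include <d3d12.h>
#include <vector>

struct UnitDrawArgs {
    UINT indexCount;   // index count for this unit's mesh
    UINT startIndex;   // offset into the shared index buffer
    INT  baseVertex;   // offset into the shared vertex buffer
};

// DX12-style path: issue one draw call per unit. With DX12's low per-call
// overhead this can be done tens of thousands of times per frame, so every
// unit keeps its own geometry and state.
void RecordPerUnitDraws(ID3D12GraphicsCommandList* cmdList,
                        const std::vector<UnitDrawArgs>& units)
{
    for (const UnitDrawArgs& u : units) {
        cmdList->DrawIndexedInstanced(u.indexCount, 1,
                                      u.startIndex, u.baseVertex, 0);
    }
}

// DX11-style alternative: one combined (instanced) draw for a whole block of
// identical units, which saves API calls but forces the group to share a mesh
// and state -- the rigidity described above.
void RecordCombinedDraw(ID3D12GraphicsCommandList* cmdList,
                        const UnitDrawArgs& sharedMesh, UINT unitCount)
{
    cmdList->DrawIndexedInstanced(sharedMesh.indexCount, unitCount,
                                  sharedMesh.startIndex, sharedMesh.baseVertex, 0);
}
```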

Stardock clearly understands the importance of an in-game benchmark, and ensured that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark runs a four-minute battle with a fixed seed and a variety of camera shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. This version is easier to automate, as it has no splash screen, yet still offers strong visual fidelity to test.

 

Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, from Very Low to Extreme: we run our benchmarks at the above settings, and take the frame-time output for our average and percentile numbers.
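For readers curious how the frame-time log becomes the two figures we report, the arithmetic is simple; below is a minimal sketch (assuming a plain text file with one frame time in milliseconds per line, which is a simplification of the benchmark's actual output):

```cpp
// Minimal sketch: compute average FPS and the 95th-percentile frame rate
// from a list of frame times. Input format (one millisecond value per line)
// is an assumption, not the benchmark's real output layout.
#include <algorithm>
#include <cmath>
#include <fstream>
#include <iostream>
#include <vector>

int main(int argc, char** argv)
{
    if (argc < 2) {
        std::cerr << "usage: " << argv[0] << " frametimes.txt\n";
        return 1;
    }

    std::ifstream in(argv[1]);
    std::vector<double> ms;                     // frame times in milliseconds
    for (double v; in >> v; ) ms.push_back(v);
    if (ms.empty()) { std::cerr << "no frame times read\n"; return 1; }

    double totalMs = 0.0;
    for (double v : ms) totalMs += v;
    double avgFps = 1000.0 * ms.size() / totalMs;          // average FPS

    // 95th-percentile frame time (the slow tail), expressed as FPS.
    std::vector<double> sorted = ms;
    std::sort(sorted.begin(), sorted.end());
    size_t idx = static_cast<size_t>(std::ceil(0.95 * sorted.size())) - 1;
    double p95Fps = 1000.0 / sorted[idx];

    std::cout << "Average FPS:     " << avgFps << "\n";
    std::cout << "95th Percentile: " << p95Fps << " FPS\n";
    return 0;
}
```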

 

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Ashes Classic Average FPS and 95th Percentile at our IGP, Low, Medium, and High settings]
Comments

  • catavalon21 - Wednesday, May 20, 2020 - link

    The ability to edit (or ^Z) would be most welcome, trust me.
  • eastcoast_pete - Wednesday, May 20, 2020 - link

    Isn't that Skylake running a bit dry by now? But, seriously, Intel really risks losing a lot of market share in future years by selling these "classics" at high prices, and that is if one can get one in the first place.
    Curious: how many commercial customers buy Intel desktops just because they have iGPUs, but want more CPU oomph than the 3200G has? Is that why Intel still dominates the OEM desktop market?
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Intel dominates the OEM market through intimidation and threats of retribution... They were literally convicted of bribing OEMs to NOT use AMD CPUs all throughout the 2000s in several courts around the world. The trials uncovered emails between Intel executives that stated, and I quote, "Dell is the best friend money can buy"... The proof is in the fact that the Ryzen 4000 mobile CPUs are the best mobile chips offered right now, but Dell only puts them in low-end laptops. Why? Because Intel is probably giving huge financial incentives to bar AMD from premium designs to perpetuate the myth that AMD isn't a premium brand.
  • Retycint - Wednesday, May 20, 2020 - link

    Do keep in mind that these are baseless speculations, based on something that happened 2 decades ago. Both Intel and AMD have changed since then (new engineering team, new management etc) and there has been no evidence of Intel providing incentives to cripple AMD systems. Go take your conspiracy elsewhere.

    And before you inevitably accuse me of being an Intel shill, this isn't about Intel or AMD, it's about facts to support your claim, of which there have been none
  • Irata - Wednesday, May 20, 2020 - link

    Baseless speculation? Financial horsepower, MDF and meet the comp funds are current and no secret.

    Why do you think there are no Ryzen 4000 laptops with a GPU above a 2060?
  • Spunjji - Tuesday, May 26, 2020 - link

    Not entirely baseless, as they made two distinct claims. I've been a party to how Intel's "Marketing Development Funds" work - and work it does, at all levels from OEM to reseller to retailer. These days they don't explicitly punish anyone for not buying AMD - they simply tie rebates that will improve the profit margins on a product to specific quantities of those products being sold. It's "nobody's fault" if those quantities happen to make the sale of an AMD product by a given retailer or reseller distinctly unlikely.

    As for incentivizing bad *builds* of AMD systems, though, I'm not so sure. Intel clearly do a lot of work building reference platforms, and the economics of doing integration testing for a new vendor is not trivial. Honestly though, it's hard to tell how we *would* know if this were going on, because it would absolutely be made to look innocent - just like last time.
  • brantron - Wednesday, May 20, 2020 - link

    "literally convicted of bribing"

    1) No. That's not what "literally" means.
    2) No. No one was even *charged* with a crime, much less convicted.
    3) No. It wasn't about bribery.

    The reason Athlon 64s weren't ubiquitous way back when is the same reason the 4000 APUs aren't today - there aren't enough to go around.

    If your post were to be rephrased without hyperbole, baseless accusations, and whataboutism unrelated to the topic of this article, it would read something like this:

    "6 months after AMD's announcement of Renoir, the number of 4000 APUs sold for desktops is literally zero (see how that works?) because TSMC is still slammed."
  • WaWaThreeFIVbroS - Thursday, May 21, 2020 - link

    Your ignorance is amusing.
    It is technically bribery.

    https://www.extremetech.com/computing/184323-intel...
  • Spunjji - Tuesday, May 26, 2020 - link

    First 3 points: accurate, if not entirely on-topic. Nobody was charged with a crime, but Intel sure were fined a lot for collusion.

    Which gets to the 4th point: again, accurate, but not entirely relevant. AMD were definitely not able to match Intel for manufacturing, which is why they couldn't have beaten Intel out of the market entirely, but that was barely related to why they weren't getting into Dell systems. See the aforementioned proven-and-fined-for collusion.
  • drothgery - Friday, May 22, 2020 - link

    Or because premium designs take longer when the new chip isn't just another respin of the same thing, and AMD hadn't produced a viable high-end notebook chip in well over a decade so it made sense to wait and see if Ryzen 4000 was any good rather than designing in advance?
