Closing Thoughts

Wrapping up our second look at Ashes of the Singularity and our third overall look at Oxide’s Nitrous engine, it’s interesting to see where things have changed and where they have stayed the same.

Thanks to the general performance optimizations made since our initial look at Ashes, the situation for multi-GPU via DirectX 12 explicit multi-adapter is both very different and very similar. On an absolute basis it’s now a lot harder to max out a multi-GPU configuration; with reasonable quality settings we’re CPU limited even up to 4K, requiring that we further increase the rendering quality. This more than anything else handily illustrates just how much performance has improved since the last beta. On the other hand it’s still the most unusual pairing – a Radeon R9 Fury X with a GeForce GTX 980 Ti – that delivers the best multi-GPU performance, which just goes to show what RTG and NVIDIA can accomplish working together.

As for the single GPU configurations, I’m not sure things as they currently stand could be any more different. NVIDIA cards have very good baseline DX11 performance in Ashes of the Singularity, but they mostly gain nothing from Ashes’ DX12 rendering path. RTG cards, on the other hand, have poorer DX11 performance, but they gain a significant amount of performance from the DX12 rendering path. In fact they gain so much performance that against traditional competitive lineups (e.g. Fury X vs. 980 Ti), the RTG cards are well in the lead, which isn’t usually the case elsewhere.

Going hand-in-hand with DX12, RTG’s cards are the only products to consistently benefit from Ashes’ improved asynchronous shading implementation. Whereas our NVIDIA cards see a very slight regression (with NVIDIA telling us that async shading is not currently enabled in their drivers), the Radeons improve in performance, especially the top-tier Fury X. This by itself isn’t wholly surprising given some of our theories about Fury X’s strengths and weaknesses, but for Ashes of the Singularity it further compounds the other DX12 performance gains for RTG.

Ultimately Ashes gives us a very interesting look at the state of DirectX 12 performance for both RTG and NVIDIA cards, though no more and no less. As we stated at the start of this article, this is beta software and performance is subject to change – not to mention the overall sample size of one game – but it is a start. For RTG this certainly lends support to their promotion of and expectations for DirectX 12, and it should be interesting to see how things shape up in March and beyond once the gold version of Ashes is released, and past that even more DirectX 12 games.


  • Kouin325 - Friday, February 26, 2016 - link

    Yes indeed, they will be patching DX12 into the game, AFTER all the PR damage from the low benchmark scores is done. Nvidia waved some cash at the publisher/dev to make it a GameWorks title, make it DX11, and to lock AMD out of making a day 1 patch.

    This was done to keep the general gaming public from learning that the Nvidia performance crown will all but disappear or worse under DX12. So they can keep selling their cards like hotcakes for another month or two.

    Also, Xbox hasn't been moved over to DX12 proper YET, but the DX11.x that the Xbox One has always used is by far closer to DX12 than DX11 for the PC. I think we'll know for sure what the game was developed for after the patch comes out. If the game gets a big performance increase after the DX12 patch, then it was developed for DX12, and NV possibly had a hand in the DX11 for PC release. If the increase is small, then it was developed for DX11.

    Reason being that getting the true performance of DX12 takes a major refactor of how assets are handled and pretty major changes to the rendering pipeline. Things that CANNOT be done in a month or two or how long this patch is taking to come out after release.

    Saying "we support DirectX 12" is fairly easy and only takes changing a few lines of code, but you won't get the performance increases that DX12 can bring.
  • Kouin325 - Friday, February 26, 2016 - link

    ugh, I think Firefox had a brainfart, sorry for the TRIPLE post.... *facepalm*
  • Gothmoth - Friday, February 26, 2016 - link

    it's a crap game anyway, so who cares?

    honestly, even if NVIDIA were 20% worse I would not buy ATI.
    not because I'm a fanboy.. but I use my GPUs for more than games, and ATI GPUs suck big time when it comes to driver stability in pro applications.
  • D. Lister - Friday, February 26, 2016 - link

    Oxide and their so-called "benchmarks" are a joke. Anyone who takes the aforementioned seriously is just another unwitting victim of AMD's typical underhanded marketing.

    https://scalibq.wordpress.com/2015/09/02/directx-1...
    "And don’t get me started on Oxide… First they had their Star Swarm benchmark, which was made only to promote Mantle (AMD sponsors them via the Gaming Evolved program). By showing that bad DX11 code is bad. Really, they show DX11 code which runs single-digit framerates on most systems, while not exactly producing world-class graphics. Why isn’t the first response of most people as sane as: “But wait, we’ve seen tons of games doing similar stuff in DX11 or even older APIs, running much faster than this. You must be doing it wrong!”?

    But here Oxide is again, in the news… This time they have another ‘benchmark’ (do these guys actually ever make any actual games?), namely “Ashes of the Singularity”.
    And, surprise surprise, again it performs like a dog on nVidia hardware. Again, in a way that doesn’t make sense at all… The figures show it is actually *slower* in DX12 than in DX11. But somehow this is spun into a DX12 hardware deficiency on nVidia’s side. Now, if the game can get a certain level of performance in DX11, clearly that is the baseline of performance that you should also get in DX12, because that is simply what the hardware is capable of, using only DX11-level features. Using the newer API, and optionally using new features should only make things faster, never slower. That’s just common sense."
  • Th-z - Saturday, February 27, 2016 - link

    “But wait, we’ve seen tons of games doing similar stuff in DX11 or even older APIs..."

    Doing similar stuff in DX11? What stuff and what games?

    "The figures show it is actually *slower* in DX12 than in DX11. But somehow this is spun into a DX12 hardware deficiency on nVidia’s side."

    Which figure?

    This is AnandTech; you need to be more specific and provide solid evidence to back up your claims in order to avoid sounding like an astroturfer.
  • D. Lister - Saturday, February 27, 2016 - link

    You see my post? You see that there is this underlined text in blue? Well my friend, it is called a URL, which is an acronym for "Uniform Resource Locator", long story short it is this internet thingy that you go clickity-clickity with your mouse and it opens another page, where you can find the rest of the information.

    Don't worry, the process of opening a new webpage by using a URL may APPEAR quite daunting at first, but with very little practice you could be clicking away like a pro. This is after all "The AnandTech", and everybody is here to help. Heck, who knows if there are more like you out there, I might even make a video tutorial - "Open new webpages in 3 easy steps", or something.

    PS: Another pro tip: there is no such thing as "solid evidence" outside of a court of law. On the internet, you have information resources and reference material, and you have to use your own first-hand knowledge, experience, and common sense to differentiate right from wrong.
  • Th-z - Sunday, May 29, 2016 - link

    Your blabbering is as useful as your link. I have a pro tip for you: you gave yourself away.
  • EugenM - Tuesday, June 7, 2016 - link

    @Th-z Dont feed the troll.
  • GeneralTom - Saturday, February 27, 2016 - link

    I hope Metal will be supported, too.
  • HollyDOL - Monday, February 29, 2016 - link

    Hm, from the screenshots posted I honestly can't see why there would be a need for Dx12 to run with such "low performance" even on the most elite cards. While I give these guys credit for having the guts to go and develop in a completely new API, the graphics look more like an early Dx9 game.
    Just a note: this opinion is based on screenshots, not an actual live render, but still, from what I see there I'd expect FPS hitting 120+ with Dx11...
