Ashes of the Singularity: Escalation (DX12)

A veteran of our 2016 game list, Ashes of the Singularity: Escalation continues to be the DirectX 12 trailblazer, with developer Oxide Games having designed its Nitrous Engine around such low-level APIs. Ashes also remains fresh in several ways: Escalation was released as a standalone expansion in November 2016 and was eventually merged into the base game in February 2017, while August 2017's v2.4 update brought Vulkan support. Of all the games in our benchmark suite, this is the one that makes the best use of DirectX 12's various features, from asynchronous compute to multi-threaded work submission and high batch counts. While what we see here can't be extrapolated to all DirectX 12 games, it gives us a very interesting look at what we might expect in the future.
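
To illustrate what "multi-threaded work submission" looks like at the API level, here is a minimal, hypothetical D3D12 sketch (not Oxide's engine code; the function name and structure are our own): worker threads each record their own command list in parallel, and the main thread then hands the whole batch to the GPU in a single ExecuteCommandLists call.

```cpp
// Hypothetical sketch of multi-threaded command list recording in D3D12.
// Assumes a valid ID3D12Device and ID3D12CommandQueue already exist.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmitInParallel(ID3D12Device* device,
                               ID3D12CommandQueue* queue,
                               unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);

    // One allocator + command list per worker: D3D12 lets command lists be
    // recorded concurrently as long as each thread uses its own allocator.
    for (unsigned i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel; a real engine would issue thousands of draws here,
    // which is where Ashes' high batch counts come from.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < workerCount; ++i) {
        workers.emplace_back([&, i] {
            // ... set pipeline state, root signature, issue draw calls ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submit every recorded list to the GPU in one batched call.
    std::vector<ID3D12CommandList*> raw(workerCount);
    for (unsigned i = 0; i < workerCount; ++i) raw[i] = lists[i].Get();
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```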

Settings and methodology remain identical to their usage in the 2016 GPU suite. Of note, the latest version of Ashes has changed the Extreme graphical preset, dialing MSAA down from 4x to 2x and adjusting Texture Rank (MipsToRemove in settings.ini).
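
For reference, these preset tweaks live as plain-text keys in the game's settings.ini. The excerpt below is purely illustrative: MipsToRemove is the Texture Rank key the game actually uses, but the section header, the MSAA key name, and the values shown are assumptions rather than the game's exact schema.

```ini
; Hypothetical settings.ini excerpt -- key names other than MipsToRemove,
; and all values, are placeholders for illustration.
[Graphics]
MSAA=2            ; Extreme preset now defaults to 2x MSAA (down from 4x)
MipsToRemove=1    ; "Texture Rank": higher values discard more mip levels
```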

[Charts: Ashes of the Singularity: Escalation - Extreme Quality - average and 99th percentile framerates at 3840x2160, 2560x1440, and 1920x1080]

Here, the GTX 1070 Ti cleanly beats Vega 56, and even surpasses Vega 64 at sub-4K resolutions. While Ashes is normally a game that favors AMD hardware, driver improvements over the past few months seem to have handed the lead to NVIDIA.

Comments

  • BrokenCrayons - Thursday, November 2, 2017 - link

    This review was a really good read. I also like that the game screenshots were dropped from it since they didn't exactly add much, but do eat a little of my data plan when I'm reading from a mobile device.

    As for the 1070 Ti, agreed, it's priced a bit too high. However, I think most of the current-gen GPUs are pushing the price envelope right now. Except maybe the 1030, of course, which has a reasonable MSRP and doesn't require a dual-slot cooler. That's really the only graphics card outside of an iGPU I'd seriously consider if I were in the market at the moment, but then again I'm not playing a lot of games on a PC because I have a console and a phone for that sort of thing.
  • Communism - Thursday, November 2, 2017 - link

    Literally the same price as a 1070 non-Ti was a week ago.

    Those cards sold so well that retailers are still gouging them to this day.
  • Communism - Thursday, November 2, 2017 - link

    And I should mention that the only reason retail prices of the 1070, Vega 56, and Vega 64 went down is the launch of the 1070 Ti.
  • timecop1818 - Thursday, November 2, 2017 - link

    Still got that fuckin' DVI shit in 2017.
  • DanNeely - Thursday, November 2, 2017 - link

    Lack of a good way to run dual link DVI displays via HDMI/DP is probably keeping it around longer than originally intended. This includes both relatively old 2560x1600 displays that predate DP or HDMI 1.4 and thus could only do DL-DVI, and cheap 'Korean' 2560x1440 monitors from 2 or 3 years ago. The basic HDMI/DP-DVI adapters are single link and max out at 1920x1200. A few claim 2560x1600 by overclocking the data rate by 100% to stuff it down a single link's worth of wires; this is mostly useless though, since aside from HDMI 1.4-capable displays (which don't need it), virtually no DVI monitors can actually take a signal that fast. Active DP-DLDVI adapters can theoretically do it for $70-100, but they all came out buggy to one degree or another and sales were apparently too low to justify a new generation of hardware that fixed the issues.
  • Nate Oh - Saturday, November 4, 2017 - link

    This is actually precisely why I don't mind DVI too much, because I have and still use a 1-DVI-input-only Korean 1440p A- monitor from 3 years ago, overclocked to 96Hz. DVI probably needs to go away at some point soon, but maybe not too soon :)
  • ddferrari - Friday, November 3, 2017 - link

    So, that DVI port really ruins everything for ya? What are you, 14??

    There are tons of overclockable 1440p Korean monitors out there that only have one input: DVI. Adding a DVI port doesn't increase cost, slow down performance, or increase heat levels, so what's your imaginary problem again?
  • Notmyusualid - Sunday, November 5, 2017 - link

    @ ddferrari

    What are you - ddriver incarnate?

    I'm older than 14, and I wish the DVI port wasn't there, as it is work to strip them off when I make my GPUs into single-slot water-cooled versions. Removing / modifying the bracket is one thing, but pulling out those DVI ports is another.
  • Silma - Thursday, November 2, 2017 - link

    How can you compare the Vega 56 to the GTX 1070 when it's 7 dB noisier and consumes up to 78 watts more?
  • sach1137 - Thursday, November 2, 2017 - link

    Because the MSRPs of both cards are the same. Vega 56 beats the 1070 in almost all games. Yes, it consumes more power and is noisier too, but for some people that doesn't matter: it gives 10-15% more performance than the 1070. When overclocked, you can extract more from Vega 56 too.
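
As an editorial footnote to the DVI discussion in DanNeely's comment above, a rough pixel-clock calculation shows why single-link adapters stop at 1920x1200. The sketch below uses approximate CVT reduced-blanking totals, which are not any particular monitor's exact timings:

```cpp
// Back-of-the-envelope DVI bandwidth check. Blanking totals are approximate
// CVT-RB figures; real monitors may use slightly different timings.
#include <cstdio>

int main() {
    const double kSingleLinkMHz = 165.0;      // single-link TMDS pixel clock cap
    const double kDualLinkMHz   = 2 * 165.0;  // dual link doubles the data pairs

    struct Mode { const char* name; int totalH, totalV, hz; };
    const Mode modes[] = {
        {"1920x1200 @ 60 Hz", 2080, 1235, 60},
        {"2560x1440 @ 60 Hz", 2720, 1481, 60},
        {"2560x1600 @ 60 Hz", 2720, 1646, 60},
    };

    for (const Mode& m : modes) {
        double mhz = m.totalH * double(m.totalV) * m.hz / 1e6;  // required pixel clock
        std::printf("%s needs ~%.0f MHz: %s\n", m.name, mhz,
                    mhz <= kSingleLinkMHz ? "fits single-link DVI"
                    : mhz <= kDualLinkMHz ? "needs dual-link DVI"
                                          : "beyond dual-link spec");
    }
    return 0;
}
```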
