Ashes of the Singularity: Escalation (DX12)

A veteran from both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games tailoring and designing the Nitrous Engine around such low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides a good common ground between the forward-looking APIs of today. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by offering such a tool publicly and as part-and-parcel of the game, it's an example that other developers should take note of.
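Part of that versatility is that the per-frame output lends itself to scripted analysis. As a minimal sketch only (the file name and one-frame-time-per-line layout are assumptions for illustration, not the benchmark's documented export format), this is the kind of post-processing that turns raw frame times into the average and 99th percentile figures charted below:

```python
# Minimal sketch: reduce a frame-time log to average and 99th percentile
# frame rates. The file name and single-column layout are assumed here,
# not the benchmark's documented output format.
import statistics

def summarize(path="ashes_frametimes.csv"):
    with open(path) as f:
        # Assumed format: one frame time in milliseconds per line.
        frame_times_ms = [float(line) for line in f if line.strip()]

    avg_fps = 1000.0 / statistics.mean(frame_times_ms)

    # The "99th percentile" figure is the frame rate implied by the
    # 99th percentile (near-worst-case) frame time.
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]
    p99_fps = 1000.0 / p99_ms
    return avg_fps, p99_fps

if __name__ == "__main__":
    avg, p99 = summarize()
    print(f"Average: {avg:.1f} fps | 99th percentile: {p99:.1f} fps")
```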

Settings and methodology remain unchanged from the game's usage in the 2016 GPU suite. To note, we are utilizing the original Ashes Extreme graphical preset, which differs from the current Extreme preset in that MSAA is dialed down from 4x to 2x and Texture Rank (MipsToRemove in settings.ini) is adjusted.
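For anyone recreating that preset by hand, the relevant tweak is the MipsToRemove value in settings.ini. Below is a hedged sketch of automating that edit; only the key name and file name come from the text above, while the path placeholder, the example value, and the helper itself are ours for illustration, so verify against your own install before using anything like it:

```python
# Hedged sketch: rewrite the MipsToRemove line in an Ashes settings.ini.
# The example path is a placeholder; the key name comes from the article text.
from pathlib import Path

def set_mips_to_remove(ini_path, value):
    path = Path(ini_path)
    lines = path.read_text().splitlines()
    updated = []
    for line in lines:
        if line.strip().lower().startswith("mipstoremove"):
            updated.append(f"MipsToRemove={value}")  # replace the existing entry
        else:
            updated.append(line)
    # Note: if the key is absent, the file is left unchanged.
    path.write_text("\n".join(updated) + "\n")

# Example usage (placeholder path and value):
# set_mips_to_remove(r"C:\Users\<you>\Documents\My Games\...\settings.ini", 1)
```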

We've updated some of the benchmark automation and data processing steps, so results at 1080p may differ from previously published data.

[Chart] Ashes of the Singularity: Escalation - 2560x1440 - Extreme Quality

[Chart] Ashes of the Singularity: Escalation - 1920x1080 - Extreme Quality

[Chart] Ashes of the Singularity: Escalation - 99th Percentile - 2560x1440 - Extreme Quality

[Chart] Ashes of the Singularity: Escalation - 99th Percentile - 1920x1080 - Extreme Quality

Interestingly, Ashes shows the smallest improvement in the suite for the GTX 1660 Ti over the GTX 1060 6GB. Similarly, the GTX 1660 Ti lags behind the GTX 1070, which itself is already close to the 1660 Ti's older Turing sibling, the RTX 2060. With the GTX 1070 FE and RX Vega 56 neck-and-neck, the GTX 1660 Ti ends up splitting the gap between the RX 590 and RX Vega 56.

Comments

  • C'DaleRider - Friday, February 22, 2019 - link

    Good read. Thx.
  • Opencg - Saturday, February 23, 2019 - link

    GTX at RTX prices. Not really a fan of that graph at the end. I mean, 1080 Tis were about $500 about half a year ago; the perf/dollar is surely less than -7%, more like -30%, especially since the quoted 36% perf gain is inflated as hell. Double the price and +20% perf is not -7%, Anand.
  • eddman - Saturday, February 23, 2019 - link

    They are comparing them based on their launch MSRP, which is fair.

    Actually, it seems they used the cut price of $500 for the 1080 instead of the $600 launch MSRP. The perf/$ increases by ~15% if we use the latter, although it's still a pathetic generational improvement, considering the 1080's perf/$ was ~55% better than the 980's.
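Stepping back from the specific cards, the arithmetic being argued over in this thread is straightforward: relative perf-per-dollar is the performance ratio divided by the price ratio, so the answer scales directly with whatever price you assume for the older card. A quick sketch, using the figures quoted by the commenters plus a purely hypothetical new-card price:

```python
# Sketch of the perf-per-dollar arithmetic in the thread above. The +36%
# uplift and the $500/$600 baselines are the commenters' figures; the $999
# new-card price is a hypothetical placeholder, not review data.
def rel_perf_per_dollar(perf_ratio, price_new, price_old):
    """Perf/$ of the new card relative to the old one (1.0 = parity)."""
    return perf_ratio / (price_new / price_old)

for old_price in (500, 600):
    r = rel_perf_per_dollar(1.36, price_new=999, price_old=old_price)
    print(f"old card at ${old_price}: perf/$ = {r:.2f}x ({r - 1:+.0%})")

# Moving the baseline from $500 to $600 scales the result by 600/500 = 1.2,
# whatever the new card actually costs.
```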
  • close - Saturday, February 23, 2019 - link

    In all fairness, when comparing products from two different generations that are both still on the market, you should compare both launch price and current price. The purpose is to know which is the better choice these days. Knowing the historical launch prices and trends between generations is good for consistency, but very few readers care about it beyond curiosity and theoretical comparisons.
  • jjj - Friday, February 22, 2019 - link

    The 1060 has been in retail for 2.5 years, so the perf gains offered here are a lot less than what both Nvidia and AMD need to offer.
    They are pushing prices up and up, but that's not a long-term strategy.

    Then again, Nvidia doesn't care much about this market; they are shifting to servers, auto, and cloud gaming. Five years from now, they can afford to sell nothing in PC, unlike AMD and Intel.
  • jjj - Friday, February 22, 2019 - link

    A small correction here: there is no perf gain here at all in terms of perf per dollar.
  • D. Lister - Friday, February 22, 2019 - link

    Did you actually read the article before commenting on it? It is right there on the last page: a 21% increase in performance/dollar, which, together with the very decent gain in performance/watt, suggests the company is anything but resting on its laurels. Unlike another company, which has been brute-forcing an architecture that is more than a decade old and squandering its intellectual resources on designing budget chips for consoles. :P
  • shabby - Friday, February 22, 2019 - link

    We didn't wait 2.5 years for such a meager performance increase. Architectural performance increases were much higher before Turing; Nvidia is milking us, can't you see?
  • Smell This - Friday, February 22, 2019 - link

    DING!
    I know it's my own bias, but the branding looks like a typical, ongoing 'bait-and-switch' scam whereby nVidia moves the goal posts on a whim and adds yet another $100 to the retail price (for the last 2 generations?). For those fans who spent beeg-buckeroos on a GTX 1070 (or even a 1060 6GB), it's The Way You're Meant to Be 'Ewed-Scrayed.
  • haukionkannel - Saturday, February 23, 2019 - link

    Do you remember how much CPUs used to improve from generation to generation... 3-5%...
    That was when there was no competition. Now that there is competition, we see maybe 15% increases between generations, or less. Welcome to the future of GPUs: 3-5% increases between generations without competition, maybe 15% or less with it. The good point is that you can keep the same GPU for 6 years and have no need to upgrade and lose money.
