Ashes of the Singularity: Escalation (DX12)

A veteran of both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games designing its Nitrous Engine around such low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides a good common ground between today's forward-looking APIs. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by shipping such a tool publicly, as part-and-parcel of the game, Oxide sets an example other developers should take note of.
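To make those DX12 techniques a little more concrete, below is a minimal, generic Direct3D 12 sketch; it is not Oxide's actual engine code, and the function and variable names (RecordAndSubmit, device, workerCount) are purely illustrative. Each worker thread records draw work into its own command list, the recorded lists are handed to the graphics queue in one batched call, and a second queue of type COMPUTE is what allows asynchronous compute work to overlap with graphics.

```cpp
// Minimal sketch of the general D3D12 pattern described above (not Oxide's code).
// Assumes `device` is an already-initialized ID3D12Device and that pipeline state
// objects and resources are handled elsewhere.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmit(ID3D12Device* device, unsigned workerCount)
{
    // One queue for graphics work, one for async compute.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Each worker thread records into its own allocator/command list,
    // since command lists are not safe for concurrent recording.
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // Draw calls for this thread's slice of the scene would be recorded
            // here -- spreading a high batch count across CPU cores.
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // Batched submission: hand every recorded list to the GPU in one call.
    std::vector<ID3D12CommandList*> submit;
    for (auto& l : lists) submit.push_back(l.Get());
    gfxQueue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());

    // Compute work submitted to computeQueue can execute alongside the graphics
    // queue above -- the "asynchronous compute" referred to in the text.
}
```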

Settings and methodology remain unchanged from its usage in the 2016 GPU suite. Of note, we are using the original Ashes Extreme graphical preset, which is equivalent to the current one with MSAA dialed down from 4x to 2x and Texture Rank (MipsToRemove in settings.ini) adjusted.

Ashes of the Singularity: Escalation - 2560x1440 - Extreme Quality

Ashes of the Singularity: Escalation - 1920x1080 - Extreme Quality

With Ashes, the RX 590's performance uplift over the RX 580 pays off, putting it ahead of its main competition, the GTX 1060 6GB. The lead is slim enough, however, that custom GTX 1060 6GB cards could easily make up the difference. Given the RX 590's price premium over the GTX 1060 6GB, the reference GeForce is a little too close for comfort.

Ashes of the Singularity: Escalation - 99th Percentile - 2560x1440 - Extreme Quality

Ashes of the Singularity: Escalation - 99th Percentile - 1920x1080 - Extreme Quality

While not particularly known as a VRAM-eater, Ashes at 1440p brings down the GTX 960 and its anemic 2GB framebuffer, though the card wouldn't be managing playable framerates here anyhow.
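For readers unfamiliar with the metric, a 99th percentile framerate is derived from the full log of per-frame times rather than an average. The sketch below shows the basic calculation; the exact method and data format used by the benchmark tool and in our charts may differ in details such as interpolation, and the function name and inputs are illustrative.

```cpp
// Minimal sketch: derive a 99th-percentile framerate from per-frame times in ms.
#include <algorithm>
#include <cstddef>
#include <vector>

double Percentile99Fps(std::vector<double> frameTimesMs)
{
    if (frameTimesMs.empty()) return 0.0;
    // Sort ascending and take the frame time that roughly 99% of frames beat,
    // i.e. only the slowest ~1% of frames take longer than this.
    std::sort(frameTimesMs.begin(), frameTimesMs.end());
    std::size_t idx = static_cast<std::size_t>(frameTimesMs.size() * 0.99);
    if (idx >= frameTimesMs.size()) idx = frameTimesMs.size() - 1;
    double slowFrameMs = frameTimesMs[idx];
    return 1000.0 / slowFrameMs;  // convert ms per frame to frames per second
}
```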

136 Comments

  • AMD#1 - Tuesday, November 20, 2018 - link

    Well, they released a card this year for the consumer market. My hope is that Navi will be a better option against NVIDIA RTX; it's good that Radeon will not support RT. Let NVIDIA first support DX12 with hardware shader units instead of hanging on to DYING DX11.
  • silverblue - Wednesday, November 21, 2018 - link

    A couple of reviews seem to be showing the Fatboy at similar power consumption (or slightly higher) levels to the 580 Nitro+ (presumably the 1450MHz version) and Red Devil Golden Sample. That's not so bad when you factor in the significantly increased clocks, but the two 580s hardly sipped power to begin with, and basic 580s don't really perform much worse for what would be much lower power consumption. AMD has a product that kind of bridges the gap between the 1060 and the 1070, but uses more power than a 1080... hardly enviable, really. The Fatboy has rather poor thermals as well if you don't ramp the fan speed up.

    The 590s we're getting are clocked aggressively on core but not on memory; what would really be interesting is a 590 clocked at 580 levels, even factory overclocked 580 levels. Would it be worth getting a 590 just to undervolt and underclock the core, make use of the extra game in the bundle (once they launch, that is), and essentially be running a more efficient 580? I'd be tempted to overclock the memory at the same time as that appears to be where the 580 is being held back, not core speed.
  • silverblue - Wednesday, November 21, 2018 - link

    I'll be fair to the Fatboy; it does have a zero RPM mode which would explain the thermals.
  • WaltC - Sunday, December 2, 2018 - link

    I just bought a Fatboy...to run in X-fire with my year-old 8GB RX-480 (1.305GHz stock)...! As I am now gaming at 3840x2160, it seemed a worthwhile alternative to dropping $500+ on a single GPU. Paid $297 @ NewEgg & got the 3-game bundle. I read one review by a guy who X-fired a 580 with a 480 without difficulty--and the performance scaled from 70%-90% better when X-Fire is supported. Wouldn't recommend buying two 590's at one time, of course, but for people who already own a 480/580, the X-Fire alternative is the most cost-effective route at present. Gaming sites seem to have forgotten about X-Fire these days, for some reason. Of course, the nVidia 1060 doesn't allow for SLI--so that might be one reason, I suppose. Still, it's kind of baffling as the X-Fire mode seems like such a no brainer. And for those titles that will not X-fire, I'll just run them all @ 1.6GHz on the 590...Until next year when AMD's next < $300 GPU launches...! Then, I may have to think again!
  • quadibloc - Friday, December 7, 2018 - link

    Oh, so Global Foundries does now have a 12 nm process? I'm glad they're doing something a little better than 14 nm at least.
