Ashes of the Singularity: Escalation (DX12)

A veteran of both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games designing its Nitrous Engine around low-level APIs from the start. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support as well, Ashes offers a good common ground between today's forward-looking APIs. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by shipping such a tool publicly as part-and-parcel of the game, it sets an example that other developers should take note of.
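
To illustrate the kind of analysis that per-frame output enables, below is a minimal Python sketch that derives an average framerate and a 99th percentile framerate from a list of frame times. The file name frametimes.csv and its one-value-per-line format are assumptions for the example, not the benchmark's actual output format.

```python
# Sketch: derive average and 99th percentile framerates from per-frame data.
# Assumes a hypothetical "frametimes.csv" with one frame time in milliseconds
# per line; the benchmark's real output format may differ.
import statistics

def load_frametimes_ms(path):
    """Read frame times in milliseconds, one per line, skipping blank lines."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(frametimes_ms):
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # The 99th percentile framerate corresponds to a near-worst-case frame
    # time: 99% of frames render at least this fast.
    p99_ms = sorted(frametimes_ms)[int(0.99 * (len(frametimes_ms) - 1))]
    return avg_fps, 1000.0 / p99_ms

if __name__ == "__main__":
    avg, p99 = summarize(load_frametimes_ms("frametimes.csv"))
    print(f"Average: {avg:.1f} fps, 99th percentile: {p99:.1f} fps")
```

The 99th percentile figure is what the charts below report alongside averages: it reflects consistency rather than peak throughput.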

Settings and methodology remain unchanged from the game's usage in our 2016 GPU suite. Of note, we are using the vanilla Ashes Classic Extreme graphical preset, which differs from the current Extreme preset in that MSAA is dialed down from 4x to 2x and Texture Rank (MipsToRemove in settings.ini) is adjusted. For today, we are also using the vanilla High and Standard presets.
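
For readers recreating the classic preset by hand, the sketch below shows one way such settings.ini tweaks could be scripted. Only the MipsToRemove key is confirmed above; the MSAA key name, both values, and the file location are placeholders, so check your own settings.ini before using anything like this.

```python
# Sketch: tweak keys in Ashes' settings.ini in place. Only MipsToRemove is
# named in the article; the "MSAA" key and both values are assumed placeholders.
from pathlib import Path

SETTINGS = Path("settings.ini")   # adjust to the game's actual config location
CHANGES = {
    "MipsToRemove": "1",          # Texture Rank adjustment (assumed value)
    "MSAA": "2",                  # hypothetical key name for the 2x MSAA setting
}

lines = SETTINGS.read_text().splitlines()
for i, line in enumerate(lines):
    key = line.split("=", 1)[0].strip()
    if key in CHANGES:
        lines[i] = f"{key}={CHANGES[key]}"
SETTINGS.write_text("\n".join(lines) + "\n")
```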

[Benchmark charts: Ashes of the Singularity: Escalation at 1920x1080, average and 99th percentile framerates at the Extreme, High, and Standard quality presets.]

With Ashes, the GTX 1650 continues the trend: solidly slower than the RX 570, yet clearly a step up from its 2GB predecessors.

Comments

  • philehidiot - Friday, May 3, 2019 - link

    Over here, it's quite routine for people to consider the efficiency cost of using AC in a car and whether it's more sensible to open the window... If you had a choice between a GTX1080 and a Vega64 which perform nearly the same, and assume they cost nearly the same, then you'd take into account that one requires a small nuclear reactor to run whilst the other probably sips less energy than your current card. Also, some of us are on this thing called a budget. A $50 saving is a week's food shopping.
  • JoeyJoJo123 - Friday, May 3, 2019 - link

    Except your comment is exactly in line with what I said:
    "Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650"

    I'm not saying power use of the GPU is irrelevant, I'm saying performance/price is ultimately more important. The RX 570 is not only significantly cheaper, but it outperforms the GTX 1650 in most scenarios. Yes, the RX 570 does so by consuming more power, but it'd take 2 or so years of power bills (at least going by the average American electricity bill) to break even on cost with the GTX 1650, and even at that point, where the cost of ownership is equivalent, the RX 570 will still have provided 2 years of consistently better performance, and will continue to offer better performance. (There's a rough sketch of this break-even math below the thread.)

    Absolutely, a GTX1080 is a smarter buy compared to the Vega64 given the power consumption, but that's because power consumption was the tie-breaker. The comparison wouldn't be as favorable for the GTX1080 if it cost 30% more than the Vega64, offered similar performance, but came with the long-term promise of ~eventually~ paying for the upfront difference in cost with a reduction in power cost.

    Again, the vast majority of users on the market are looking for the best performance/price, and the GTX1650 has priced itself out of the market it should be competing in.
  • Oxford Guy - Saturday, May 4, 2019 - link

    "it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance."

    This.

    Plus, if people are so worried about power consumption maybe they should get some solar panels.
  • Yojimbo - Sunday, May 5, 2019 - link

    Why in the world would you get solar panels? That would only increase the cost even more!
  • Karmena - Tuesday, May 7, 2019 - link

    So, you multiplied it once; why not multiply that value again and make it $100?
  • Gigaplex - Sunday, May 5, 2019 - link

    Kids living with their parents generally don't care about the power bill.
  • gglaw - Sunday, May 5, 2019 - link

    Wrong on so many levels. If you find the city with the highest electricity cost in the US, plug in the most die-hard gamer who plays only new games on max settings with the GPU at 100% load at all times, and assume he plays more hours than most people work, you might get close to those numbers. The sad kid who fits that scenario games hard enough that he would never choose such a bad card that is significantly slower than last gen's budget performers (RX 570 and GTX 1060 3GB). Kids in this scenario would not be calculating the nickels and dimes they're saving here and there - they'd be getting the best card in their NOW budget without subtracting the quarter or so they might get back a week. You're trying to create a scenario that just doesn't exist. Super energy-conscious people logging every penny of juice they spend don't game dozens of hours a week, and would be nit-picky enough that they'd probably find settings to save that extra 2 cents a week, so they wouldn't even be running their GPU at 100% load.
  • PeachNCream - Friday, May 3, 2019 - link

    Total cost of ownership is a significant factor in any buying decision. One should consider not only the electrical cost of a GPU, but also indirect expenses such as added air conditioning needs (or reduced heating costs offset by the card's heat output), along with the cost of upgrading at a later date if future performance proves unsatisfactory. Failing to consider those and other factors ignores important recurring expenses.
  • Geranium - Saturday, May 4, 2019 - link

    Then people need to buy a Ryzen 7 2700X rather than an i9-9900K. The 9900K uses more power and runs hotter, so it needs a more powerful cooler, and a more powerful cooler draws more power than what a 2700X needs.
  • nevcairiel - Saturday, May 4, 2019 - link

    Not everyone puts as much value on cost as others. When discussing a budget product, it absolutely makes sense to consider it, since you possibly wouldn't buy such a GPU if money were no object.

    But if someone buys a high-end CPU, the interests shift drastically, and as such, your logic makes no sense anymore. Plenty of people buy the fastest not because it's cheap, but because it's the absolute fastest.
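
As a rough check on the break-even estimate debated in the thread above, here is a minimal back-of-the-envelope sketch. The figures below (a ~75 W power gap, a $30 price gap, 20 hours of gaming per week, and $0.13/kWh) are assumed round numbers for illustration, not measured values from this review.

```python
# Back-of-the-envelope break-even sketch: how long the cheaper-but-hungrier
# card takes to give back its price advantage in electricity costs.
# All inputs are assumed round numbers for illustration only.
PRICE_GAP_USD = 30.0    # assumed upfront saving of the RX 570 over the GTX 1650
POWER_GAP_W = 75.0      # assumed extra power draw under gaming load
HOURS_PER_WEEK = 20.0   # assumed gaming time
PRICE_PER_KWH = 0.13    # assumed average US electricity rate in USD

extra_kwh_per_year = POWER_GAP_W / 1000.0 * HOURS_PER_WEEK * 52
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH
years_to_break_even = PRICE_GAP_USD / extra_cost_per_year

print(f"Extra electricity: ~${extra_cost_per_year:.2f} per year")
print(f"Break-even after: ~{years_to_break_even:.1f} years")
```

With these particular assumptions the break-even lands around three years; more gaming hours, pricier electricity, or a wider price gap shift it substantially in either direction, which is exactly what the thread is arguing about.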
