First Thoughts

Wrapping up our first look at Ashes of the Singularity and DirectX 12 Explicit Multi-Adapter: when Microsoft first unveiled the technology back at BUILD 2015, I figured it would only be a matter of time until someone put together a game utilizing it. After all, Epic and Square Enix already had their tech demos up and running. However, with the DirectX 12 ecosystem – games and drivers alike – still coming together here in the final months of 2015, I wasn’t expecting something quite this soon.

As it stands the Ashes of the Singularity multi-GPU tech demo is just that, a tech demo for a game that itself is only in Alpha testing. There are still optimizations to be made and numerous bugs to be squashed. But despite all of that, seeing AMD and NVIDIA video cards working together to render a game is damn impressive.

Seeing as this build of Ashes is a tech demo, I’m hesitant to read too much into the precise benchmark numbers we’re seeing. That said, the fact that the fastest multi-GPU setup was a mixed AMD/NVIDIA GPU setup was something I wasn’t expecting, and it definitely makes things all the more interesting. DirectX 11 games are going to be around for a while longer yet, so we’re likely still some time away from a mixed GPU gaming setup being truly viable, but it will be interesting to see just what Oxide and other developers can pull off with explicit multi-adapter as they become more familiar with the technology and implement more advanced rendering modes.

Meanwhile it’s interesting to note just how far the industry as a whole has come since 2005 or even 2010. GPU architectures have become increasingly similar and tighter API standards have greatly curtailed the number of implementation differences that would prevent interoperability. And with Explicit Multi-Adapter, Microsoft and the GPU vendors have laid down a solid path for allowing game developers to finally tap the performance of multiple GPUs in a system, both integrated and discrete.
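To make concrete what the explicit API actually hands to developers: the starting point for unlinked explicit multi-adapter is simply enumerating every GPU in the system – integrated and discrete alike, from any vendor – and creating an independent Direct3D 12 device on each. A minimal C++ sketch of that first step (the helper name and the skip-software-adapters policy are illustrative choices, not anything from Oxide's code):

```cpp
// Sketch: enumerate all hardware adapters via DXGI and create one D3D12
// device per adapter -- the foundation of unlinked explicit multi-adapter.
// Windows-only; link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP and other software rasterizers

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one device per usable GPU
    }
    return devices;
}
```

From there, everything else – dividing the frame between devices, copying intermediate results across adapters, and synchronizing the two command queues – is the application's responsibility, which is exactly why the burden of multi-GPU support now shifts toward game developers.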

The timing couldn’t be any better either. As integrated GPUs have consumed the low-end GPU market, with both CPU vendors devoting more die space than ever to their respective integrated GPUs, using a discrete GPU leaves an increasingly large amount of silicon unused in the modern gaming system. Explicit multi-adapter in turn isn’t a silver bullet for that problem, but it is a means of finally putting the integrated GPU to good use even when it’s not a system’s primary GPU.

However with that said, it’s important to note that what happens from here is ultimately more in the hands of game developers than hardware developers. Given the nature of the explicit API, it’s now the game developers that have to do most of the legwork on implementing multi-GPU, and I’m left to wonder how many of them are up to the challenge. Hardware developers have an obvious interest in promoting and developing multi-GPU technology in order to sell more GPUs – which is how we got SLI and Crossfire in the first place – but software developers don’t have that same incentive.

Ultimately as gamers all we can do is take a wait-and-see approach to the whole matter. But as DirectX 12 game development ramps up, I am cautiously optimistic that positive experiences like Ashes will help encourage other developers to plan for multi-adapter support as well.


  • mosu - Thursday, October 29, 2015 - link

    Did you ever own or touch an Iris HD 6000? Or at least know someone who did?
  • wiak - Friday, October 30, 2015 - link

    eDRAM...
    if AMD goes HBM2 like they did in the past with DDR3 sideport memory

    just a thought:
    AMD Zen 4-8 core with Radeon (2048+ shaders, 2 or 4GB HBM2, either as a slot on the motherboard or on-die like Fury)

    i think i read somewhere there will be a single socket for APUs and CPUs,
    so the AMD lineup can be a Zen CPU with 8-16 cores for performance systems and a Zen APU with 4-8 cores, 2048+ shaders and HBM2 for mainstream/laptop computers
  • Michael Bay - Thursday, October 29, 2015 - link

    If it actually could, we would be able to buy it. No such luck.
  • Revdarian - Thursday, October 29, 2015 - link

    Well, it currently has two offerings: one called the Xbox One, and the other, more powerful one called the PlayStation 4.

    Those are technically APUs, developed by AMD, and can be bought at the moment. Just saying, it is possible.
  • Midwayman - Monday, October 26, 2015 - link

    Seems like it would be great to do post effects and free up the main gpu to work on rendering.
  • Alexvrb - Monday, October 26, 2015 - link

    Agreed, as far as dGPU and iGPU cooperation goes I think Epic is on to something there. Free 10% performance boost? Why not. Now for dGPU + dGPU modes, I am not sold on the idea of unlinked mode. Seems like developers would have their work cut out for them with all the different possible configurations. Linked mode makes the most sense to me for consistency and relative ease of implementation. Plus anyone using multiple GPUs is already used to running a pair of the same GPUs.

    Regardless of whether they go linked or unlinked though... I'd really like them to do something other than AFR. Split-frame, tile-based, something, anything. Blech.
  • Refuge - Monday, October 26, 2015 - link

    For high-end AAA titles linked mode would be optimal, I agree. It allows for their fast releases, and still gives a great performance boost. Their target demographic is already used to jumping through hoops to get the results they want. Requiring identical GPUs won't affect them.

    For games with extended lifetimes like MMOs such as WoW, SWTOR, etc., unlinked mode is worth the investment, as it allows your game to hit a MUCH wider customer base with increased graphical performance. These are crowds that are easy to poll for data, so developers would easily know who they are directing their efforts towards, and the lifespan of the game makes the extra man-hours a worthy investment.
  • Gadgety - Tuesday, October 27, 2015 - link

    @alexvrb And game testers have their work cut out for them as well, testing all sorts of hardware configurations.

    In addition, game developers will likely see the need for new skill sets, which will benefit larger outfits better able to cope with developing and tuning their games for various hardware combinations.
  • DanNeely - Tuesday, October 27, 2015 - link

    I suspect most small devs will continue to use their engine in the normal way, not taking advantage of most of the DX12 multi-GPU features any more than they did SLI/XFire in DX11 or prior. The only exception I see might be offloading post-processing to the IGP. That looks like a much simpler split to implement, and might be something they could get for free from the next version of their engine.
  • nightbringer57 - Monday, October 26, 2015 - link

    Wow. I didn't expect this to work this well.

    Just out of curiosity... Could you get a few more data points to show how a Titan X + Fury X/Fury X + Titan X would fare?
