First Thoughts

Wrapping up our first look at Ashes of the Singularity and DirectX 12 Explicit Multi-Adapter: when Microsoft first unveiled the technology back at BUILD 2015, I figured it would only be a matter of time until someone put together a game utilizing it. After all, Epic and Square Enix already had their tech demos up and running. However, with the DirectX 12 ecosystem still coming together here in the final months of 2015 – and that goes for games as well as drivers – I wasn't expecting something quite this soon.

As it stands, the Ashes of the Singularity multi-GPU tech demo is just that: a tech demo for a game that is itself still in alpha testing. There are still optimizations to be made and numerous bugs to be squashed. But despite all of that, seeing AMD and NVIDIA video cards working together to render a game is damn impressive.

Seeing as this build of Ashes is a tech demo, I'm hesitant to read too much into the precise benchmark numbers we're seeing. That said, the fact that the fastest multi-GPU configuration was a mixed AMD/NVIDIA pairing was something I wasn't expecting, and it definitely makes the results all the more interesting. DirectX 11 games are going to be around for a while longer yet, so we're likely still some time away from a mixed-GPU gaming setup being truly viable, but it will be interesting to see just what Oxide and other developers can pull off with explicit multi-adapter as they become more familiar with the technology and implement more advanced rendering modes.

Meanwhile it’s interesting to note just how far the industry as a whole has come since 2005 or even 2010. GPU architectures have become increasingly similar and tighter API standards have greatly curtailed the number of implementation differences that would prevent interoperability. And with Explicit Multi-Adapter, Microsoft and the GPU vendors have laid down a solid path for allowing game developers to finally tap the performance of multiple GPUs in a system, both integrated and discrete.
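To give a sense of what that path looks like in practice, below is a minimal sketch – not Oxide's code, just the standard DXGI/D3D12 calls – of the first step any explicit multi-adapter application takes: enumerating every GPU in the system and creating an independent D3D12 device on each one. Under explicit multi-adapter, the integrated and discrete GPUs simply show up as entries in the same adapter list.

```cpp
// Minimal sketch: enumerate all DXGI adapters and create a D3D12 device
// on each one. Link against d3d12.lib and dxgi.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Any adapter that can create a feature level 11_0 device is a
        // candidate for explicit multi-adapter work.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Created D3D12 device on: %s\n", desc.Description);
            devices.push_back(device);
        }
    }

    // From here everything is up to the application: command queues per
    // device, cross-adapter resource sharing, and GPU-to-GPU synchronization.
    return 0;
}
```

Notably, unlike implicit SLI or CrossFire, nothing here requires the adapters to match: an AMD card, an NVIDIA card, and an Intel iGPU can all end up in the same list.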

The timing couldn't be any better either. With integrated GPUs having consumed the low-end GPU market and both CPU vendors devoting more die space than ever to their respective integrated GPUs, using a discrete GPU leaves an increasingly large amount of silicon unused in the modern gaming system. Explicit multi-adapter isn't a silver bullet for that problem, but it is a means of finally putting the integrated GPU to good use even when it's not a system's primary GPU.

With that said, it's important to note that what happens from here is ultimately more in the hands of game developers than hardware developers. Given the nature of the explicit API, it's now the game developers who have to do most of the legwork of implementing multi-GPU support, and I'm left to wonder how many of them are up to the challenge. Hardware vendors have an obvious interest in promoting and developing multi-GPU technology in order to sell more GPUs – which is how we got SLI and CrossFire in the first place – but software developers don't have that same incentive.
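As an example of that legwork, here is a hypothetical helper – assuming the devices and queues from the enumeration sketch above – showing the cross-adapter synchronization an application must now manage itself, using D3D12's shared fences to make one GPU's queue wait on another's:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: make queueB (on GPU B) wait until queueA (on GPU A)
// has signaled completion of its share of the frame. Error handling is
// omitted for brevity, and in real code the fence pair would be created
// once at startup rather than on every call.
void SyncAcrossAdapters(ID3D12Device* deviceA, ID3D12CommandQueue* queueA,
                        ID3D12Device* deviceB, ID3D12CommandQueue* queueB,
                        UINT64 frameFenceValue) {
    ComPtr<ID3D12Fence> fenceA, fenceB;
    HANDLE shared = nullptr;

    // Create a fence on GPU A that can be shared across adapters...
    deviceA->CreateFence(0,
        D3D12_FENCE_FLAG_SHARED | D3D12_FENCE_FLAG_SHARED_CROSS_ADAPTER,
        IID_PPV_ARGS(&fenceA));
    deviceA->CreateSharedHandle(fenceA.Get(), nullptr, GENERIC_ALL,
                                nullptr, &shared);
    // ...and open that same fence on GPU B.
    deviceB->OpenSharedHandle(shared, IID_PPV_ARGS(&fenceB));
    CloseHandle(shared);

    // GPU A signals when its work is done; GPU B waits on that value before
    // consuming the shared result. The driver no longer does this for you.
    queueA->Signal(fenceA.Get(), frameFenceValue);
    queueB->Wait(fenceB.Get(), frameFenceValue);
}
```

Multiply that by cross-adapter resource copies, per-device pipeline state, and load balancing between mismatched GPUs, and the scale of the task facing developers becomes clear.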

Ultimately, as gamers, all we can do is take a wait-and-see approach to the whole matter. But as DirectX 12 game development ramps up, I am cautiously optimistic that positive experiences like Ashes will help encourage other developers to plan for multi-adapter support as well.

Comments

  • Badelhas - Monday, October 26, 2015 - link

Great article, but... this is all very nice and everything, but what I really miss is a PC game with a graphics breakthrough, like Crysis when it was launched back in 2007. None of the games I've seen in the meantime had that WOW factor. I blame consoles.
  • tipoo - Monday, October 26, 2015 - link

    Star Citizen maybe? Pretty good at bringing top systems to their knees. Though yeah, nothing is the singular leader like Crysis in 2007 was, but that's also a product of every engine getting up to a good level.
  • Nfarce - Monday, October 26, 2015 - link

My thoughts exactly. The jump from games like HL2 and BF2 to Crysis was like a whole new world. Unfortunately it was such a big hit on performance that only the deepest pockets could afford to play that game at full capability. I wasn't able to do it until 2009 when building a new rig, and even then I wasn't able to get 60fps at 1080p.
  • Marc HFR - Monday, October 26, 2015 - link

    Hello,

In AFR, isn't the (small) difference in rendering between AMD and NVIDIA annoying?
  • PVG - Monday, October 26, 2015 - link

    Most interesting would be to see results with new+old GPUs. As in, "Should I keep my old card in tandem with my new one?"
  • extide - Monday, October 26, 2015 - link

    With AFR, no. If they do a different type of split where each card gets a different set of work to do, and one card gets more than the other, then yes.
  • Refuge - Tuesday, October 27, 2015 - link

I was under the impression that it had to be DX12-compatible to work.

    That cuts out 90% of the older GPUs out there.
  • DanNeely - Tuesday, October 27, 2015 - link

Most of the make-it-faster stuff in DX12 will work on DX11-capable hardware; the stuff that needs new hardware is relatively incidental. AMD intends to support all GCN cards, and NVIDIA as far back as the 4xx family (excluding any low-end rebadges). I'm not sure how far along they are with extending support back that far yet.
  • Refuge - Tuesday, October 27, 2015 - link

These new multi-GPU modes will require fully DX12-compliant cards though, correct?

    And thank you for the info, I was unaware of how far back support was going. I'm pleasantly surprised :D
  • rascalion - Monday, October 26, 2015 - link

    I'm excited to see how this technology plays out in the laptop space with small dGPU + iGPU working together.
