First Thoughts

Wrapping up our first look at Ashes of the Singularity and DirectX 12 Explicit Multi-Adapter: when Microsoft first unveiled the technology back at BUILD 2015, I figured it would only be a matter of time until someone put together a game utilizing it. After all, Epic and Square Enix already had their tech demos up and running. However, with the DirectX 12 ecosystem (games and drivers alike) still coming together here in the final months of 2015, I wasn’t expecting something quite this soon.

As it stands the Ashes of the Singularity multi-GPU tech demo is just that, a tech demo for a game that itself is only in Alpha testing. There are still optimizations to be made and numerous bugs to be squashed. But despite all of that, seeing AMD and NVIDIA video cards working together to render a game is damn impressive.

Seeing as this build of Ashes is a tech demo, I’m hesitant to read too much into the precise benchmark numbers we’re seeing. That said, the fact that the fastest multi-GPU configuration was a mixed AMD/NVIDIA pairing was something I wasn’t expecting, and it definitely makes the results all the more interesting. DirectX 11 games are going to be around for a while longer yet, so we’re likely still some time away from a mixed-GPU gaming setup being truly viable, but it will be interesting to see just what Oxide and other developers can pull off with explicit multi-adapter as they become more familiar with the technology and implement more advanced rendering modes.

Meanwhile it’s interesting to note just how far the industry as a whole has come since 2005 or even 2010. GPU architectures have become increasingly similar and tighter API standards have greatly curtailed the number of implementation differences that would prevent interoperability. And with Explicit Multi-Adapter, Microsoft and the GPU vendors have laid down a solid path for allowing game developers to finally tap the performance of multiple GPUs in a system, both integrated and discrete.
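To make that path a bit more concrete, below is a minimal sketch (not Oxide’s code, just an illustrative example) of the very first step any explicit multi-adapter title takes: enumerating every adapter in the system through DXGI and creating a Direct3D 12 device on each one, integrated and discrete alike.

```cpp
// Minimal sketch: enumerate all adapters and create a D3D12 device on each.
// Build note: link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    // A single DXGI factory sees every adapter in the system, regardless of vendor.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // Walk the adapter list (discrete boards and the integrated GPU alike)
    // and try to create a D3D12 device on each one.
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }

    // An explicit multi-adapter renderer would go on to create command queues
    // on each device and divide frame work between them.
    return devices.empty() ? 1 : 0;
}
```

The notable point is that nothing in this path is vendor-specific: the same loop picks up an AMD card, an NVIDIA card, and an Intel iGPU.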

The timing couldn’t be any better, either. As integrated GPUs have consumed the low-end GPU market and both CPU vendors devote more die space than ever to their respective integrated GPUs, using a discrete GPU leaves an increasingly large amount of silicon unused in the modern gaming system. Explicit multi-adapter isn’t a silver bullet for that problem, but it is a means of finally putting the integrated GPU to good use even when it’s not a system’s primary GPU.

With that said, it’s important to note that what happens from here is ultimately more in the hands of game developers than hardware developers. Given the nature of the explicit API, it’s now the game developers who have to do most of the legwork of implementing multi-GPU support, and I’m left to wonder how many of them are up to the challenge. Hardware developers have an obvious interest in promoting and developing multi-GPU technology in order to sell more GPUs (which is how we got SLI and CrossFire in the first place), but software developers don’t have that same incentive.
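As a rough illustration of what that legwork looks like, the sketch below uses Direct3D 12’s cross-adapter heap mechanism, one of the building blocks explicit multi-adapter rests on: memory allocated on one device is exported as a shared NT handle and opened on a second, so the two GPUs can copy data between them. The helper function here is hypothetical and simplified; a real renderer would still need to build all of the scheduling, synchronization, and resource management around it.

```cpp
// Hypothetical helper (simplified): create a heap on deviceA that deviceB can
// also open, giving both GPUs access to the same memory for cross-adapter copies.
// Build note: link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

HRESULT ShareHeapAcrossAdapters(ID3D12Device* deviceA, ID3D12Device* deviceB,
                                UINT64 sizeInBytes, // should be a multiple of 64KB
                                ComPtr<ID3D12Heap>& heapA,
                                ComPtr<ID3D12Heap>& heapB)
{
    // The heap must be flagged as shared and cross-adapter at creation time.
    D3D12_HEAP_DESC desc = {};
    desc.SizeInBytes = sizeInBytes;
    desc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    desc.Properties.CreationNodeMask = 1;
    desc.Properties.VisibleNodeMask = 1;
    desc.Flags = D3D12_HEAP_FLAG_SHARED | D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER;

    HRESULT hr = deviceA->CreateHeap(&desc, IID_PPV_ARGS(&heapA));
    if (FAILED(hr)) return hr;

    // Export the heap as an NT handle, then import it on the second device.
    HANDLE handle = nullptr;
    hr = deviceA->CreateSharedHandle(heapA.Get(), nullptr, GENERIC_ALL,
                                     nullptr, &handle);
    if (FAILED(hr)) return hr;

    hr = deviceB->OpenSharedHandle(handle, IID_PPV_ARGS(&heapB));
    CloseHandle(handle);
    return hr;
}
```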

Ultimately as gamers all we can do is take a wait-and-see approach to the whole matter. But as DirectX 12 game development ramps up, I am cautiously optimistic that positive experiences like Ashes will help encourage other developers to plan for multi-adapter support as well.


180 Comments


  • andrew_pz - Tuesday, October 27, 2015 - link

    Radeon placed in the x16 slot, GeForce installed in a x4 slot only. WHY?
    It's a cheat!
  • silverblue - Tuesday, October 27, 2015 - link

    There isn't a 4x slot on that board. To quote the specs...

    "- 4 x PCI Express 3.0 x16 slots (PCIE1/PCIE2/PCIE4/PCIE5: x16/8/16/0 mode or x16/8/8/8 mode)"

    Even if the GeForce was in an 8x slot, I really doubt it would've made a difference.
  • Ryan Smith - Wednesday, October 28, 2015 - link

    Aye. And just to be clear here, both cards are in x16 slots (we're not using tri-8 mode).
  • brucek2 - Tuesday, October 27, 2015 - link

    The vast majority of PCs, and 100% of consoles, are single GPU (or less). Therefore developers absolutely must ensure their game can run satisfactorily on one GPU, and have very little to gain from investing extra work in enabling multi-GPU support.

    To me this suggests that moving the burden of enabling multi-GPU support from hardware sellers (who can benefit from selling more cards) to game publishers (who basically have no real way to benefit at all) means that the only sane decision is to not invest any additional development or testing in multi-GPU support, and that therefore multi-GPU support will effectively be dead in the DX12 world.

    What am I missing?
  • willgart - Tuesday, October 27, 2015 - link

    Well... you no longer need to swap your card for a bigger one; you can just upgrade your PC with a low- or mid-range card to get a good boost, and you keep your old one. From a long-term point of view we win, not the hardware resellers.
    Imagine today you have a GTX 970; in 4 years you can get a GTX 2970 and have a stronger system than a single 2980 card... especially since the FPS/$ is very interesting.

    And a setup like the HD 7970 + GTX 680, which maybe costs $100 today(?), can be compared to a single GTX 980, which costs nearly $700...
  • brucek2 - Tuesday, October 27, 2015 - link

    I understand the benefit to the user. What I'm worried is missing is the incentive for the game developer. For them the new arrangement sounds like nothing but extra cost and likely extra technical support hassle to make multi-GPU work. Why would they bother? To use your example of a user with a 7970 + 680, the 680 alone would at least meet the console-equivalent settings, so they'd probably just tell you to use that.
  • prtskg - Wednesday, October 28, 2015 - link

    It would make their game run better and thus improve their brand name.
  • brucek2 - Wednesday, October 28, 2015 - link

    Making it run "better" implies it runs "worse" for the 95%+ of PC users (and 100% of console users) who do not have multi-GPU. That's a non-starter. The publisher has to make it a good experience for the overwhelmingly common case of single gpu or they're not going to be in business for very long. Once they've done that, what they are left with is the option to spend more of their own dollars so that a very tiny fraction of users can play the same game at higher graphics settings. Hard to see how that's going to improve their brand name more than virtually anything else they'd choose to spend that money on, and certainly not for the vast majority of users who will never see or know about it.
  • BrokenCrayons - Wednesday, October 28, 2015 - link

    You're not missing anything at all. Multi-GPU systems, at least in the case of there being more than one discrete GPU, represent a small number of halo desktop computers. Desktops, gaming desktops in particular, are already a shrinking market and even the large majority of such systems contain only a single graphics card. This means there's minimal incentive for a developer of a game to bother soaking up the additional cost of adding support for multi GPU systems. As developers are already cost-sensitive and working in a highly competitive business landscape, it seems highly unlikely that they'll be willing to invest the human resources in the additional code or soak up the risks associated with bugs and/or poor performance. In essence, DX12 seems poised to end multi GPU gaming UNLESS the dGPU + iGPU market is large enough in modern computers AND the performance benefits realized are worth the cost to the developers to write code for it. There are, after all, a lot more computers (even laptops and a very limited number of tablets) that contain an Intel graphics processor and an NV or more rarely an AMD dGPU. Though even then, I'd hazard a guess to say that the performance improvement is minimal and not worth the trouble. Plus most computers sold contain only whatever Intel happens to throw onto the CPU die so even that scenario is of limited benefit in a world of mostly integrated graphics processors.
  • mayankleoboy1 - Wednesday, October 28, 2015 - link

    Any idea what LucidLogix are doing these days?
    Last I remember, they had released some software solutions which reduced battery drain on Samsung devices (by dynamically decreasing the game rendering quality).
