GPU Scaling

Switching gears, let’s take a look at performance from a GPU standpoint, including how well Star Swarm performance scales with more powerful GPUs now that we have eliminated the CPU bottleneck. Until now Star Swarm has never been GPU-bottlenecked on high-end NVIDIA cards, so this is our first chance to see just how much faster Star Swarm can get before it runs into the limits of the GPU itself.

Star Swarm GPU Scaling - Extreme Quality (4 Cores)

As it stands, with the CPU bottleneck swapped out for a GPU bottleneck, Star Swarm currently favors NVIDIA GPUs. Even after accounting for the baseline performance differences between these cards, NVIDIA comes out well ahead here, with the GTX 980 beating the R9 290X by over 50% and the GTX 680 some 25% ahead of the R9 285, both margins well beyond their average leads in real-world games. With virtually every aspect of this test still under development – OS, drivers, and Star Swarm itself – we would advise against reading too much into this for now, but it will be interesting to see whether this trend holds with the final release of DirectX 12.

Meanwhile it’s interesting to note that largely due to their poor DirectX 11 performance in this benchmark, AMD sees the greatest gains from DirectX 12 on a relative basis and comes close to seeing the greatest gains on an absolute basis as well. The GTX 980’s performance improves by 150% and 40.1fps when switching APIs; the R9 290X improves by 416% and 34.6fps. As for AMD’s Mantle, we’ll get back to that in a bit.
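The distinction between relative and absolute gains above is simple arithmetic, and can be sketched as follows. Note that the DirectX 11 baseline figures below are back-calculated from the quoted deltas rather than read off the charts, so treat them as approximations:

```python
# Relative vs. absolute API gains. The DX11 baselines are back-calculated
# from the deltas quoted above, not taken from the charts, so they are
# approximations.

def gains(dx11_fps, dx12_fps):
    absolute = dx12_fps - dx11_fps           # fps gained by switching APIs
    relative = absolute / dx11_fps * 100.0   # percent gained over DX11
    return absolute, relative

# GTX 980: +40.1 fps quoted as a ~150% gain implies a ~26.7 fps DX11 baseline
abs_980, rel_980 = gains(26.7, 66.8)
# R9 290X: +34.6 fps quoted as a ~416% gain implies a ~8.3 fps DX11 baseline
abs_290x, rel_290x = gains(8.3, 42.9)

print(f"GTX 980:  +{abs_980:.1f} fps ({rel_980:.0f}%)")
print(f"R9 290X: +{abs_290x:.1f} fps ({rel_290x:.0f}%)")
```

This is why a card can post the largest relative gain while a different card posts the largest absolute gain: a small DX11 baseline inflates the percentage even when the raw fps delta is smaller.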

Star Swarm GPU Scaling - Extreme Quality (2 Cores)

Having already established that even two CPU cores are enough to keep Star Swarm fed on anything less than a GTX 980, the results are much the same here for our 2-core configuration. Other than the GTX 980 being CPU-limited, the gains from enabling DirectX 12 are consistent with what we saw for the 4-core configuration, which is to say that even a relatively weak CPU can benefit from DirectX 12, at least when paired with a strong GPU.

However the GTX 750 Ti result in particular also highlights the fact that until a powerful GPU comes into play, the benefits of DirectX 12 today aren’t nearly as great. Though the GTX 750 Ti does improve in performance by 26%, this is a far cry from the 150% of the GTX 980, or even the gains for the GTX 680. While AMD is terminally CPU-limited here, NVIDIA can get just enough out of DirectX 11 that a 2-core configuration can almost feed the GTX 750 Ti. Consequently in the NVIDIA case, a weak CPU paired with a weak GPU does not currently see the same benefits that we get elsewhere. However, as DirectX 12 is meant to be forward-looking – released before it’s too late – and as GPU performance gains continue to outstrip CPU performance gains, the benefits even for low-end configurations will continue to increase.

Comments

  • Archetype - Saturday, August 1, 2015 - link

    They did not optimize the hell out of it for NVIDIA. They just added DX12 support. Originally it was just DX11 on any card, with Mantle supported on AMD. It is not a game – it's a tech demo.

    The 980 obviously has significant horsepower. I am just unsure why they used a 290X and not the current flagship. But maybe the 290X is still supposed to be more powerful.
  • Freosan - Sunday, August 2, 2015 - link

    Archetype. When this was published, the 290x WAS the current AMD flagship.
  • Azix - Monday, August 3, 2015 - link

    flagship was 295x2
  • Archetype - Saturday, August 1, 2015 - link

    They explained it quite clearly. Star Swarm was written specifically to have zounds of draw calls, and as such a high-level API (layers and layers of API between the game code and the hardware that draws the scenes) will not be able to deal with it well. That was the whole point of Mantle and now DX12: to remove some of those layers of API and give software more direct access to the underlying drivers, and through them the hardware. You really need to get more informed.
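The overhead argument in the comment above can be put in rough numbers. A toy model (all per-call costs here are illustrative assumptions for the sake of the sketch, not measurements of any real driver):

```python
# Toy model of draw-call overhead: each call that passes through a thick
# API stack pays a fixed CPU cost, so per-frame CPU time spent issuing
# draw calls grows linearly with the call count. The per-call costs below
# are purely illustrative, not measured values.

US_PER_MS = 1000.0

def frame_cpu_ms(draw_calls, overhead_us_per_call):
    """CPU time per frame spent just issuing draw calls, in milliseconds."""
    return draw_calls * overhead_us_per_call / US_PER_MS

calls = 10_000                        # a Star-Swarm-like draw-call count
thick = frame_cpu_ms(calls, 10.0)     # high-level API: assume ~10 us/call
thin  = frame_cpu_ms(calls, 1.0)      # low-level API:  assume ~1 us/call

budget_60fps_ms = 1000.0 / 60.0       # ~16.7 ms per frame at 60 fps
print(f"thick API: {thick:.0f} ms/frame (60fps budget: {budget_60fps_ms:.1f} ms)")
print(f"thin  API: {thin:.0f} ms/frame")
```

Under these assumed numbers, the thick-API path blows well past a 60 fps frame budget on submission overhead alone, while the thin-API path leaves most of the frame for actual work – which is the CPU-bottleneck behavior the benchmark is designed to expose.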
  • Flunk - Friday, February 6, 2015 - link

    That might annoy me more if they weren't giving everyone on Windows 7 and 8 an upgrade to 10 for nothing. I suppose not having to backport this to Windows 8 (as was originally announced) is probably saving a fair amount.
  • dakishimesan - Friday, February 6, 2015 - link

    Sorry for my laziness, I was actually just pointing out a spelling mistake; I think it should say because DirectX 12… Etc.

    But for what it's worth, I agree with you: with Windows 10 being a free upgrade, and only enthusiasts really caring about DirectX 12 on their current high-performance gear, I have no problem with Microsoft doing it this way.
  • Christopher1 - Monday, February 16, 2015 - link

    Exactly. I understood why people were reluctant to upgrade to Windows 7 and 8, because those were paid upgrades, but with Windows 10 being a free upgrade if you have Windows 7 or 8? No reason not to upgrade.
  • yuhong - Friday, February 6, 2015 - link

    Yea, the old DirectX redists are long dead.
  • Wwhat - Saturday, February 7, 2015 - link

    I'm highly suspicious of Windows 10.
    There is a reason why they give it away for free. As the saying goes, "if you're not paying for it, you're the product", which is generally true these days when it comes from a commercial outfit.
  • damianrobertjones - Saturday, February 7, 2015 - link

    The more that people use the Windows store the more profit MS makes. Simple.
