Star Swarm & The Test

For today’s DirectX 12 preview, Microsoft and Oxide Games have supplied us with a newer version of Oxide’s Star Swarm demo. Originally released in early 2014 as a demonstration of Oxide’s Nitrous engine and the capabilities of Mantle, Star Swarm is a massive space combat demo that is designed to push the limits of high-level APIs and demonstrate the performance advantages of low-level APIs. Due to its use of thousands of units and other effects that generate a high number of draw calls, Star Swarm can push over 100K draw calls, a massive workload that causes high-level APIs to simply crumple.

Because Star Swarm generates so many draw calls, it is essentially a best-case scenario test for low-level APIs, exploiting the fact that high-level APIs can’t effectively spread the draw call workload across several CPU threads. As a result the performance gains from DirectX 12 in Star Swarm are going to be much greater than in most (if not all) real games, but it is nonetheless an effective tool for demonstrating the performance capabilities of DirectX 12 and showcasing how the API can better distribute work over multiple CPU threads.
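To give a rough idea of what that multi-threaded submission looks like, the sketch below shows the general DirectX 12 pattern of recording command lists on several worker threads and then submitting them from a single thread. To be clear, this is our own minimal illustration rather than anything from Oxide’s engine; it assumes an already-created device and command queue, and it omits all pipeline, root signature, and resource setup.

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    void RecordFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                               unsigned workerCount)
    {
        std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
        std::vector<std::thread>                       workers;

        for (unsigned i = 0; i < workerCount; ++i)
        {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocators[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocators[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));

            // Each worker records its own command list; no global driver lock
            // serializes draw call generation across threads.
            workers.emplace_back([i, &lists]
            {
                // ... record this thread's share of the scene's draw calls here
                // (root signature, PSO, and resource bindings omitted) ...
                lists[i]->Close();
            });
        }

        for (auto& w : workers) w.join();

        // Submission itself remains a single, cheap call from one thread.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }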

It should be noted that while Star Swarm itself is a synthetic benchmark, the underlying Nitrous engine is relevant and is being used in multiple upcoming games. Stardock is using the Nitrous engine for their forthcoming Star Control game, and Oxide is using the engine for their own game, set to be announced at GDC 2015. So although Star Swarm is still a best-case scenario, many of its lessons will be applicable to these future games.

As for the benchmark itself, we should also note that Star Swarm is a non-deterministic simulation. The benchmark is based on having two AI fleets fight each other, and as a result the outcome can differ from run to run. The good news is that the benchmark’s RTS mode keeps run-to-run variation low enough to produce reasonably consistent results; individual runs will still show some fluctuation, but the benchmark reliably demonstrates the larger performance trends.
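Dealing with that variation is simply a matter of repeating the benchmark and looking at the spread across runs. As a trivial sketch of the idea (the per-run FPS values below are made up for illustration, not actual results), one could summarize a set of runs like this:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Hypothetical per-run average FPS values; the real numbers drift
        // because the AI battle plays out differently every time.
        std::vector<double> runs = { 26.1, 27.4, 25.8, 26.9 };

        double sum = 0.0;
        for (double r : runs) sum += r;
        const double mean = sum / runs.size();

        double var = 0.0;
        for (double r : runs) var += (r - mean) * (r - mean);
        const double stddev = std::sqrt(var / (runs.size() - 1)); // sample std dev

        std::printf("mean %.1f fps, std dev %.2f fps over %zu runs\n",
                    mean, stddev, runs.size());
    }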


Star Swarm RTS Mode

The Test

For today’s preview Microsoft, NVIDIA, and AMD have provided us with the necessary WDDM 2.0 drivers to enable DirectX 12 under Windows 10. The NVIDIA driver is 349.56 and the AMD driver is 15.200. At this time we do not know when these early WDDM 2.0 drivers will be released to the public, though we would be surprised not to see them released by the time of GDC in early March.

In terms of bugs and other known issues, Microsoft has informed us that there are some known memory and performance regressions in the current WDDM 2.0 path that have since been fixed in interim builds of Windows. In particular, the WDDM 2.0 path may see slightly lower performance than the WDDM 1.3 path on older drivers, and there is an issue with memory exhaustion. For this reason Microsoft has suggested that a 3GB card is required to use the Star Swarm DirectX 12 binary, although in our tests we have been able to run it on 2GB cards seemingly without issue. Meanwhile, DirectX 11 deferred context support is currently broken in the combination of Star Swarm and NVIDIA's drivers, causing Star Swarm to immediately crash, so these results are with D3D11 deferred contexts disabled.
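For context, deferred contexts are D3D11’s existing mechanism for multi-threaded command recording: worker threads record into deferred contexts, and the resulting command lists are replayed on the immediate context. The sketch below is our own minimal illustration of that pattern (error handling and all state setup omitted), not code from Star Swarm; even when it works, the driver can still serialize much of the real work at playback time, which is part of why its gains are far smaller than DirectX 12’s.

    #include <d3d11.h>
    #include <wrl/client.h>
    #include <thread>

    using Microsoft::WRL::ComPtr;

    void RecordWithDeferredContext(ID3D11Device* device,
                                   ID3D11DeviceContext* immediateContext)
    {
        ComPtr<ID3D11DeviceContext> deferred;
        device->CreateDeferredContext(0, &deferred);

        ComPtr<ID3D11CommandList> commandList;

        // A worker thread records API calls into the deferred context.
        std::thread worker([&]
        {
            // ... issue this thread's draw calls on 'deferred' here ...
            deferred->FinishCommandList(FALSE, &commandList);
        });
        worker.join();

        // The immediate context replays the recording; under D3D11 the driver
        // may still end up doing much of the real work here, on a single thread.
        immediateContext->ExecuteCommandList(commandList.Get(), TRUE);
    }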

For today’s article we are looking at a small range of cards from both AMD and NVIDIA to showcase both performance and compatibility. For NVIDIA we are looking at the GTX 980 (Maxwell 2), GTX 750 Ti (Maxwell 1), and GTX 680 (Kepler). For AMD we are looking at the R9 290X (GCN 1.1), R9 285 (GCN 1.2), and R7 260X (GCN 1.1). As we mentioned earlier, support for Fermi and GCN 1.0 cards will be forthcoming in future drivers.

Meanwhile on the CPU front, to showcase the performance scaling of Direct3D we are running the bulk of our tests on our GPU testbed with 3 different settings to roughly emulate high-end Core i7 (6 cores), i5 (4 cores), and i3 (2 cores) processors. Unfortunately we cannot control for our 4960X’s L3 cache size; however, that should not be a significant factor in these benchmarks.

DirectX 12 Preview CPU Configurations (i7-4960X)
Configuration        Emulating
6C/12T @ 4.2GHz      Overclocked Core i7
4C/4T @ 3.8GHz       Core i5-4670K
2C/4T @ 3.8GHz       Core i3-4370

Though not included in this preview, AMD’s recent APUs should slot between the 2 and 4 core options thanks to the design of AMD’s CPU modules.
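As an aside, our core-count scaling is configured at the testbed level (cores/threads and clocks), but the general idea of constraining a workload to fewer logical processors can also be sketched in code. The snippet below is purely a hypothetical illustration using the Win32 affinity API; it is not how our testing was set up, and it cannot change clocks, cache, or core topology.

    #include <windows.h>
    #include <cstdio>

    // Pin the current process to the first 'logicalProcessors' CPUs. This is a
    // hypothetical illustration only: it does not change clocks, cache, or core
    // topology, and it is not how the testbed in this article was configured.
    bool LimitToLogicalProcessors(unsigned logicalProcessors)
    {
        const unsigned maxBits = sizeof(DWORD_PTR) * 8;
        DWORD_PTR mask = (logicalProcessors >= maxBits)
            ? ~static_cast<DWORD_PTR>(0)
            : ((static_cast<DWORD_PTR>(1) << logicalProcessors) - 1);
        return SetProcessAffinityMask(GetCurrentProcess(), mask) != 0;
    }

    int main()
    {
        // e.g. approximate a 2C/4T part by allowing four logical processors
        if (!LimitToLogicalProcessors(4))
            std::printf("SetProcessAffinityMask failed\n");
    }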

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 285
AMD Radeon R7 260X
NVIDIA GeForce GTX 980
NVIDIA GeForce GTX 750 Ti
NVIDIA GeForce GTX 680
Video Drivers: NVIDIA Release 349.56 Beta
AMD Catalyst 15.200 Beta
OS: Windows 10 Technical Preview 2 (Build 9926)

Finally, while we’re going to take a systematic look at DirectX 12 from both a CPU standpoint and a GPU standpoint, we may as well answer the first question on everyone’s mind: does DirectX 12 work as advertised? The short answer: a resounding yes.

Star Swarm GPU Scaling - Extreme Quality (4 Cores)

Comments

  • OrphanageExplosion - Tuesday, February 10, 2015 - link

    Sony already has its low-level API, GNM, plus its DX11-a-like GNMX, for those who need faster results, easier debugging, handling of state etc.
  • Gigaplex - Monday, February 9, 2015 - link

    The consoles already use bare metal APIs. This will have minimal impact for consoles.
  • Bill McGann - Tuesday, February 10, 2015 - link

    D3D11.x is garbage compared to GNM (plus the XbOne HW is garbage compared to the PS4). The sad fact is that the XbOne desperately needs D3D12(.x) in order to catch up to the PS4.
  • Notmyusualid - Monday, February 9, 2015 - link

    Hello all... couldn't find any MULTI-GPU results...

    So here are mine:

    System: M18x R2 3920XM (16GB RAM 1866MHz CAS10), Crossfire 7970Ms.
    Test: Star Swarm Perf demo RTS, all Extreme settings.
    Ambient temp 31C, Skype idling in background, and with CPU ALL at stock settings:

    Mantle: fps max 99.03 min 9.65 AVG 26.5
    DX11: fps max 141.56 min 3.26 AVG 11.6

    CPU at 4.4GHz x 4 cores, fans 100%, skype in background again:

    Mantle: fps max 51.03 min 7.26 AVG 26.2 - max temp 94C
    DX11: fps max 229.06 min 3.6 AVG 12.3 - max temp 104C, thus CPU throttled at 23%.

    So I guess there are good things coming to PC gamers with multi-GPU setups then. Alas, being a mere mortal, I've no access to DX12 to play with...

    Anyone care to offer some SLI results in return?
  • lamebot - Monday, February 9, 2015 - link

    I have been reading this article page by page over the weekend, giving myself time to take it in. The frame times from each vendor under DX11 are very interesting. Thank you for a great, in-depth article! This is what keeps me coming back.
  • ChristTheGreat - Monday, February 9, 2015 - link

    Can it be only a Windows 10 driver problem or optimisation issue causing these results with the R9 290/X? On Windows 8.1, with a 4770K at 4.3GHz and an R9 290 at 1100/1400, I get 40 avg in D3D11 and 66 avg in Mantle. Scenario: follow, Detail: extreme...
  • ChristTheGreat - Monday, February 9, 2015 - link

    Oh well, I saw that you guys use the Asus PQ321, so running at 3840 x 2160?
  • Wolfpup - Monday, February 9, 2015 - link

    Good, can Mantle go away now? The idea of moving BACK to proprietary APIs is a terrible one, unless the only point of it was to push Microsoft to make these fixes in DirectX.

    "Stardock is using the Nitrous engine for their forthcoming Star Control game"

    I didn't know! This is freaking awesome news. I love love love Star Control 2 on the 3DO. Liked 1 on the Genesis too, but SC2 on the 3DO (which I think is better than the PC version) is not unlike Starflight, and somewhat like Mass Effect.
  • Wolfpup - Monday, February 9, 2015 - link

    Forgot to mention... I also think AMD needs to worry about making sure their DirectX/OpenGL drivers are flawless across years of hardware before wasting time on a proprietary API too...
  • lordken - Monday, February 9, 2015 - link

    Sorry, but are you just an AMD hater or something?
    Imho AMD did a great thing with Mantle and possibly pushed M$ to come out with DX12 (or not, we will never know). But that aside, how about considering some of the advantages Mantle has?
    I think that when they bring it to Linux (they said they would, not sure about the current status) it will be a nice advantage, as I would guess a native API would work better than Wine etc. With more recent games (thanks to new engines) coming to Linux, they would probably benefit more if they ran on Mantle.

    And nonetheless, how about people who won't move to Win10 (for whatever reason)? Mantle on Win7/8 would still be a huge benefit for such people. Not to mention that there will probably be more games with Mantle than with DX12 in the very near future.
    Plus, if they work out the console & PC Mantle aspect, it could bring better console ports to PC; even if game devs were too lazy to do proper optimization, Mantle should pretty much eliminate this aspect (though only for AMD GPUs).
    But either way I don't see much reason why Mantle should go. I mean, once it is included in all/most bigger engines (already in CryEngine, Frostbite, Unreal) and it works, what reason would there be to trash something that is useful?

    btw, funny how "M$ would need to do a huge kernel rework to bring DX12 to Win7/8" while Mantle, which does a similar thing, is easily capable of being "OS version independent" (sure, it is AMD specific, but still).

    PS: What about AMD drivers? They are fine imho, never had a problem in recent years. (You see, personal experience is not a valid argument.)
