The DirectX 12 Performance Preview: AMD, NVIDIA, & Star Swarm
by Ryan Smith on February 6, 2015 2:00 PM EST
First Thoughts
Bringing our preview of DirectX 12 to a close, what we’re seeing today is both a promising sign of what has been accomplished so far and a reminder of what is left to do. As it stands, much of DirectX 12’s story remains to be told – features, feature levels, developer support, and more will only be fully unveiled by Microsoft next month at GDC 2015. So today’s preview is much more of a beginning than an end when it comes to sizing up the future of DirectX.
But for the time being we’re finally at a point where we can say the pieces are coming together, and we can see parts of the bigger picture. Drivers, APIs, and applications are starting to arrive, giving us our first look at DirectX 12’s performance. And we have to say we like what we’ve seen so far.
With DirectX 12, Microsoft and its partners set out to create a cross-vendor but still low-level API, and while there was admittedly little doubt they could pull it off, there has always been the question of how well they could do it. What kind of improvements and performance could you truly wring out of a new API when it has to work across different products and can never entirely avoid abstraction? The answer, as it turns out, is that you can still enjoy all of the major benefits of a low-level API, not the least of which are the incredible improvements in CPU efficiency and multi-threading.
That said, any time we’re looking at an early preview it’s important to keep our expectations in check, and that is especially the case with DirectX 12. Star Swarm is a best-case scenario, and was designed to be one; it isn’t so much a measure of real-world performance as it is of technological potential.
But to that end, it’s clear that DirectX 12 has a lot of potential in the right hands and the right circumstances. It isn’t going to be easy to master, and I suspect it won’t be a quick transition, but I am very interested in seeing what developers can do with this API. With the reduced overhead, the better threading, and ultimately a vastly more efficient means of submitting draw calls, there’s a lot of potential waiting to be exploited.
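To put that last point in concrete terms, the headline change is that building command buffers – the CPU-heavy part of issuing draw calls – can be spread across worker threads and then handed to the GPU in one cheap submission. Below is a minimal sketch of that pattern against the public D3D12 interfaces. To be clear, this is our own illustration rather than anything from Star Swarm: RecordDraws is a hypothetical stand-in for an engine's scene traversal, and error handling and resource cleanup are omitted for brevity.

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Hypothetical app-side function: records one thread's share of the
// frame's draw calls into the given command list.
void RecordDraws(ID3D12GraphicsCommandList* list);

void SubmitFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                           unsigned threadCount)
{
    std::vector<ID3D12CommandAllocator*>    allocators(threadCount);
    std::vector<ID3D12GraphicsCommandList*> lists(threadCount);
    std::vector<std::thread>                workers;

    for (unsigned i = 0; i < threadCount; ++i) {
        // Each worker gets its own allocator and command list, so threads
        // never contend with one another while building GPU commands.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i], nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            RecordDraws(lists[i]); // the expensive CPU work, now in parallel
            lists[i]->Close();
        });
    }
    for (auto& t : workers)
        t.join();

    // One inexpensive submission of everything the workers built. Under
    // DX11 the equivalent work funnels through a single immediate context.
    std::vector<ID3D12CommandList*> submit(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}
```

In a real engine the allocators and lists would be pooled and reset each frame rather than created from scratch, but the structural point stands: nothing in this model requires a single render thread the way DX11's immediate context effectively does.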
Comments
Ryan Smith - Friday, February 6, 2015
Gen 7.5 graphics and up will support DX12. So that's Haswell/4th Gen Core and newer.
Murloc - Saturday, February 7, 2015
As you said, it's unlikely to bring much advantage. Just the usual new features from one version to the next.
inighthawki - Sunday, February 8, 2015
Remember that lower CPU requirements mean that there is more power available for the integrated GPU. There was an article that sort of described the behavior a while back: http://www.extremetech.com/gaming/187970-directx-1...
tipoo - Friday, February 6, 2015
I'm confused by this: "What we find is that Star Swarm and DirectX 12 are so efficient that only our most powerful card, the GTX 980, is not CPU bound even with 2 cores."
I'm not sure how the first part of that proves the second. Wouldn't more CPU efficiency more likely be shown in being GPU bound, not CPU bound?
tipoo - Friday, February 6, 2015
Yeah, having read the next few pages I think that should say "is CPU bound" rather than "not CPU bound", as the rest of the cards can be fed with just a 2-core CPU, while the 980 has headroom for more performance.
OrphanageExplosion - Friday, February 6, 2015
There are some pretty big differences in the CPU utilisation of the DX11 NVIDIA and AMD drivers. Maybe reviewing all GPUs with a high-end i7 isn't such a good idea, particularly on the lower-end and mainstream cards, which aren't likely to be paired with top-end processors?
yannigr2 - Friday, February 6, 2015
That's a very interesting article and a big victory for the Maxwell architecture. I hope AMD's 300 series will be more mature under DX12 and Mantle and perform much better than the 200 series. It will be extremely interesting to see an AMD FX in this test. Maybe the ugly duckling could transform into a swan?
200380051 - Friday, February 6, 2015
Your comment sums it up well. FX test, great idea.
zmeul - Friday, February 6, 2015
Quick question: why hasn't VRAM usage been taken into account?
Ryan Smith - Saturday, February 7, 2015
The short answer is that all of these cards have enough VRAM that it's not a real issue.