DirectX 9 vs. DirectX 10

Here we'll take a closer look at some of the scaling differences between DirectX 9 and DirectX 10 on current hardware under current drivers with Company of Heroes and Lost Planet.

First up is a look at relative scaling between cards under each API. The idea is to see whether cards that perform better under DX9 also perform better under DX10 (and vice versa). This will only give us a glimpse at what could happen going forward, as every game (and every implementation of that game) will be different.
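Relative scaling of this sort is just a ratio of frame rates. As a minimal sketch of the metric (the frame rates below are hypothetical placeholders, not our benchmark numbers):

```python
def dx10_retention(dx9_fps: float, dx10_fps: float) -> float:
    """Percent of a card's DX9 performance retained when running the DX10 path."""
    return 100.0 * dx10_fps / dx9_fps

# Hypothetical example: a card that falls from 60 fps under DX9
# to 24 fps under DX10 retains only 40% of its DX9 performance.
print(dx10_retention(60.0, 24.0))
```

Comparing this percentage across cards shows whether one vendor's hardware holds up better under the new API, independent of its absolute frame rates.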

Company of Heroes

For Company of Heroes, we see huge performance drops in moving from DirectX 9 to DirectX 10. The new lighting and shadowing techniques, combined with liberal geometry shader use, are responsible for at least halving performance when running the more detailed DX10 path. NVIDIA seems to handle the new features Relic added better than AMD does. These results are especially impressive considering that NVIDIA already outperformed AMD hardware under DX9.

Lost Planet

Lost Planet is a completely different animal. With Capcom going for a performance boost under DX10, we can see that they actually succeeded with the top-of-the-line NVIDIA cards. There isn't much else enticing about the DX10 version of Lost Planet, and it's clear that AMD's drivers haven't been optimized for this game quite yet.

Next we want to take a look at AA scaling difference between DirectX 9 and DirectX 10. Can we expect less impact from AA on one API or the other? Let's take a look.
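The AA comparison works the same way under either API: measure the frame rate with and without 4xAA and express the loss as a percentage. A small sketch, again with hypothetical numbers rather than our measured results:

```python
def aa_hit(no_aa_fps: float, aa_fps: float) -> float:
    """Percent of performance lost when antialiasing is enabled."""
    return 100.0 * (no_aa_fps - aa_fps) / no_aa_fps

# Hypothetical example: a smaller percentage hit under DX10 than
# under DX9 would indicate better AA scaling on that API.
dx9_hit = aa_hit(50.0, 30.0)   # 40% drop under DX9
dx10_hit = aa_hit(40.0, 28.0)  # 30% drop under DX10
```

A lower percentage means AA is cheaper on that path, even if the absolute frame rate is lower.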

Call of Juarez

Under Call of Juarez, our low-end NVIDIA cards suffer a huge drop in performance when AA is enabled. This is likely because they can't handle either the bandwidth or the shader requirements of Techland's HDR-correct AA. The higher-end parts seem to handle the AA method fairly well, though NVIDIA would certainly be happier if they had retained their hardware AA advantage.

Company of Heroes

For our DX10 Company of Heroes test, which does use hardware MSAA resolve where available, AMD hardware scales much worse than NVIDIA hardware.

All of our cards scale worse when enabling 4xAA under DX9 than under DX10. While we don't have enough information to fully explain this behavior in Company of Heroes, it is certainly interesting to see some sort of across-the-board performance advantage for DX10 (even if it is in a roundabout way).

Lost Planet

Lost Planet, with its attempt to improve performance by moving to DX10, shows a very similar performance impact from AA under either DX9 or DX10. Again we see a very slight scaling advantage in favor of DX10 (especially with AMD hardware), but nothing life-changing.

59 Comments

  • slickr - Monday, July 09, 2007

    Great review, that's what we all need to get Nvidia and ATI to stop bitching around and stealing our money with slow hardware that can't even outperform last generation's hardware. If you ask me, the 8800 Ultra should be the middle $150 class here, and the top end should be some graphics card with 320 stream processors, 1GB of GDDR4 clocked at 2.4GHz, and a 1000MHz core clock. Same for AMD: they need the X2900XT to be the middle $150 class, and top of the line should be some graphics card with 640 stream processors, 1GB of GDDR4 at 2.4GHz, and a 1000MHz core clock!

    More reviews like this, please, so we can show ATI and Nvidia that we won't buy their hardware if it's not good!

  • ielmox - Tuesday, July 24, 2007

    I really enjoyed this review. I have been agonizing over selecting an affordable graphics card that will give me the kind of value I enjoyed for years from my trusty and cheap GF5900xt (which runs Prey, Oblivion, and EQ2 at decent quality and frame rates) and I am just not seeing it.

    I'm avoiding ATI until they bring their power use under control and generally get their act together. I'm avoiding nVidia because they're gouging the hell out of the market. And the previous-generation nVidia hardware is still quite costly, because nVidia knows very well that they've not provided much of an upgrade with the 8xxx family unless you are willing to pay the high prices for the 8800 series (what possessed them to use a 128-bit bus on everything below the 8800?? Did they WANT their hardware to be crippled?).

    As a gamer who doesn't want to be a victim of the "latest and greatest" trends, I want affordable performance and quality and I don't really see that many viable options. I believe we have this half-baked DX10 and Vista introduction to thank for it - system requirements keep rocketing upwards unreasonably but the hardware economics do not seem to be keeping pace.

  • AnnonymousCoward - Saturday, July 07, 2007

    Thanks Derek for the great review. I appreciate the "%DX10 performance of DX9" charts, too.
  • Aberforth - Thursday, July 05, 2007

    This article is ridiculous. Why would Nvidia and other DX10 developers want gamers to buy a G80 card for high DX10 performance? DX10 is all about optimization; the performance factor depends on how well it is implemented, not on blindly using APIs. Vista's driver model is different, and DX10 is different. The present state of Nvidia's drivers is horrible; we can't even think about DX10 performance at this stage.

    The DX10 version of Lost Planet runs horribly even though it is not graphically different from the DX9 version. So this isn't DX10's or the GPUs' fault; it's all about the code and the drivers. Also, the CEO of Crytek has confirmed that an Nvidia 8800 (possibly the 8800 GTS) and an E6600 CPU can max Crysis in DX10 mode.

    A long time back, when DX9 came out, I remember reading an article about how it sucked badly. So I'm definitely not gonna buy this one.
  • titan7 - Thursday, July 12, 2007

    No, it's not about sucky code or sucky drivers. It's about shaders. Look at how much faster cards with more shader power are in d3d9. Now in d3d10 longer, richer, prettier shaders are used that take more power to process.

    It's not about optimization this time as the IHVs have already figured out how to write optimized drivers, it's about raw FLOPS for shader performance.
  • DerekWilson - Thursday, July 05, 2007

    DX9 performance did (and does) "suck badly" on early DX9 hardware.

    DX10 is a good thing, and pushing the limits of hardware is a good thing.

    Yes, drivers and game code can be rocky right now, but NVIDIA's 162 drivers are quite stable and NV is confident in their performance. Lost Planet shows that NV's DX10 drivers are at least getting close to parity with DX9.

    This isn't an article about DX10 not being good, it's an article about early DX10 hardware not being capable of delivering all that DX10 has to offer.

    Which is as true now as it was about early DX9 hardware.
  • piroroadkill - Friday, July 06, 2007

    Wait, performance on the Radeon 9700 Pro sucked? I seem to remember DirectX 9 games from several years later still being playable...
  • DerekWilson - Saturday, July 07, 2007

    yeah, 9700 pro sucks ... when actually running real world DX9 code.

    Try running BF2 at any playable setting (100% view distance, high shadows and lighting). This is really where games started using DX9 (to my knowledge, BF2 was actually the first game to require DX9 support to run).

    But many other games still include the ability to run 1.x shaders rather than 2.0 ... Like Oblivion can turn the detail way down to the point where there aren't any DX9-heavy features running. But if you try to enable them on a 9700 Pro, it will not run well at all. I actually haven't tested Oblivion at the lowest quality, so I don't know if it can be playable on a 9700 Pro, but if it is, it wouldn't even be the same game (visually).
  • DerekWilson - Saturday, July 07, 2007

    BTW, BF2 was released less than 3 years after the 9700 Pro ... (Aug '02 to June '05) ...
  • Aberforth - Thursday, July 05, 2007

    Fine...

    Just want to know why a DX10 game called Crysis was running at 2048x1536 with 60+ FPS on a GeForce 8800 GTX.

    crysis-online.com/?id=172
