DirectX 9 vs. DirectX 10

Here we'll take a closer look at some of the scaling differences between DirectX 9 and DirectX 10 on current hardware under current drivers, using Company of Heroes and Lost Planet (with Call of Juarez joining in for our antialiasing comparison).

First up is a look at relative scaling between cards under each API. The idea is to see whether cards that perform better under DX9 also perform better under DX10 (and vice versa). This will only give us a glimpse at what could happen going forward, as every game (and every implementation of that game) will be different.
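
As a quick illustration of what we mean by relative scaling, the sketch below computes the fraction of DX9 performance each card retains under the DX10 path. The frame rates here are made up for the example, not our measured numbers; the cards that retain a larger fraction are the ones scaling better to the new API.

```cpp
#include <cstdio>

// Placeholder results for two hypothetical cards; these are not our
// measured numbers.
struct Result { const char* card; double dx9Fps; double dx10Fps; };

int main() {
    const Result results[] = {
        { "Card A", 90.0, 45.0 },
        { "Card B", 80.0, 32.0 },
    };
    for (const Result& r : results) {
        // Relative scaling: how much of the DX9 frame rate survives the
        // move to the more detailed DX10 path.
        double retained = 100.0 * r.dx10Fps / r.dx9Fps;
        std::printf("%s retains %.1f%% of its DX9 performance under DX10\n",
                    r.card, retained);
    }
    return 0;
}
```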

Company of Heroes

For Company of Heroes, we see huge performance drops in moving from DirectX 9 to DirectX 10. The new lighting and shadowing techniques, combined with liberal geometry shader use, are responsible for at least halving performance when running the more detailed DX10 path. NVIDIA seems to handle the new features Relic added better than AMD does. These results are especially impressive when you remember that NVIDIA already outperformed AMD hardware under DX9.

Lost Planet

Lost Planet is a completely different animal. With Capcom going for a performance boost under DX10, we can see that they actually succeeded with the top-of-the-line NVIDIA cards. There isn't much else enticing about the DX10 version of Lost Planet, and it's clear that AMD's drivers haven't been optimized to tackle this game quite yet.

Next we want to take a look at the differences in AA scaling between DirectX 9 and DirectX 10. Can we expect less of an impact from AA on one API or the other? Let's take a look.

Call of Juarez

Under Call of Juarez, our low-end NVIDIA cards suffer a huge drop in performance when AA is enabled. This is likely because they can't handle either the bandwidth or the shader requirements of Techland's HDR-correct AA. The higher end parts seem to handle the AA method fairly well, though certainly NVIDIA would be happier if they retained their hardware AA advantage.
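
To see why HDR-correct AA leans so heavily on shader power, here is a minimal sketch of the idea, with plain C++ standing in for the shader math. The Reinhard operator and the sample values are assumed purely for illustration; we don't know the exact operator Techland uses.

```cpp
#include <cstdio>

struct Color { float r, g, b; };

// Simple Reinhard tone mapping (x / (1 + x)) per channel -- an assumed
// stand-in for whatever operator the game actually applies.
static Color ToneMap(Color c) {
    return { c.r / (1.0f + c.r), c.g / (1.0f + c.g), c.b / (1.0f + c.b) };
}

// "HDR correct" resolve: tone map each MSAA sample first, then average.
// Averaging raw HDR values (the fixed-function hardware resolve) lets one
// very bright sample wash out the edge gradient AA is supposed to produce.
static Color ResolveHdrCorrect(const Color* samples, int count) {
    Color sum = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < count; ++i) {
        Color t = ToneMap(samples[i]);
        sum.r += t.r; sum.g += t.g; sum.b += t.b;
    }
    return { sum.r / count, sum.g / count, sum.b / count };
}

int main() {
    // One very bright sample next to three dark ones, as on a sky/geometry
    // edge: with 4xAA there are four of these per pixel, every frame.
    const Color samples[4] = { { 16.0f, 16.0f, 16.0f },
                               { 0.1f, 0.1f, 0.1f },
                               { 0.1f, 0.1f, 0.1f },
                               { 0.1f, 0.1f, 0.1f } };
    Color out = ResolveHdrCorrect(samples, 4);
    std::printf("resolved: %.3f %.3f %.3f\n", out.r, out.g, out.b);
    return 0;
}
```

Doing this per sample for every pixel is what pushes the cost of AA onto shader throughput and bandwidth, which is exactly where our low-end parts come up short.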

Company of Heroes

For our DX10 Company of Heroes test, which does use hardware MSAA resolve where available, AMD hardware scales much worse than NVIDIA hardware.
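
For reference, the hardware fast path being used here looks something like the following minimal Direct3D 10 sketch; the device and texture pointers are placeholders, and the FP16 format is just an example.

```cpp
#include <d3d10.h>

// Minimal sketch of a hardware MSAA resolve in Direct3D 10: one call
// averages all of the samples in each pixel using fixed-function hardware.
void ResolveMsaa(ID3D10Device* device,
                 ID3D10Texture2D* msaaRenderTarget, // multisampled source
                 ID3D10Texture2D* resolvedTexture)  // single-sample target
{
    device->ResolveSubresource(resolvedTexture, 0,
                               msaaRenderTarget, 0,
                               DXGI_FORMAT_R16G16B16A16_FLOAT);
}
```

When a game needs a custom resolve instead (a tone-mapped one, for example), it gives up this fast path and filters every sample in a shader, which is part of why AA behavior varies so much from title to title.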

All of our cards scale worse when enabling 4xAA under DX9 than under DX10. While we don't have enough information to really understand why that is in Company of Heroes, it is certainly interesting to see some sort of across-the-board performance advantage for DX10 (even if it comes in a roundabout way).

Lost Planet

Lost Planet, with its attempt to improve performance by moving to DX10, shows a very similar performance impact from AA under either API. Again we see a very slight scaling advantage in favor of DX10 (especially with AMD hardware), but nothing life-changing.

Comments

  • DerekWilson - Thursday, July 5, 2007 - link

    this is true -- our current information shows that AMD fares relatively worse against NVIDIA under DX10 than it does under DX9.
  • rADo2 - Thursday, July 5, 2007 - link

    "there are applications where the 2900 xt does outperform its competition" - where? 2900XT has 22FPS, 8800GTX 24FPS nad 8800ULTRA 26FPS. Despite "crippled for NVIDIA" / "paid by ATI" I see still green camp to outperform ATI.

    And in Lost Planet NVIDIA has 2x (!) the performance.

    It is not even worth considering ATI for purchase.
  • smitty3268 - Thursday, July 5, 2007 - link

    Read the comment again: the point is that the 2900XT does not compete against the 8800GTX; it competes against the 8800GTS, which it did outperform in that test.

    It certainly isn't the fastest card available, but I could also make a statement like "the GeForce 7300 Go outperforms its competition" without saying it's the fastest thing available. I'm just saying it beats the cards in a similar price range.
  • KeithTalent - Thursday, July 5, 2007 - link

    Um, what about price? Last time I checked, the 8800GTX still costs about $150 more than the 2900XT. I will not even bring up the Ultra, which is still way overpriced.

    So for $150 less you get a card that competes with the GTX some of the time and is more than capable of playing most games maxed out at high resolution. That is why the 2900XT is worth considering for purchase.

    KT

  • rADo2 - Thursday, July 5, 2007 - link

    The problem is, the 2900XT is often NOT playable. Its performance is sometimes close to NVIDIA's, sometimes 2x lower. And this applies to both DX9 and DX10. Of the 60+ games I own, I can assume ATI would "suck" on 30 of them.

    Look at Lost Planet with AA: 22FPS versus 40-50FPS is a huge difference in playability. On the 8800GTX/ULTRA you can play even at 1920x1200 (30FPS), while with the 2900XT even 1280x1024 gives you problems.
  • KeithTalent - Thursday, July 5, 2007 - link

    Well I have two of them and they work more than fine on every single game I have tried so far, including demos, betas, and a couple of older games (using Catalyst 7.6).

    Lost Planet is a POS port anyway, but when I ran the test benchmark in DX9 with Crossfired 2900XTs I had framerates well above 40 with everything maxed at 1920x1200, so I am somewhat confused by the numbers here. I will have to wait until I am home to see my exact numbers, but they were much higher than what was presented here. Maybe there is something wonky with the beta drivers?

    I'll post back tonight once I have verified my numbers.

    KT

  • DerekWilson - Thursday, July 5, 2007 - link

    We didn't use the demo benchmark, and the release version (afaik) does not include an updated version of the benchmark either.

    For the Lost Planet test, we had to use FRAPS running through the snow. This will absolutely give lower performance, as the built-in benchmark also runs through a couple of indoor scenes with higher framerates.

    I mentioned FRAPS on the test page, but I'll add to the Lost Planet section the method we used for testing.
  • defter - Thursday, July 5, 2007 - link

    Do Intel CPUs currently outperform AMD or not? After all, a $200 AMD CPU is about as fast as a $200 Intel CPU...

    It's natural that slower parts have a good price/performance ratio compared to the competition, since otherwise nobody would buy them. However, this has nothing to do with which one is fastest...

  • KeithTalent - Thursday, July 5, 2007 - link

    Not sure what you are getting at, I was responding to this ridiculous statement:

    quote:

    It is not even worth considering ATI for purchase.


    Which is completely untrue, because price can be a big consideration.

    With respect to CPUs, if you spend an extra $50 - $100 for the better Intel processor, you are getting substantially better performance (I know this from experience), while if you spend $150 more for a GTX, you are getting only marginally better performance.

    KT
