Final Words

For now, AMD does seem to have an advantage in Call of Juarez, while NVIDIA leads the way in Company of Heroes and Lost Planet. But as far as NVIDIA vs. AMD in DirectX 10 performance goes, we don't want to call a winner right now. It's just too early, and there are many different factors behind what we are seeing here. Once the dust settles and everyone gets fully optimized DirectX 10 drivers out the door alongside a wider variety of games, we'll be happy to take a second look.

The more important fact to realize is that DirectX 10 is finally here. While developers are used to programmable hardware after years with DirectX 9, there is still room for experimentation and learning with geometry shaders, more flexibility, lower state change and object overhead, and (especially) faster hardware. But DirectX 10 isn't an instant pass to huge performance and incredible effects.

Let's look at it like this: there are really three ways a game can come to support DirectX 10, and almost all games over the next few years will ship with a DX9 path as well. The easiest thing is to do a straight port of features from DirectX 9 (which should generally be slightly faster than the DirectX 9 counterpart if drivers are of equal quality). We could also see games offer a DirectX 10 version with enhanced features that could still be implemented in DX9 in order to offer an incentive for users to move to a DX10 capable platform. The most aggressive option is to implement a game focused around effects that can only be effectively achieved through DirectX 10.

Games which could absolutely only be done in DX10 won't hit for quite a while for a number of reasons. The majority of users will still be on DX9 platforms. It is logical to spend the most effort developing for the user base that will actually be paying for the games. Developers are certainly interested in taking advantage of DX10, but all games for the next couple of years will definitely have a DX9 path. It doesn't make sense to rewrite everything from the ground up if you don't have to.

We are also hearing that some of the exclusive DX10 features that could enable unique and amazing effects DX9 isn't capable of just don't perform well enough on current hardware. Geometry shader heavy code, especially involving geometry amplification, does not perform equally well on all available platforms (and we're looking at doing some synthetic tests to help demonstrate this). The performance of some DX10 features is lacking to the point where developers are limited in how intensely they can use these new features.

Developers (usually) won't write code that will work fine on one platform and not at all on another. The decisions on how to implement a game are in the hands of the developers, and that's where gamers rightly look when performance is bad or hardware and feature support is not complete. Building a consistent experience for all gamers is important. It won't be until most users have hardware that can handle all the bells and whistles well that we'll see games start to really push the limits of DX10 and reach beyond what DX9 can do.

In conversations with developers we've had thus far, we get the impression that straight ports of DX9 to DX10 won't be the norm either. After all, why would a developer want to spend extra time and effort developing, testing and debugging multiple code paths that do exactly the same thing? This fact, combined with the lack of performance in key DX10 features on current hardware, means it's very likely that the majority of DX10 titles coming out in the near term will only be slightly enhanced versions of what could have been done through DX9.

Both NVIDIA and AMD were very upset over how little we thought of their DX10 class mainstream hardware. They both argued that graphics cards are no longer just about 3D, and that additional video decode hardware and DX10 support add a lot of value over the previous generation. We certainly don't see it this way. Yes, we can't expect last year's high-end performance to trickle down to the low-end segment, but we should at least demand that this generation's $150 part always outperform last generation's.

This is especially important in a generation that defines the baseline of support for a new API. The 2400 and 8400 cards will always be the lowest common denominator in DX10 hardware (until Intel builds a DX10 part, though most developers will likely ignore that unless Intel manages to pull a rabbit out of its hat). We can reasonably expect that people who want to play games will opt for at least an 8600 or a 2600 series card. Going forward, developers will have to take that into account, and for the next couple of years key features of games won't be able to demand more horsepower than these cards provide.

AMD and NVIDIA had the chance to define the minimum performance of a DX10 class part higher than what we can expect from cards that barely get by with DX9 code. Because they chose to design this hardware without a significant, consistent performance advantage over the X1600 and 7600 class of parts, developers have even less incentive (not to mention ability) to push next generation features only possible with DX10 into their games. These cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9.

Even our high-end hardware struggled to keep up in some cases, and the highest resolution we tested was 2.3 megapixels. Pushing the resolution up to 4 MP (with 30" display resolutions of 2560x1600) brings all of our cards to their knees. In short, we really need to see faster hardware before developers can start doing more impressive things with DirectX 10.
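To put those resolution figures in perspective, here's a quick arithmetic sketch of the pixel counts involved. Note one assumption: the article's "2.3 megapixels" is taken here to correspond to 1920x1200, which works out to roughly that count.

```python
# Rough pixel-count arithmetic for the resolutions discussed above.
# Assumption: the 2.3 MP test resolution is 1920x1200 (WUXGA); the
# article only states the megapixel figure, not the exact mode.
def megapixels(width, height):
    return width * height / 1_000_000

tested = megapixels(1920, 1200)  # highest resolution tested in the article
display_30in = megapixels(2560, 1600)  # native resolution of a 30" panel

print(f"1920x1200 = {tested:.1f} MP")        # ~2.3 MP
print(f"2560x1600 = {display_30in:.1f} MP")  # ~4.1 MP
# Every pixel must be shaded each frame, so fill/shader work scales
# roughly with the pixel count:
print(f"Pixel workload increase: {display_30in / tested:.2f}x")
```

In other words, moving from the tested resolution to a 30" display nearly doubles the per-frame pixel workload, which is why cards already struggling at 2.3 MP get brought to their knees at 2560x1600.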

DirectX 9 vs. DirectX 10
59 Comments


  • misaki - Thursday, July 5, 2007 - link

    So Nvidia and AMD actually complained about their "mainstream" parts being below par?

    It sounds to me like they are seriously out of touch. Media center PCs will get their 8400 and 2400 cards for h264 acceleration. Gamers with lots of money will buy those $400+ cards as usual. But your average gamer in the $200 market is stuck with junk that is unplayable for dx10 games and performs like previous gen hardware that just barely makes the grade for current games already on the market.

    What is so hard to understand?
  • MadBoris - Thursday, July 5, 2007 - link

    I'm glad that some of the things said in this article are finally being said. DX10 reminds me of PhysX: a great concept that can't really succeed because of certain hurdles that keep the technology from actually taking off. Unfortunately, DX10 also isn't going to be what we all hoped it would be in real performance, and video drivers are not the only reason.

    There is something really counterintuitive about quality native DX10 rendering support.
    There is very little incentive for a developer to produce a good DX10 renderer when DX9 works fine on Vista and much of their real effort is going toward console support for a bigger customer base. With only so many hours in a day, console support is much more lucrative, since it gives new people access to buying a title that will also run just fine with only DX9 on a Vista platform, never really needing or benefiting from DX10.

    The costs of making DX10 games outweigh the benefits, and the benefits aren't that palpable in the first place. Furthermore, I suspect actual DX10 performance will rarely be all that positive (something it's too early to prove), but rather negative, even though true apples to apples DX9 to DX10 comparisons can't really be made.

    As to the low end parts, it's just marketing; there's no "real" DX10 value at $100-$150 for gaming. If people want a good video value purchase for games, $200 is where good quality/price starts, and it's mostly in the previous generation, not some value card for $150. Whether we agree with that price point being 'worth it' is another matter altogether and is purely personal preference.

    I don't know why Nvidia or ATI even introduced low end DX10 compatible hardware when customers will only get angry at developers or video mfr's for the blunder of underpowered hardware for high end game titles. This low end DX10 hardware mystified me: it was either going to slow down DX10 progress or have to be ignored. It seems obvious that all high end games will pretty much have to ignore the low end parts to achieve acceptable framerates with DX10 in new eye candy titles. They should have left DX10 out of the low end entirely, but they had to include it because of competition between AMD/Nvidia; neither wanted to leave the other with the marketing advantage of saying, look, we have DX10.

    DX10, with its lofty goals of rendering more in a scene and producing even greater quality eye candy, is simply at odds with low price. Higher quality rendering will always be at odds with low price; low price is never going to give you good performance and quality. People should really start being realistic about what they are paying for at a $100-$150 price point; it's a consumer expectations problem. Low end hardware will work fine for games like the Sims, and those games will target low end hardware, but not high end games at higher resolutions and decent frame rates.

    In the end, with future goggles on, I think the picture is becoming quite clear that DX10 will become one of those DX APIs that gets mostly skipped (if a decent DX successor becomes available in the next few years). The only time it will really make sense to go beyond native DX9 support is when Vista saturates >85% of the gaming market share. In the several years that will take, DX11 or higher should be available and superior to DX10, so in hindsight DX10 will really end up being just a marketing ploy to upgrade to Vista, little more.

    Glad the mask is starting to come off and more people are able to see clearly that making any purchases around DX10, whether a GPU or an OS, is silly and bound to cause frustration.
  • strafejumper - Thursday, July 5, 2007 - link

    i upgraded recently - but ended up only spending under $300 for a core2duo system
    this is why - people were saying get a DX10 card - future proof
    i decided to keep my old agp card because i felt real dx10 games and real dx10 hardware were not here yet.

    i'm happy i made what i feel was the right choice and didn't spend money on new psu, sata drives etc. so i could have a $$$ dx10 card only to play call of juarez at 14 fps.
  • stromgald - Thursday, July 5, 2007 - link

    I have to agree. I was considering trying to get an 8600 for my SFF PC, but after looking at this, I'm probably going to hold off until the next generation. HTPC and SFF PCs just can't handle the heat an 8800 series generates, and I want at least playable DX10 performance.
  • jay401 - Thursday, July 5, 2007 - link

    quote:

    Both NVIDIA and AMD were very upset over how little we thought of their DX10 class mainstream hardware. They both argued that graphics cards are no longer just about 3D, and that additional video decode hardware and DX10 support add a lot of value over the previous generation. We certainly don't see it this way. Yes, we can't expect last year's high-end performance to trickle down to the low-end segment, but we should at least demand that this generation's $150 part always outperform last generation's.


    Seriously, F them. It's pathetic they're trying to pawn off half-assed hardware as "mid-range enthusiast" parts when they can't even perform as well as the mid-range from the previous generation. Jerks.
  • jay401 - Thursday, July 5, 2007 - link

    Another new article showing how DX10 Vista performance propaganda is garbage.

    Gotta love people who try to act superior b/c they bought Vista for gaming when all it does is suck up more system resources and uses DX10, the combination of which will easily inhibit performance on equivalent hardware.
  • BigLan - Thursday, July 5, 2007 - link

    "They both argued that graphics cards are no longer just about 3D, and additional video decode hardware and DX10 support add a lot of value above the previous generation."

    Yeah, I don't buy into this either. I've pretty much given up on 'video decode,' be it avivo or purevideo. You end up stuck using a specific product, rather than ati or nvidia opening the features for any developer to access. Right now it's only useful with the latest windvd, powerdvd or nero, but you have to hope your driver version is the right one, and it doesn't (and probably never will) work for x264 or xvid content.

    Purevideo is horribly branded by nvidia - is it card features that everyone has access to, or do you have to buy it from them? And has ati actually released their avivo video converter to the public? Could I use it to compress some of my recorded tv shows from mpeg to xvid?

    Maybe this is like the mpeg2 decoder situation was in 98/99, in which case we should just wait for cpu speeds to increase to the point where we don't need video decode acceleration at all.
  • titan7 - Thursday, July 12, 2007 - link

    I agree. How much quicker would things be if all the video transistors were spent on more shader processors? I want my video card for video games. I want my dvd player for movies.
  • vailr - Thursday, July 5, 2007 - link

    Comparing mid-range DX10 cards:
    (lowest prices found via froogle.com; shipping not included)
    Radeon 2600XT 256Mb ~$145
    http://www.ewiz.com/detail.php?p=AT-2600XT&c=f...
    nVidia 8600GT 256Mb ~$100 (after $15 MIR)
    http://www.newegg.com/Product/Product.asp?Item=N82...
    How is the 2600XT worth the added $45 v. the 8600GT?
  • Comdrpopnfresh - Thursday, July 5, 2007 - link

    I have a 7600gt oc'd to 635/800. I get 30+fps with better than default settings, with AAx2 set @ 1024x768. Why do these cards not seem to do much better (given they are @ 1280x1024)?
