DirectX 10.1 on an NVIDIA GPU?

Easily the most interesting thing about the GT 220 and G 210 is that they mark the introduction of DirectX 10.1 functionality on an NVIDIA GPU. It’s no secret that NVIDIA does not take a particular interest in DX10.1, and in fact even with this they still don’t. But for these new low-end parts, NVIDIA had some special problems: OEMs.

OEMs like spec sheets. They want parts that conform to certain features so that they can in turn use those features to sell the product to consumers. OEMs don’t want to sell a product with “only” DX10.0 support if their rivals are using DX10.1 parts. Which in turn means that at some point NVIDIA would need to add DX10.1 functionality, or risk losing out on lucrative OEM contracts.

This is compounded by the fact that while Fermi has bypassed DX10.1 entirely for the high-end, Fermi’s low-end offspring are still some time away. Meanwhile AMD will be shipping their low-end DX11 parts in the first half of next year.

So why do GT 220 and G 210 have DX10.1 functionality? To satisfy the OEMs, and that’s about it. NVIDIA’s focus is still on DX10 and DX11. DX10.1 functionality was easy to add to the GT200-derived architecture (bear in mind that GT200 already had some DX10.1 functionality), and so it was done for the OEMs. We would add that NVIDIA has also mentioned a desire not to be dinged by reviewers and forum-goers for lacking this feature, but we’re having a hard time buying the idea that NVIDIA cares about either of those groups nearly as much as it cares about what the OEMs think when it comes to this class of parts.

DX10.1 in a nutshell, as seen in our Radeon 3870 Review

At any rate, while we don’t normally benchmark with DX10.1 functionality enabled, we did so today to make sure DX10.1 was working as it should be. Below are our Battleforge results, using DX10 and DX10.1 with Very High SSAO enabled.

The ultimate proof that DX10.1 is a checkbox feature here is performance. Certainly DX10.1 is a faster way to implement certain effects, but running them in the first place still comes at a significant performance penalty. Hardware of this class is simply too slow to make meaningful use of the DX10.1 content that’s out there at this point.



Comments

  • Guspaz - Tuesday, October 13, 2009 - link

    Errm, Valve's latest hardware survey shows that only 2.39% of gamers are using 2+ GPUs with SLI or Crossfire. ATI has a 27.26% marketshare.

    Of those who did buy multi-GPU solutions, some may be "hidden" (GTX295, the various X2 solutions), in which case it had no impact whatsoever (since it's presented as a single card). Some may have used it as an upgrade to an existing card, in which case SLI/Crossfire may not have driven their decision.

    It's true that SLI (2.14%) has greatly outsold Crossfire (0.25%), but that's such a tiny market segment that it doesn't amount to much.

    ATI has managed to hold on to a respectable market share. In fact, their 4800 series cards are more popular than every single nVidia series except for the 8800 series.

    So, I think I've sufficiently proven that SLI wasn't a knockout blow... It was barely a tickle to the market at large.
  • Seramics - Tuesday, October 13, 2009 - link

    When SLI came out? Stop mentioning ancient news. Right now, SLI and Crossfire are about equally bad. Heard of Hydra? That's the cool stuff, dude. And yeah, NVIDIA is very innovative indeed: renaming old products to look new to deceive customers, shaving half the spec off a product and keeping the same name (9600GSO), releasing crappy products and selling them overpriced... MAN! That's really innovative, don't you think?
  • Souleet - Tuesday, October 13, 2009 - link

    Are you ignorant or something, ATI fanboy? The GT220 is 40nm and the 9600GSO is 65nm. How can you say they just changed the name? I thought so...
  • Seramics - Monday, October 12, 2009 - link

    Let's face it: NVIDIA is NOT competitive on every front at every single price point. From ultra low end to midrange to ultra high end, tell me, at which price point is NVIDIA competitive?
    Well, of course I believe Fermi will be something different. I truly believe so. In fact, given the HD 5870's slightly below-par performance for its specs (very likely because it's memory-bandwidth limited), and Fermi being on a much larger die with a higher transistor count, I EXPECT NVIDIA's next-gen Fermi to easily outperform the HD 5870. Just like the GTX 285 outperforms the HD 4890. But by how much? Almost 100 USD more for just 5-10% improvement? I believe this will likely be the case with Fermi vs. the 5870. Surely it's faster, but you may be paying 100% more to get 25% more fps.

    CONCLUSION: Even if NVIDIA retakes the top single-GPU performance crown, they were never the winner in price-to-performance at ANY price point. They care more about profits than they care about you.
  • Souleet - Monday, October 12, 2009 - link

    I agree with your conclusion. On price, ATI has definitely always been on top of their game, but NVIDIA's innovations are what set the two apart. But who knows, maybe one day ATI/AMD will come out with a CPU/GPU solution that changes the technology industry. That would be cool.
  • formulav8 - Monday, October 12, 2009 - link

    Remember what happened to the ATI 9700/9800 series? We all know what happened after that. :)

    NVidia brought out the FX5800 Ultra??
  • TRIDIVDU - Tuesday, September 21, 2010 - link

    My son plays GTA, FIFA, POP, Tomb Raider, NFS, etc. on my P4 3.06 GHz WinXP machine with an MSI N 9400 GT 1GB card without any problem, on a 19-inch LCD monitor. Now that I am planning to exchange the 4-year-old machine for a new i5 650, 3.2 GHz, Win7 machine fitted with a GT220 1GB card, please tell me whether he will find the new machine better for playing games.
  • Thatguy97 - Tuesday, June 30, 2015 - link

    NVIDIA's midrange was shit back then
