There are some things you just don’t talk about in polite company: politics, money, and apparently OEM-only GPUs. Back in July NVIDIA launched their first 40nm GPUs, which were also their first GPUs featuring DX10.1 support: the GeForce GT 220 and G 210. And if you blinked, you probably missed them. As OEM-only parts, they went into the OEM channel without any fanfare or pageantry.

Today that changes. NVIDIA is moving the GT 220 and G 210 from OEM-only sales to retail, which means NVIDIA’s retail vendors can finally get in on the act and begin selling cards. Today we are looking at one of the first of those cards, the Palit GT 220 Sonic Edition.

| Form Factor | 9600GT | 9600GSO | GT 220 (GDDR3) | 9500GT | G 210 (DDR2) |
|---|---|---|---|---|---|
| Stream Processors | 64 | 48 | 48 | 32 | 16 |
| Texture Address / Filtering | 32 / 32 | 24 / 24 | 16 / 16 | 16 / 16 | 16 / 16 |
| ROPs | 16 | 16 | 8 | 8 | 8 |
| Core Clock | 650MHz | 600MHz | 625MHz | 550MHz | 675MHz |
| Shader Clock | 1625MHz | 1500MHz | 1360MHz | 1400MHz | 1450MHz |
| Memory Clock | 900MHz | 900MHz | 900MHz | 400MHz | 400MHz |
| Memory Bus Width | 256-bit | 128-bit | 128-bit | 128-bit | 64-bit |
| Frame Buffer | 512MB | 512MB | 512MB | 512MB | 512MB |
| Transistor Count | 505M | 505M | 486M | 314M | 260M |
| Manufacturing Process | TSMC 55nm | TSMC 55nm | TSMC 40nm | TSMC 55nm | TSMC 40nm |
| Price Point | $69-$85 | $40-$60 | $69-$79 | $45-$60 | $40-$50 |



GT 220 and G 210 are based on the GT216 and GT218 cores respectively (anyone confused yet?), which are the first and so far only 40nm members of NVIDIA’s GT200 family. These are specifically designed as low-end cards, with 48 SPs on the GT 220 and 16 SPs on the G 210. The GT 220 is designed to sit between the 9500GT and 9600GT in performance, making its closest competitor the 48-SP 9600GSO. Meanwhile the G 210 is the replacement for the 9400GT.

We will see multiple configurations for each card, as NVIDIA handles low-end parts somewhat more loosely than high-end parts. The GT 220 will come with DDR2, DDR3, or GDDR3 memory on a 128-bit bus, in 512MB or 1GB capacities. So the memory bandwidth of these cards is going to vary wildly; the DDR2 cards will run at 400MHz, while the GDDR3 cards may go as high as 1012MHz according to NVIDIA’s specs. The G 210 meanwhile will be DDR2 and DDR3 only, again going up to 1GB. With a 64-bit memory bus, it will have half the memory bandwidth of the GT 220.
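
As a back-of-the-envelope illustration (ours, not the article's), the spread between those configurations can be quantified with the standard peak-bandwidth formula: bus width in bytes times the effective transfer rate, where DDR2/DDR3/GDDR3 all transfer twice per clock.

```python
# Peak memory bandwidth for the GT 220/G 210 configurations described above.
# Bus widths and clocks are the article's figures; the formula is the
# standard (bus bytes) x (clock x 2 transfers/clock) for DDR-type memory.

def bandwidth_gbs(bus_width_bits: int, clock_mhz: float, pumps: int = 2) -> float:
    """Peak memory bandwidth in GB/s; pumps=2 for double-data-rate memory."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = clock_mhz * 1e6 * pumps
    return bytes_per_transfer * transfers_per_sec / 1e9

# GT 220 with 400MHz DDR2 on a 128-bit bus:
ddr2 = bandwidth_gbs(128, 400)    # 12.8 GB/s
# GT 220 with 1012MHz GDDR3 on the same 128-bit bus:
gddr3 = bandwidth_gbs(128, 1012)  # ~32.4 GB/s
# G 210 with 400MHz DDR2 on its 64-bit bus:
g210 = bandwidth_gbs(64, 400)     # 6.4 GB/s
```

The slowest GT 220 configuration has well under half the bandwidth of the fastest, which is why performance between otherwise identical cards will vary so much.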

The GPU configurations will vary too, but not as wildly. NVIDIA’s official specs call for a 625MHz core clock and a 1360MHz shader clock. We’ve seen a number of cards with a higher core clock, and a card with a lower shader clock. Based on the card we have, we’re going to consider 635MHz core/1360MHz shader stock for the GPU, and 900MHz stock for GDDR3, and test accordingly.

The transistor count for the GT218 die comes out to 260M, and for GT216 it’s 486M. NVIDIA would not disclose the die size to us, but having disassembled our GT 220, we estimate it to be around 100mm². One thing we do know for sure is that along with the small die, NVIDIA has managed to knock down power consumption for the GT 220: at load the card will draw 58W, while at idle it’s a tiny 7W. We don’t have the power data for the G 210, but it’s undoubtedly lower.

The prices on GT 220 cards are expected to range between $69 and $79, with the cards at the top end being those with the best RAM. This puts the GT 220 in competition with AMD’s Radeon HD 4600 series and NVIDIA’s own 9600GT. The G 210 will have an approximate price of $45, putting it in range of the Radeon HD 4300/4500 series and NVIDIA’s 9500GT. The GT 220 in particular is in an odd spot: it’s supposed to underperform the equally priced (if not slightly cheaper) 9600GT. Meanwhile the G 210 is supposed to underperform the 9500GT, which is also available for $45. So NVIDIA is already starting off on the wrong foot here.

Finally, availability should not be an issue. These cards have been shipping for months to OEMs, so the only thing that has really changed is that now some of them are going into the retail pool. It’s a hard launch and then some. Not that you’ll see NVIDIA celebrating; while the OEM-only launch was no-key, this launch is only low-key at best. NVIDIA didn’t send out any samples, and it wasn’t until a few days ago that we had the full technical data on these new cards.

We would like to thank Palit for supplying us with their GT 220 Sonic Edition for today’s launch.

DirectX 10.1 on an NVIDIA GPU?
Comments (80)

  • Guspaz - Tuesday, October 13, 2009 - link

Errm, Valve's latest hardware survey shows that only 2.39% of gamers are using 2+ GPUs with SLI or Crossfire. ATI has a 27.26% market share.

    Of those who did buy multi-GPU solutions, some may be "hidden" (GTX295, the various X2 solutions), in which case it had no impact whatsoever (since it's presented as a single card). Some may have used it as an upgrade to an existing card, in which case SLI/Crossfire may not have driven their decision.

    It's true that SLI (2.14%) has greatly outsold Crossfire (0.25%), but that's such a tiny market segment that it doesn't amount to much.

    ATI has managed to hold on to a respectable market share. In fact, their 4800 series cards are more popular than every single nVidia series except for the 8800 series.

    So, I think I've sufficiently proven that SLI wasn't a knockout blow... It was barely a tickle to the market at large.
  • Seramics - Tuesday, October 13, 2009 - link

When SLI came out? Stop mentioning ancient news. Right now, SLI and Crossfire are about equally bad. Heard of Hydra? That's the cool stuff, dude. And yeah, NVIDIA is very innovative indeed: renaming old products to look new to deceive customers, shaving half the specs off a product and keeping the same name (9600GSO), releasing crappy products and selling them overpriced... MAN! That's really innovative, don't you think?
  • Souleet - Tuesday, October 13, 2009 - link

Are you ignorant or something, ATI fanboy? The GT220 is 40nm and the 9600GSO is 65nm. How can you say they just changed the name? I thought so...
  • Seramics - Monday, October 12, 2009 - link

Let's face it: NVIDIA is NOT competitive on every front at every single price point. From ultra low end to midrange to ultra high end, tell me, at which price point is NVIDIA competitive?
    Well, of course I believe Fermi will be something different. I truly believe so. In fact, given the HD 5870's slightly below-par performance for its specs (very likely because it's memory-bandwidth limited), and Fermi being a much larger die with a higher transistor count, I EXPECT NVIDIA's next-gen Fermi to easily outperform the HD 5870, just like the GTX 285 outperforms the HD 4890. But by how much? Almost $100 more for just 5-10% improvement? I believe that will likely be the case with Fermi vs. the 5870. Sure it's faster, but you may be paying 100% more to get 25% extra fps.

CONCLUSION: Even if NVIDIA retakes the top single-GPU performance crown, they were never a winner in price-to-performance at ANY price point. They care more about profits than they care about you.
  • Souleet - Monday, October 12, 2009 - link

I agree with your conclusion. On price, ATI has definitely always been on top of their game, but NVIDIA's innovation is what sets the two apart. But who knows, maybe one day ATI/AMD will come out with a CPU/GPU solution that will change the technology industry. That would be cool.
  • formulav8 - Monday, October 12, 2009 - link

quote: "Remember what happened to the ATI 9700/9800 series, we all know what happened after that. :)"

    NVidia brought out the FX5800 Ultra??
  • TRIDIVDU - Tuesday, September 21, 2010 - link

My son plays GTA, FIFA, POP, Tomb Raider, NFS, etc. on my P4 3.06GHz WinXP machine with an MSI N9400GT 1GB card without any problem, on a 19-inch LCD monitor. Now that I am planning to exchange the 4-year-old machine for a new i5 650 3.2GHz Win7 machine fitted with a GT 220 1GB card, please tell me whether he will find the new machine better for playing games.
  • Thatguy97 - Tuesday, June 30, 2015 - link

NVIDIA's midrange was shit back then
