Quick Look: MSI’s GeForce 210

by Ryan Smith on February 16, 2010 12:00 AM EST

Since NVIDIA did not send out review samples for the G(T) 200 series last year, we’ve been slowly building up our database of results as cards have arrived from the vendors themselves. We saw the GT 220 late last year, and last month we finally saw the GT 240. Thanks to MSI, today we’ll be looking at the final card in the G(T) 200 series: the GeForce 210. The specific card in question is MSI’s N210-MD512H.


                              | 9600GT | GT 220 | 9500GT | 9400GT | G 210
Stream Processors             | 64 | 48 | 32 | 16 | 16
Texture Address / Filtering   | 32 / 32 | 16 / 16 | 16 / 16 | 8 / 8 | 8 / 8
ROPs                          | 16 | 8 | 8 | 4 | 4
Core Clock                    | 650MHz | 625MHz | 550MHz | 550MHz | 589MHz
Shader Clock                  | 1625MHz | 1360MHz | 1400MHz | 1400MHz | 1402MHz
Memory Clock                  | 900MHz (1800MHz effective) GDDR3 | 900MHz (1800MHz effective) GDDR3 | 400MHz (800MHz effective) DDR2 | 400MHz (800MHz effective) DDR2 | 500MHz (1000MHz effective) DDR2
Memory Bus Width              | 256-bit | 128-bit | 128-bit | 128-bit | 64-bit
Frame Buffer                  | 512MB | 512MB | 512MB | 512MB | 512MB
Transistor Count              | 505M | 486M | 314M | 314M | 260M
Manufacturing Process         | TSMC 55nm | TSMC 40nm | TSMC 55nm | TSMC 55nm | TSMC 40nm
Price Point                   | $69-$85 | $69-$79 | $45-$60 | $40-$60 | $30-$50

Like the GT 220, the G210 started life over the summer as an OEM-only card, earning its wings for a public launch in October alongside the GT 220. The G210 is based on GT218, the smallest member of NVIDIA’s DirectX 10.1 GPU lineup, built out of 260M transistors and measuring a mere 57mm² on TSMC’s 40nm process. As a product it replaces NVIDIA’s similar 9400GT, while in terms of pricing it replaces the slower 8400 GS.

Coupled with the GT218 GPU on the G210 is 512MB of DDR2 RAM on the customary 64-bit memory bus. Interestingly, unlike most other entry-level products, the G210 only comes in one memory configuration: 512MB. Even more interesting are the DDR2 chips on our MSI card: four chips we can’t identify, bearing the ATI logo.

Since the G210 is intended to replace the 9400GT, the GPU configuration is basically the same, composed of 16 SPs, 8 TMUs, and 4 ROPs. The clocks are 589MHz for the core, 1402MHz for the shaders, and 500MHz (1000MHz effective) for the RAM. Compared to the GT 220, this gives the G210 around 30% of the shader power, 50% of the texture and ROP power, and 27% of the memory bandwidth. All of this consumes 30W under load, and less than 7W when idling.
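
For those who want to double-check those figures, the ratios fall straight out of the specifications in the table above. Below is a minimal Python sketch of that back-of-the-envelope math (our own illustration with made-up variable names, not anything from NVIDIA); it treats shader, texture, and ROP throughput as unit count times clock speed, and memory bandwidth as bus width times effective data rate.

# Back-of-the-envelope throughput comparison of the G210 against the GT 220,
# using only the specifications from the table above. This is a rough sanity
# check of the ratios quoted in the text, not a benchmark.

g210 = {"sps": 16, "tmus": 8, "rops": 4,
        "core_mhz": 589, "shader_mhz": 1402,
        "mem_effective_mhz": 1000, "bus_bits": 64}

gt220 = {"sps": 48, "tmus": 16, "rops": 8,
         "core_mhz": 625, "shader_mhz": 1360,
         "mem_effective_mhz": 1800, "bus_bits": 128}

def shader(c):     # proportional to SPs x shader clock
    return c["sps"] * c["shader_mhz"]

def texture(c):    # proportional to TMUs x core clock
    return c["tmus"] * c["core_mhz"]

def rop(c):        # proportional to ROPs x core clock
    return c["rops"] * c["core_mhz"]

def bandwidth(c):  # bus width in bytes x effective transfer rate, in GB/s
    return (c["bus_bits"] / 8) * c["mem_effective_mhz"] / 1000

for name, fn in [("shader", shader), ("texture", texture),
                 ("ROP", rop), ("memory bandwidth", bandwidth)]:
    print(f"G210 vs. GT 220 {name}: ~{fn(g210) / fn(gt220):.0%}")

# Prints roughly 34%, 47%, 47%, and 28%, in line with the "around 30%",
# "50%", and "27%" figures quoted above.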

The G210 is a low-profile card, the only reference design in NVIDIA’s entire 200-series lineup to be so. NVIDIA’s reference design (not that anyone really uses it) is a single-slot card with a matching single-slot-width active cooler. MSI has forgone that here in favor of a passive design, using a single double-wide heatsink. Most G210 cards do use some form of active cooler, so MSI’s N210 is fairly unusual in that regard.

Since the G210 is a low-profile card, the port configuration is as expected: one DVI port, one HDMI or DisplayPort, and a detachable VGA port. In the case of MSI’s card, HDMI serves as the second digital port, while the VGA port can be mounted on the full-profile bracket or on a second bracket in a low-profile case.

Finally, in terms of pricing, the G210 is the ultimate budget card. With the MSI card going for only $30 after rebate, it’s for all practical purposes as cheap as a video card can get. Even the DDR2 version of the Radeon HD 5450 we reviewed last week goes for nearly 50% more. In terms of pricing this makes its competitors the GeForce 8400GS and the DDR2-based Radeon HD 4350, the latter of which is also its closest competitor in terms of features. It goes without saying that this is the ultimate entry-level card, and it performs accordingly.

24 Comments

  • mindless1 - Thursday, February 18, 2010 - link

    Can you just aim for lowest cost possible with integrated video? Of course, but as always some features come at additional cost and in this case it's motherboard features you stand to gain.

    The advantage is being able to use whatever motherboard you want to, instead of being limited by the feature sets of those with integrated video, boards that are typically economized low-end offerings.
  • JarredWalton - Tuesday, February 16, 2010 - link

    If you read some of my laptop reviews (http://www.anandtech.com/mobile/showdoc.aspx?i=373...), you'll see that even a slow GPU like a G210M is around 60% faster than GeForce 9400M, which is a huge step up from GMA 4500MHD (same as X4500, except for lower clocks). HD 3200/4200 are also slower than 9400M, so really we're looking at $30 to get full HD video support (outside of 1080i deinterlacing)... plus Flash HD acceleration that works without problems in my experience. Yes, you'll be stuck at 1366x768 and low detail settings in many games, but at least you can run the games... something you can't guarantee on most IGP solutions.
  • LtGoonRush - Wednesday, February 17, 2010 - link

    We're talking about desktop cards though, and as the tests above show this card simply isn't fast enough for HTPC or gaming usage. The point I'm making is that if the IGP doesn't meet your needs, then it's extremely doubtful that a low-end card will. There's a performance cliff below the R5670/GT 240 GDDR5, and it makes more sense for the performance of IGP solutions to grow up into that space than it does to buy an incredibly cut down discrete GPU. Not that I'm expecting competitive gaming performance from an IGP of course, but R5450-level results (in terms of HTPC and overall functionality) are achievable.
  • Ryan Smith - Wednesday, February 17, 2010 - link

    Even though the G210 isn't perfect for HTPC use, it's still better than IGPs. I don't think any IGP can score 100 on the HD HQV v1 test - the G210 (and all other discrete GPUs) can.
  • shiggz - Wednesday, February 17, 2010 - link

    Recently bought a 220 for $55 to replace an old 3470 for my HTPC. MPC-HC uses DXVA for decoding but shaders for picture cleanup. "Sharpen complex 2" greatly improves video appearance on 1080p x264 video but was too much for the 3470. My new 220 handles it with about 30-70% GPU usage depending on the scene, according to EVGA Precision. The 220 was the perfect sweet spot of performance/power/cost. My HTPC and 42" TV used to pour out heat in the summer, making TV viewing unpleasant.

    Had a 65nm dual-core and an 8800GT, so a 200W loaded HTPC, plus 2x 150W light bulbs and 200W for the LCD TV. That was a constant 700W. Switched the two 150W bulbs to new lower-power 30W ones and cut the HTPC to a 45nm CPU and 40nm GPU, making the HTPC 100W, leaving me 160W plus 200W from the TV for 360W, a big improvement over 700W. Next year I'll get an LCD backlight and I should be able to get all of that down to about 150-200W for a total setup that 3 years ago was 700W.
  • shiggz - Thursday, February 18, 2010 - link

    Meant to say "LED Backlit LCD," as their power consumption is impressive.
  • milli - Tuesday, February 16, 2010 - link

    Don't you think it's about time you stopped using FurMark for power consumption testing? Both NVIDIA and ATI products throttle when they detect something like FurMark. The problem is that you don't know how much each product throttles, so the results become inconsistent.
    Measuring the average power consumption during actual gaming seems more realistic to me (not just the peak). 3DMark06's Canyon Flight is a good candidate. It generates higher power usage than most games but still doesn't cause an unrealistically high load on the shader domain like FurMark does.
  • 7Enigma - Wednesday, February 17, 2010 - link

    I completely agree with the post above. The load power consumption chart is meaningless in this context. Feel free to keep it in if you want, but please add in a more "real-world" test, be it a demanding game or something like futuremark which, while not quite "real-world", is definitely going to get the companies to run at their peak (after all they are trying for the highest score in that instance, not trying to limit power consumption).

    When I read something like, "On paper the G210 should be slightly more power hungry and slightly warmer than the Radeon HD 5450 we tested last week. In practice it does better than that here, although we’ll note that some of this likely comes down to throttling NVIDIA’s drivers do when they see FurMark and OCCT", I immediately ask myself why did you choose to report this data if you know it's possibly not accurate?

    What I really think a site as respected and large as Anandtech needs is proprietary benchmarks that NEVER get released to the general public nor the companies that make these parts. In a similar fashion to your new way of testing server equipment with the custom benchmark, I'd love to see you guys come up with a rigorous video test that ATI/NVIDIA can't "tailor" their products towards by recognizing a program and adjusting settings to gain every possible edge.

    Please consider it Anand and company, I'm sure we would all be greatly appreciative!
  • iamezza - Friday, February 19, 2010 - link

    +1
  • GiantPandaMan - Tuesday, February 16, 2010 - link

    So is it pretty much set that Global Foundries will be the next major builder of AMD and NVIDIA GPUs? Even when AMD owned GF outright they never used them for GPUs, right? What's changed other than TSMC under-delivering on 40nm for a half year?
