Quick Look: MSI’s GeForce 210

by Ryan Smith on February 16, 2010 12:00 AM EST

MSI’s N210-MD512H is priced as the ultimate budget card, and it performs accordingly. It’s slower than next-tier cards by a significant amount, but it’s still fast enough that it can at least run every game in our test suite at some level, which is going to be better than what most IGP-based systems can do right now. It goes without saying that you can buy much faster cards even among the limited selection of the low-profile market, but any such card is going to cost a great deal more than $30. So much like the Radeon HD 5450 we took a look at last week, this is a card best suited for buyers moving up from an IGP while needing to do so on a very limited budget.

Looking at our data, we’re a bit surprised that NVIDIA didn’t make the reference G210 design a passively cooled card. Based on MSI’s use of a double-wide heatsink, the G210 is plenty suitable for passive operation, and likely even passive single-wide operation with some care. For a card like the G210, we can’t think of any good reason to use a cooler with a fan if there’s enough room in the computer for a card with a passive heatsink. To that end, MSI’s G210 looks to be one of the best G210 cards available, since it’s one of the few with a passive cooler.

We do have one disappointment though, and that’s for HTPC use. We’re less concerned about the audio limitations (let’s be fair, this card launched back in the summer of 2009) than we are about the results of our video tests. We weren’t expecting to get great quality out of an entry-level card, but we were expecting it to at least fall back gracefully on deinterlacing. The fact that it’s the only card that can’t at least do something smoothly on the Cheese Slices test is disheartening.

Finally, it’ll be interesting to see what NVIDIA does to replace the G210 late this year. It’s reasonable to assume that GF100 will cascade down into a part similar to the G210, which will be a definite benefit for NVIDIA since it means they can bring complete audio bitstreaming to a card at this price point. What we’re left wondering is how they’re going to do this: do they do a 40nm GF100 derivative, or do they push the envelope some and do a 28nm part?

Since G210 is already a 40nm part, a GF100 derivative is ultimately going to be bigger than the GT218 GPU, which makes it harder to offer a card at $30. A 28nm GPU would presumably let them pack in GF100/DX11 functionality without expanding the die, but a 28nm product this year is ultimately going to be dependent on how well Global Foundries’ 28nm process is coming along.

Of course we’ll first have to see how GF100 does when it finally launches next month…

Comments

  • mindless1 - Thursday, February 18, 2010 - link

    Can you just aim for the lowest possible cost with integrated video? Of course, but as always some features come at additional cost, and in this case it's motherboard features you stand to gain.

    The advantage is being able to use whatever motherboard you want, instead of being limited to the feature sets of those with integrated video - boards that are typically economized, low-end offerings.
  • JarredWalton - Tuesday, February 16, 2010 - link

    If you read some of my laptop reviews (http://www.anandtech.com/mobile/showdoc.aspx?i=373...), you'll see that even a slow GPU like a G210M is around 60% faster than the GeForce 9400M, which is a huge step up from GMA 4500MHD (same as X4500, except for lower clocks). HD 3200/4200 are also slower than the 9400M, so really we're looking at $30 to get full HD video support (outside of 1080i deinterlacing)... plus Flash HD acceleration that works without problems in my experience. Yes, you'll be stuck at 1366x768 and low detail settings in many games, but at least you can run the games... something you can't guarantee on most IGP solutions.
  • LtGoonRush - Wednesday, February 17, 2010 - link

    We're talking about desktop cards though, and as the tests above show this card simply isn't fast enough for HTPC or gaming usage. The point I'm making is that if the IGP doesn't meet your needs, then it's extremely doubtful that a low-end card will. There's a performance cliff below the R5670/GT 240 GDDR5, and it makes more sense for the performance of IGP solutions to grow up into that space than it does to buy an incredibly cut down discrete GPU. Not that I'm expecting competitive gaming performance from an IGP of course, but R5450-level results (in terms of HTPC and overall functionality) are achievable.
  • Ryan Smith - Wednesday, February 17, 2010 - link

    Even though the G210 isn't perfect for HTPC use, it's still better than IGPs. I don't think any IGP can score 100 on the HD HQV v1 test - the G210 (and all other discrete GPUs) can.
  • shiggz - Wednesday, February 17, 2010 - link

    Recently bought a 220 for $55 to replace an old 3470 in my HTPC. MPC-HC uses DXVA for decoding but shaders for picture cleanup. "Sharpen complex 2" greatly improves video appearance on 1080p x264 video but was too much for the 3470. My new 220 handles it with about 30-70% GPU usage depending on the scene, according to EVGA Precision. The 220 was the perfect sweet spot of performance/power/cost. My HTPC and 42" TV used to pour out heat in summer, leaving TV viewing unpleasant.

    Had a 65nm dual-core and an 8800 GT, so a 200W loaded HTPC, plus 2x 150W light bulbs and 200W for the LCD TV. That was a constant 700 watts. Switching the two bulbs to new lower-power 30W (150W-equivalent) ones and cutting the HTPC to a 45nm CPU and 40nm GPU brings the HTPC down to about 100W, leaving me 160W plus 200W from the TV for 360W total - a big improvement over 700W. Next year I'll get an LCD backlight and I should be able to get all of that down to about 150-200W for a total setup that 3 years ago was 700W.
  • shiggz - Thursday, February 18, 2010 - link

    Meant to say "LED-backlit LCD," as their power consumption is impressive.
  • milli - Tuesday, February 16, 2010 - link

    Don't you think it's about time you stopped using FurMark for power consumption testing? Both nVidia and Ati products throttle when they detect something like FurMark. The problem is that you don't know how much each product throttles, so the results become inconsistent.
    Measuring the average power consumption during actual gaming seems more realistic to me (not just the peak). 3DMark06's Canyon Flight is a good candidate: it generates higher power usage than most games but still doesn't cause an unrealistically high load on the shader domain like FurMark does.
  • 7Enigma - Wednesday, February 17, 2010 - link

    I completely agree with the post above. The load power consumption chart is meaningless in this context. Feel free to keep it in if you want, but please add a more "real-world" test, be it a demanding game or something like Futuremark which, while not quite "real-world", is definitely going to get the cards running at their peak (after all, the companies are trying for the highest score in that instance, not trying to limit power consumption).

    When I read something like, "On paper the G210 should be slightly more power hungry and slightly warmer than the Radeon HD 5450 we tested last week. In practice it does better than that here, although we’ll note that some of this likely comes down to throttling NVIDIA’s drivers do when they see FurMark and OCCT", I immediately ask myself: why did you choose to report this data if you know it's possibly not accurate?

    What I really think a site as respected and large as Anandtech needs is proprietary benchmarks that NEVER get released to the general public nor the companies that make these parts. In a similar fashion to your new way of testing server equipment with the custom benchmark, I'd love to see you guys come up with a rigorous video test that ATI/NVIDIA can't "tailor" their products towards by recognizing a program and adjusting settings to gain every possible edge.

    Please consider it Anand and company, I'm sure we would all be greatly appreciative!
  • iamezza - Friday, February 19, 2010 - link

    +1
  • GiantPandaMan - Tuesday, February 16, 2010 - link

    So is it pretty much set that Global Foundries will be the next major builder of AMD and nVidia GPUs? Even when AMD owned GF outright they never used them for GPUs, right? What's changed, other than TSMC under-delivering on 40nm for half a year?
