Quick Look: MSI’s GeForce 210

by Ryan Smith on 2/16/2010 12:00 AM EST
24 Comments


  • gumdrops - Thursday, February 18, 2010 - link

    Where can I find this card for $30? Froogle and Newegg both list this card at $40, which is only $2 cheaper than a 5450. In fact, the cheapest 210 of *any* brand is $38.99.

    With only a $2-$5 difference to the 5450, is it really value for money to go with this card?
  • Taft12 - Thursday, February 18, 2010 - link

    In a word: No.

    ncix.com has the BFG version of this card on sale for $29.99 CAD, but Ryan makes it pretty clear that MSI is the only OEM producing a G210 worth owning.
  • mindless1 - Thursday, February 18, 2010 - link

    If you're building a small form factor system, you have to be a bit more careful, because you may not have room for larger 92mm+ fans; at any given airflow rate, your smaller fans are already running at higher RPM.

    If you're building for low noise, your system will be quieter with lower intake and exhaust rates plus a very low RPM fan on the heatsink, rather than a passive heatsink.

    That way it will also accumulate less dust, and the fan helps cool other areas like the power regulation circuitry (MOSFETs). A single-height heatsink also makes a product more compatible, without the elaborate construction you'd need to maximize surface area if that single-height sink were passive.

    Don't fear or avoid fans, just avoid high(er) RPM fans. Low RPM fans are inaudible and last a long time, provided you don't pick a very low quality fan.
  • greenguy - Wednesday, February 24, 2010 - link

    You've got a good point there. That's why I went with the Megahalems and a PWM fan (as opposed to a Ninja), and Scythe Kama PWM fans on both intake and exhaust (at 400 RPM or so). I probably should have done the same with the graphics card, but didn't do the research. Do you have any pointers to specific cards or coolers?

    I might have to come up with a more localized fan or some ducting.
  • AnnonymousCoward - Wednesday, February 17, 2010 - link

    I just wanted to say: great article, and I love the table on Page 1. Without it, it's so hard to keep model numbers straight.
  • teko - Wednesday, February 17, 2010 - link

    Come on, does it really make sense to benchmark Crysis for this card? Choose something that the card's buyer will actually use/play!
  • killerclick - Wednesday, February 17, 2010 - link

    Once my discrete graphics card died on me on a Saturday afternoon, and since I didn't have a spare or an IGP, my computer was useless until Monday around noon. I'm going to get this card to keep as a spare. It has passive cooling, it's small, it's only $30, and I'm sure it'll perform better than any IGP even if I had one.
  • greenguy - Wednesday, February 17, 2010 - link

    I was quite amazed to see this review of the card I had just purchased two of. I wasn't sure at first, but I have since determined that you can run two 1920x1200 monitors from the one card (using the DVI port and the HDMI port). This is pretty cool: it doesn't force you to use the D-SUB port if you want multiple monitors, so you keep all that fine detailed resolutiony goodness.

    It looks very promising that I will be able to get the quad monitor portrait setup working in Linux like I wanted to, using two of these cards rather than an expensive Quadro solution. Fingers crossed that I can also do it in FreeBSD or OpenSolaris. I really want the self-healing properties of ZFS, because this will be a developer workstation and I don't want any errors I didn't introduce myself.
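    For what it's worth, a dual-head setup like the one described above can usually be sketched with xrandr. The output names below (DVI-I-0, HDMI-0) are assumptions and vary by driver, so check `xrandr -q` first; this is a sketch, not a tested recipe for this card.

```shell
# Sketch only: DVI-I-0 and HDMI-0 are assumed output names;
# run `xrandr -q` to see what your driver actually calls them.
xrandr --output DVI-I-0 --mode 1920x1200 \
       --output HDMI-0  --mode 1920x1200 --right-of DVI-I-0
```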

    I'm using a P183 case, and I've found that the idle temperature of the heatsinks is 61 degrees C without the front fan (the one in front of the top 3.5" enclosure). Installing a Scythe Kama PWM fan there got this down to 47 degrees C. (Note that in both cases I had both exhaust fans installed, though they're only doing about 500 RPM tops.)

    Using nvidia-settings to monitor the actual temperature of the GPU itself, I am getting 74 degrees C on the card that is running two displays with Compiz on, while the other is running at 54 degrees C.

    Note that the whole system is a Xeon X3450 (equivalent to an i5-750 with HT) with 8GB RAM and a Seasonic X-650, and it is idling at 62-67 watts. Phenomenal.
  • Exelius - Tuesday, February 16, 2010 - link

    I'd be interested in seeing how this card performs as an entry-level CAD card. I understand it's not going to set any records, but for a low-end CAD station paired with 8GB RAM and a Core i7, does this card perform acceptably with AutoCAD 2010 (or perform at all)?

    I'm not a CAD guy, btw, so don't flame me too hard if this is totally unacceptable (and I know you can't benchmark AutoCAD, so I'm not expecting numbers). This card just shows up in a lot of OEM configurations, so I'm curious whether I'd need to replace it with something beefier for a CAD station.
  • LtGoonRush - Tuesday, February 16, 2010 - link

    The reality is that the cards at this price point don't really provide any advantages over onboard video to justify their cost. There's so little processing power that they still can't game at all and can't provide a decent HTPC experience; all they're capable of is the same basic video decode acceleration as any non-Atom video chipset. This sort of makes sense when you're talking about an Ion 2 drop-in accelerator for an Atom system to compete with Broadcom, but I just don't see the value proposition over AMD HD 4200 or Intel GMA X4500 (much less Intel HD Graphics in Clarkdale). I'd like to see how the upcoming AMD 800-series chipsets with onboard graphics stack up.
  • mindless1 - Thursday, February 18, 2010 - link

    Can you just aim for the lowest cost possible with integrated video? Of course, but as always some features come at additional cost, and in this case it's motherboard features you stand to gain.

    The advantage is being able to use whatever motherboard you want, instead of being limited to the feature sets of boards with integrated video, which are typically economized low-end offerings.
  • JarredWalton - Tuesday, February 16, 2010 - link

    If you read some of my laptop reviews (http://www.anandtech.com/mobile/showdoc.aspx?i=373...), you'll see that even a slow GPU like the G210M is around 60% faster than the GeForce 9400M, which is a huge step up from GMA 4500MHD (the same as X4500, except with lower clocks). HD 3200/4200 are also slower than the 9400M, so really we're looking at $30 to get full HD video support (outside of 1080i deinterlacing), plus Flash HD acceleration that works without problems in my experience. Yes, you'll be stuck at 1366x768 and low detail settings in many games, but at least you can run the games, something you can't guarantee on most IGP solutions.
  • LtGoonRush - Wednesday, February 17, 2010 - link

    We're talking about desktop cards though, and as the tests above show, this card simply isn't fast enough for HTPC or gaming usage. The point I'm making is that if the IGP doesn't meet your needs, then it's extremely doubtful that a low-end card will. There's a performance cliff below the R5670/GT 240 GDDR5, and it makes more sense for the performance of IGP solutions to grow up into that space than it does to buy an incredibly cut-down discrete GPU. Not that I'm expecting competitive gaming performance from an IGP, of course, but R5450-level results (in terms of HTPC and overall functionality) are achievable.
  • Ryan Smith - Wednesday, February 17, 2010 - link

    Even though the G210 isn't perfect for HTPC use, it's still better than IGPs. I don't think any IGP can score 100 on the HD HQV v1 test; the G210 (and all other discrete GPUs) can.
  • shiggz - Wednesday, February 17, 2010 - link

    Recently I bought a 220 for $55 to replace an old 3470 in my HTPC. MPC-HC uses DXVA for decoding but shaders for picture cleanup; "sharpen complex 2" greatly improves video appearance on a 1080p x264 video but was too much for the 3470. My new 220 handles it with about 30-70% GPU usage depending on the scene, according to EVGA Precision. The 220 was the perfect sweet spot of performance/power/cost. My HTPC and 42" TV used to pour out heat in summer, making TV viewing unpleasant.

    I had a 65nm dual-core and an 8800 GT, thus a 200W loaded HTPC, plus 2x 150W light bulbs and 200W for the LCD TV. That was a constant 700 watts. Switching the two bulbs to new lower-power 30W (150W-equivalent) ones and cutting the HTPC to a 45nm CPU and 40nm GPU made the HTPC 100W, leaving me 160W plus 200W from the TV, for 360W: a big improvement over 700 watts. Next year I'll get an LCD backlight and I should be able to get all of that down to about 150-200W for a total setup that three years ago was 700W.
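    As a quick sanity check, the power arithmetic in that comment adds up (wattages taken as quoted above, all approximate):

```python
# Power budget check, using the wattages quoted in the comment above.
before = {"htpc": 200, "bulbs": 2 * 150, "tv": 200}  # old setup
after = {"htpc": 100, "bulbs": 2 * 30, "tv": 200}    # new setup

print(sum(before.values()))  # 700
print(sum(after.values()))   # 360
```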
  • shiggz - Thursday, February 18, 2010 - link

    Meant to say "LED-backlit LCD," as their power consumption is impressive.
  • milli - Tuesday, February 16, 2010 - link

    Don't you think it's about time you stopped using FurMark for power consumption testing? Both NVIDIA and ATI products throttle when they detect something like FurMark. The problem is that you don't know how much each product throttles, so the results become inconsistent.
    Measuring the average power consumption during actual gaming seems more realistic to me (not just the peak). 3DMark06's Canyon Flight is a good candidate: it generates higher power usage than most games but still doesn't cause an unrealistically high load on the shader domain like FurMark does.
  • 7Enigma - Wednesday, February 17, 2010 - link

    I completely agree with the post above. The load power consumption chart is meaningless in this context. Feel free to keep it in if you want, but please add a more "real-world" test, be it a demanding game or something like Futuremark which, while not quite "real-world", is definitely going to get the cards to run at their peak (after all, in that instance they are trying for the highest score, not trying to limit power consumption).

    When I read something like, "On paper the G210 should be slightly more power hungry and slightly warmer than the Radeon HD 5450 we tested last week. In practice it does better than that here, although we’ll note that some of this likely comes down to throttling NVIDIA’s drivers do when they see FurMark and OCCT", I immediately ask myself: why did you choose to report this data if you know it's possibly not accurate?

    What I really think a site as respected and large as AnandTech needs is proprietary benchmarks that NEVER get released to the general public or to the companies that make these parts. In a similar fashion to your new way of testing server equipment with a custom benchmark, I'd love to see you guys come up with a rigorous video test that ATI/NVIDIA can't "tailor" their products toward by recognizing a program and adjusting settings to gain every possible edge.

    Please consider it Anand and company, I'm sure we would all be greatly appreciative!
  • iamezza - Friday, February 19, 2010 - link

    +1
  • GiantPandaMan - Tuesday, February 16, 2010 - link

    So is it pretty much set that GlobalFoundries will be the next major builder of AMD and NVIDIA GPUs? Even when AMD owned GF outright they never used it for GPUs, right? What's changed, other than TSMC under-delivering on 40nm for half a year?
  • hwhacker - Tuesday, February 16, 2010 - link

    Hmm, maybe he knows something we don't?

    Last I heard circulating, AMD was going to get (sample?) product from both TSMC and GF at 32nm, but that all got borked when TSMC cancelled 32nm. As such, they will now transition to each company's respective 28nm process instead. This is said to have messed up Northern Islands' release, but may result in a better (not just smaller/faster) product. Who knows if that's true. All things being equal, I'm sure AMD would like to use 28nm bulk at GF.

    As for NVIDIA, it's been interesting to watch. First they said absolutely not to GF; then 40nm at TSMC happened. After that, Jensen was said to be in talks with GF and publicly said some good things about GF over TSMC (likely because they're angry about 40nm re: Fermi and used it for intimidation), and that's all we know. All things being equal, I'm sure NVIDIA would like to use 28nm bulk at TSMC.
  • Natfly - Tuesday, February 16, 2010 - link

    You're right, both companies canned their 32nm bulk processes. So either the author is insinuating that NVIDIA is going to switch to 32nm SOI, or he means 28nm.
  • Ryan Smith - Tuesday, February 16, 2010 - link

    He means 28nm.
  • Natfly - Tuesday, February 16, 2010 - link

    My apologies, I would have referred to you by name if I wasn't too lazy to go back to the article from the comments page to check :P
