MSI

MSI has an excellent concept: theirs is the only cooling solution that attaches the RAM sinks to the main body of the heatsink. This should allow the fan to cool the block of metal that stretches across the RAM as well. Our tests also showed this to be one of the quietest cards in the roundup.

When we began testing, we ran into a problem. Even though MSI went with a mostly round design, there was apparently enough leverage between the two spring pins to rip the thermal tape loose from the RAM. Even at stock clocks, the card got far too hot, and we had to use makeshift clamps on the RAM to hold the heatsink in place against the GPU.



It is unfortunate to see a card with such potential undermined by HSF mounting issues. We would welcome the opportunity to retest this card with a properly mounted cooling solution and update our cooling numbers. This could have been an unnoticed manufacturing defect, but the design lends itself to the heatsink pulling up off the RAM if the end user presses down too hard on the opposite side. In fact, the spring pins seemed to hold the heatsink off the GPU rather than down onto it. That tension would have usefully added pressure to the GPU had the thermal tape held on the RAM, but since we received a part in non-working order, we can't be sure how it was meant to fit.



84 Comments

  • Bonesdad - Wednesday, February 16, 2005 - link

    Yes, I too would like to see an update here...have any of the makers attacked the HSF mounting problems?
  • 1q3er5 - Tuesday, February 15, 2005 - link

    can we please get an update on this article with more cards, and replacements of defective cards?

    I'm interested in the gigabyte card
  • Yush - Tuesday, February 8, 2005 - link

Those temperature results are pretty dodgy. Surely no regular computer user would have a caseless computer. Those results are only favourable and only shed light on how cool the card CAN be, not how hot it actually runs in a regular scenario. The results would've been much more useful had the temperatures been measured inside a case.
  • Andrewliu6294 - Saturday, January 29, 2005 - link

    i like the albatron best. Exactly how loud is it? like how many decibels?
  • JClimbs - Thursday, January 27, 2005 - link

    Anyone have any information on the Galaxy part? I don't find it in a pricewatch or pricegrabber search at all.
  • Abecedaria - Saturday, January 22, 2005 - link

    Hey there. I noticed that Gigabyte seems to have modified their HSI cooling solution. Has anyone had any experience with this? It looks much better.

    Comments?
    http://www.giga-byte.com/VGA/Products/Products_GV-...

    abc
  • levicki - Sunday, January 9, 2005 - link

Derek, do you read your email at all? I got the Prolink 6600 GT card and I would like to hear a suggestion on improving the cooling solution. I can confirm that the retail card reaches 95 C at full load and idles at 48 C. That is a really bad image for nVidia. They should be informed about vendors doing a poor job on cooling design. I mean, you would expect it to be way better because those cards aren't cheap.
  • geogecko - Wednesday, January 5, 2005 - link

Derek, could you speculate on what thermal compound is used to interface between the HSF and the GPU on the XFX card? I e-mailed them, and they won't tell me what it is?! It would be great if it were paste or tape. I need to be able to remove it and then, later, re-install it. I might be able to overlook not having the component video pod on the XFX card, as long as I get an HDTV that supports DVI.
  • Beatnik - Friday, December 31, 2004 - link

    I thought I would add, about the DUAL-DVI issue: in the new NVIDIA drivers, they show that the second DVI can be used for HDTV output. It appears that even the overscan adjustments are there.

    So not having the component "pod" on the XFX card appears to be less of a concern than I thought it might be. It would be nice to hear if someone tried running 1600x1200 + 1600x1200 on the XFX, just to know if the DVI is up to snuff for dual LCD use.
