Silence is Golden: Silent GPU Roundup

by Josh Venning on 8/31/2006 5:50 AM EST
49 Comments

  • TheInternal - Tuesday, September 12, 2006 - link

    It's wonderful to see Anandtech take the time to review silent products. I've really been trying to quiet down my PC, and seeing this review gave me some further encouragement. With rumors of ASUS acquiring XFX, it will be interesting to see if Anandtech decides to review the passively cooled XFX 7950 GT with heat pipes that look awfully reminiscent of the ones from the ASUS 7800 you reviewed.
    I'm also curious to see if any 7900 GS cards become available with passive cooling soon.
    Reply
  • Richey02hg - Tuesday, September 05, 2006 - link

    I was just curious: are any of these cards AGP, or are they all PCI Express only? Also, it's hard to tell since an X800 XT All-In-Wonder isn't in there, but would any of these be an upgrade over that? I have to admit, just seeing that word "silent" makes me happy, because my GPU is insanely loud. Reply
  • JarredWalton - Wednesday, September 06, 2006 - link

    All are PCIe. I'm not sure if there are any silent AGP cards out there other than very low end components. As for the X800 XT, that is roughly equivalent to the 7800 GS in performance, albeit without SM 3.0 support. The 7600 GT would also be pretty similar in performance, I think. I would recommend holding onto your current system as long as you can, and when you can no longer stand the performance it offers, do a wholesale upgrade to a PCIe GPU and motherboard, and probably a new CPU and RAM as well. At that point, you might as well just go ahead and buy a completely new system -- you could even try selling off your current system to recoup some of the cost. Reply
  • Richey02hg - Thursday, September 07, 2006 - link

    Thanks for the advice. I'm actually planning to get a laptop in 2006, and thanks to your review I'm definitely waiting for that second wave (forget the name) of the Core 2 Duos for laptops :) Reply
  • Eddie Lin - Thursday, August 31, 2006 - link

    Gigabyte doesn't seem to need to reserve an SLI bridge, since the 7300 GS only does software SLI and doesn't need a bridge. The heatsink design on this card is really good. Reply
  • DerekWilson - Friday, September 01, 2006 - link

    Thanks Eddie --

    We have added this information to the article.
    Reply
  • yacoub - Thursday, August 31, 2006 - link

    It's an absolute joke that Asus and Gigabyte don't have silently-cooled 7900GTs out yet. The card requires less power and runs cooler than the 7800GT did. It's a shoo-in to get a silent version. wtf.

    This is practically a roundup of grandfathers and retirees when you include a 7800GT. ;P
    Reply
  • nullpointerus - Friday, September 01, 2006 - link

    Maybe they are trying to get rid of old cards without dropping the price too much? Reply
  • yyrkoon - Thursday, August 31, 2006 - link

    A lot of people who would consider buying a fanless GPU wouldn't even care if it DID make tons of noise; some of us live in deserts, where it's extremely dusty. There is nothing like owning an air compressor or two, just for 'dusting' your house, shops, and PC/electronics innards.

    I guess I'm one of the few people who actually enjoy having a fan or two on while I'm sleeping for background noise, but fewer moving parts means longer part life here in the Nevada desert. However, I own an eVGA 7600GT KO that has a fan on it, and you know what, I have a really hard time hearing it from 6 feet away. In fact, the 120mm low-RPM fans that came with my Lian Li case make more noise, and they don't make much noise themselves.

    I think it's a great idea that these manufacturers are making products like this, but at the same time, for me personally it's not really an option. I only buy parts from a company with a good reputation and excellent customer support, so I'm very picky about who I buy from. At the same time, I know what I want, and if something passive isn't available on, say, a 7600GT (which is what I wanted for this current system) from a company I would normally buy parts from, then I won't bother. I would think it a better option to buy the video card you wanted and then add an aftermarket passive cooler if it comes down to that (which would probably void your warranty, so again, for some of us, not really an option).

    So basically, what it boils down to is that I have to buy a graphics card with a fan to get what I want, and if problems later ensue, it's a good thing I have a can of miracle oil around, and a few syringes . . .
    Reply
  • Josh Venning - Thursday, August 31, 2006 - link

    These are some good points; it's true that with fewer moving parts you would theoretically see longer life and resistance to dust and dirt, which could be a plus. And while it's true that a normal graphics card (with a fan on it) will be pretty hard to hear from a little ways away inside your computer case, the idea is that some people need that extra bit of silence for whatever reason, and every extra fan adds to the noise level of the system. For myself, when recording sound/music with a computer, getting things as quiet as possible is very important, so this is one case where eliminating even a couple of dB is worth buying a silent GPU for (especially if, like myself, your recording computer is one you also want to be able to play games on). Reply
  • Josh Venning - Thursday, August 31, 2006 - link

    I also forgot to mention that some people use their PCs in home theater systems as well. This would be another case where you want as little noise from your computer as possible. Reply
  • imaheadcase - Thursday, August 31, 2006 - link

    That was not always the case. The fan on my 9700 Pro went out a year ago and I still use the card; it works like a charm without it. It was the high-end card of its time. Let's hope those days come by again :D Reply
  • eckre - Thursday, August 31, 2006 - link

    What a great review. When Tom's did their silent VC review, they included a grand total of three cards... pfft. Nice job, Anand.

    I have the 7600GT, very sweet and 0dB is oh so nice.
    Reply
  • Josh Venning - Thursday, August 31, 2006 - link

    We just wanted to say thanks all for your comments and we are still trying to make sure we've caught any errors. (there are actually only 20 cards in the roundup and not 21) As Derek said, these cards were included in the article because we requested any and all silent cards that any of the manufacturers were willing to give us to review. That's also why we have more cards from ASUS and Gigabyte than the others. Reply
  • Olaf van der Spek - Thursday, August 31, 2006 - link

    quote:

    If a general purpose CPU can offer a 40% improvement over its predecessor (Pentium D) while consuming 40% less power on average, why can't a GPU revolution accomplish the same thing?


    Because the video card industry hasn't introduced as bad a design as the NetBurst architecture.
    Reply
  • epsilonparadox - Thursday, August 31, 2006 - link

    No, they've introduced worse. When they recommend a second PSU just for graphics, or even a 1 kW single PSU, they've taken Intel's lack of thermal control to a whole new level. Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    graphics cards use much much less power in 2d mode than in 3d mode -- and even their 3d power saving capabilities are really good.

    this is especially true when you consider the amount of processing power a GPU delivers compared to a CPU.

    Theoretical peak performance of a current desktop CPU is in the 10-15 GFLOPS range at best. For a GPU, theoretical peak performance is at least one order of magnitude larger reaching up over 200 GFLOPS in high end cases.

    I'm not saying we can reach these theoretical peak rates on either a CPU or a GPU, but a GPU is doing much much more work under load than a CPU possibly could.

    Keep in mind we aren't even up to GHz on GPU cores. On the CPU front, Intel just shortened the pipeline and decreased clock speeds to save power -- doing more work in one cycle. This is absolutely what a GPU does.

    And the icing on the cake is the sheer number of options on the silent GPU front. Neither AMD nor Intel makes a fast desktop CPU that can be (easily) passively cooled. These parts are a testament to the efficiency of the GPU.

    On the flip side, ATI and NVIDIA push their high end parts way up in clock speed and power consumption trying as hard as possible to gain the performance crown.

    There are plenty of reasons GPUs draw more power than a CPU under load, but a lack of thermal control or inefficient design is not one of them. It's about die size, transistor count, and the total amount of work being done.
    Reply
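As a quick sanity check of the gap Derek describes, here is the arithmetic on the theoretical peak figures quoted in his comment (a sketch using the comment's own 2006-era numbers, not measured benchmarks):

```python
# Back-of-the-envelope version of the throughput gap described above.
# Figures are the theoretical peaks quoted in the comment, not benchmarks.
cpu_gflops = 12.5   # midpoint of the "10-15 GFLOPS" desktop CPU range
gpu_gflops = 200.0  # "up over 200 GFLOPS" for a high-end GPU

ratio = gpu_gflops / cpu_gflops
print(f"GPU peak is roughly {ratio:.0f}x the CPU peak")  # about one order of magnitude
```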
  • JarredWalton - Saturday, September 02, 2006 - link

    I disagree with Derek, at least in some regards. The budget and midrange GPUs generally do a good job of throttling down power requirements in 2D mode. The high-end parts fail miserably in my experience. Sure, they consume a lot less power than they do in 3D mode, but all you have to do is look at the difference between using a Radeon Mobility X1400 and a GeForce Go 7800 in the Dell laptops to see the difference in battery life (http://www.anandtech.com/mobile/showdoc.aspx?i=276...).

    In 2D mode, graphics chips still consume a ton of power relatively speaking -- probably a lot of that going to the memory as well. A lot of this can be blamed on transistor counts and die size, but I certainly think that NVIDIA and ATI could reduce power more. The problem right now is that power use is a secondary consideration, and ATI and NVIDIA both need to have a paradigm shift similar to what Intel had with the Pentium M. If they could put a lot of resources into designing a fast but much less power-hungry GPU, I'm sure they could cut power draw quite a bit in both idle and load situations.

    That's really the crux of the problem though: resources. Neither company has anywhere near the resources that AMD has, let alone the resources that Intel has. Process technology is at least a year behind Intel if not more, chip layouts are mostly computer generated as opposed to being tweaked manually (I think), and none of the companies have really started at square one trying to create a power efficient design; that always seems to be tacked on after-the-fact.

    GPUs definitely do a lot of work, although GFLOPS is a terrible measure of performance. The highly parallel nature of 3D rendering does allow you to scale performance very easily, but power requirements also scale almost linearly with performance when using the same architecture. It would be nice to see some balance between performance scaling and power requirements... I am gravely concerned about what Windows Vista is going to do to battery life on laptops, at least if you enable the Aero Glass interface. Faster switching to low-power states (for both memory and GPU) ought to be high on the list for next-generation GPUs.
    Reply
  • DaveLessnau - Thursday, August 31, 2006 - link

    I'm wondering why Anandtech tested Asus' EN7800 GT card instead of their EN7600 GT. That card would be more in line with Gigabyte's 7600 GT version and, I believe, is more available than the 7800 version. In the near future, I'd like to buy one of these silent 7600GTs and was hoping this review would help. Oh, well. Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    you can get a really good idea of how it would perform by looking at Gigabyte's card.

    as I mentioned elsewhere in the comments, we requested all the silent cards manufacturers could provide. if we don't have it, it is likely because they were unable to get us the card in time for inclusion in this review.
    Reply
  • Leo V - Thursday, August 31, 2006 - link

    ...I can buy a high-end 7800GT substantially cheaper, buy a quiet Zalman 80mm low-rpm GPU cooler and run it undervolted at 7V. (In fact, I have done exactly that.) It will be cheaper, run WAY cooler, and be quieter, because I can get rid of a case fan that I would need with a "silent" card anyway.

    The idea of running a 50-100watt GPU with a silent cooler is dubious -- you still need a fan somewhere in your system, and the best place is closest to the hottest parts. Those parts are naturally the CPU and GPU.

    Instead of "silent" (but not really) high-end cards, give us cards with heatpipes + large, slow quiet fans that can be undervolted.

    Most importantly, ATI and NVIDIA please stop making 100watt monsters and follow Intel's and AMD's lead in improving power efficiency.
    Reply
  • yyrkoon - Thursday, August 31, 2006 - link

    Sorry, I can't say I would agree that a fan would be quieter than a passive solution; I don't care if you could run it at 1V, and did :) Reply
  • Leo V - Thursday, August 31, 2006 - link

    quote:

    ...I can buy a high-end 7800GT substantially cheaper


    e.g. substantially cheaper than the holy grail "silent" version of the 7800GT.

    And Kudos to the companies for the inventive products and to Anandtech for covering them.
    Reply
  • hkBst - Thursday, August 31, 2006 - link

    I've been waiting for a review of the passively cooled 7900GT from MSI for a while and I was expecting it to be in here. How can it not be?

    Look here: http://www.msi.com.tw/program/products/vga/vga/pro...
    Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    We sent multiple requests for cards out to 16 different graphics card manufacturers. I'd say we did pretty well with more than half of those responding.

    We also requested that each manufacturer send us all their passively cooled cards. If something was left out it was either because the manufacturer decided not to send it, or we weren't able to get ahold of it before our submission deadline. We tested a lot of cards and have been working on this for quite some time, so silent cards that have come out recently or were not widely available until recently will not have been included.
    Reply
  • JarredWalton - Thursday, August 31, 2006 - link

    Also, the MSI 7900GT Silent card is only available in Europe, and we did mention this in the review. Reply
  • haris - Thursday, August 31, 2006 - link

    Any chance you could retest the cards using a mid-range system? It seems kind of silly to test an FX-55 with a $50-100 video card. Reply
  • nullpointerus - Thursday, August 31, 2006 - link

    Yet Another Silly Performance Retest Request (YAMPRR)

    Testing an FX-55 with a $50-100 video card is not silly; testing graphics cards' performance relative to each other requires removing all other factors including the CPU and RAM. Not everyone has a "mid-range" system, and those who do not have a "mid-range" system do not want the results skewed just to make your life easier. If you want specific performance advice for your particular system and games, why do you not join and post in the forums?
    Reply
  • ss284 - Thursday, August 31, 2006 - link

    Well, considering that the majority of people who are looking for midrange graphics cards have a midrange system, his request is a perfectly good one. Unless Anandtech enjoys targeting the minority of its readers, it should be doing more applicable performance testing. Then again, the FX-55 isn't exactly a cutting-edge processor anymore. Just scale everything back 10% and you will have a rough estimate of what performance would be like on a mid-range system. Reply
  • nullpointerus - Thursday, August 31, 2006 - link

    Yet Another Defense of a YAMPRR (YADY). *yawn*

    Well considering the majority of people who are looking for midrange graphics cards have a midrange system, his request is a perfectly good one.

    No, it's a silly one. The point of the article is to compare graphics cards, not to make life easier for a certain group of people. People who follow this esoteric stuff religiously tend to distill the information into a more practical form. And as I said, the information he wanted is readily available in the forums. A couple of mouse clicks and a bit of typing is better than ignorantly saying the video card article is silly for not providing framerates similar to some mythical ideal of a mid-range system.

    Unless Anandtech enjoys targeting the minority of its readers it should be doing more applicable performance testing.

    How about you go where the information is normally provided instead of trying to turn all the front page articles into your personal system upgrade newsfeeds?

    Could we just skip ahead to where everyone chimes in with their own ideas of what a mid-range system is? Does it use AMD or Intel? Single or multi-core? How much RAM? Which timings? Which system boards? Which components are overclocked?

    I'll make a deal with you: get together a mid-range system that everyone will agree on, and then I will agree with you that we should conflate graphics cards testing with mid-range system testing. You see, ridding the comments section of silly YAMPRR and YADY posts will not benefit anyone if we still have to deal with all the senseless bickering about little details such as chipset revisions, features, and all the other inane griping I have seen posted when Anandtech picks out a CPU, overclocking, or RAM configuration as representative of X-range systems.
    Reply
  • yyrkoon - Thursday, August 31, 2006 - link

    If it's silly, why even bother replying . . . No need to go out of your way to be a jerk. Reply
  • nullpointerus - Friday, September 01, 2006 - link

    Jerks don't take the time to apologize. As for why I apologized, I felt badly for responding in kind. I was belittling people who felt the need to belittle the site without taking the trouble to think their arguments through. Apparently that put some kind of chip on your shoulder such that you felt the need to attack me after I'd already apologized. Reply
  • DerekWilson - Friday, September 01, 2006 - link

    maybe we can take a different angle as the standard reasoning has been rolled out already ...

    if we decide to test with a system that "matches" the graphics card, we are making a decision about what is reasonable for either a specific level of performance or price point. By making such a decision, we limit ourselves -- for instance, in this review we may have chosen a system to match a 7600 GS. But maybe it's too underpowered for a 7600 GT, or perhaps it's too overpriced for a 7300 GS.

    we absolutely can't test every card with every processor and every memory configuration on every chipset for every review.

    in lieu of choosing one system that is supposed to be "one size fits all", we can remove the system from consideration by choosing the highest end configuration possible.

    when a graphics card performs better in our system, we know it is capable of better performance in any system. this is true in almost every case.

    this does put a burden on the reader to understand the limitations of his or her own system -- i.e., will the fact that the 7600 GT performs higher than 7600 GS expose a CPU limitation on the system the reader is building/upgrading.

    this question can be answered in a couple ways.

    with game tests, if you can borrow a high end graphics card and see where the cpu limitation falls at something like 800x600 without aa and af, you'll know where the upper limit on framerate is based on the CPU. thus a decision can be made about the best fit for a card.

    if you can't borrow a higher end card, you can turn all the graphics settings down as far as possible and run at 640x480 or lower if possible (does anything aside from the chronicles of riddick still support 320x240?). this isn't ideal, but even on a low end card you can get a pretty good idea of whether or not there will be a cpu limitation entering into the mix.

    when you know what the cpu limit of your system is, pick the resolution you want to run, and find a card that gives you a number just over this limit. this card is the ideal fit for your system at your resolution. it will deliver the performance your cpu will ask for.

    I know it's complicated, but it's much better than the can of worms we'd open if we went in another direction.

    In GPU reviews meant to demonstrate the capabilities of a graphics card, we will not add unnecessary bottlenecks to the system.
    Reply
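The selection rule Derek walks through can be sketched in a few lines: find your CPU-limited framerate at a low resolution, then pick the cheapest card whose framerate at your target resolution just clears that ceiling. The card names, framerates, and prices below are hypothetical placeholders, not review data:

```python
# Sketch of the card-selection rule described above: once you know your CPU's
# framerate ceiling, the best-fit card is the cheapest one that exceeds it at
# your chosen resolution. All numbers here are made up for illustration.

def pick_card(cpu_limit_fps, cards):
    """Return the cheapest card that meets or exceeds the CPU framerate ceiling."""
    capable = [c for c in cards if c["fps_at_target_res"] >= cpu_limit_fps]
    return min(capable, key=lambda c: c["price"]) if capable else None

cards = [
    {"name": "Card A", "fps_at_target_res": 45, "price": 100},
    {"name": "Card B", "fps_at_target_res": 75, "price": 160},
    {"name": "Card C", "fps_at_target_res": 110, "price": 300},
]

# Suppose the CPU tops out at ~70 fps at 800x600 with everything turned down:
best = pick_card(70, cards)
print(best["name"])  # the first card past the CPU ceiling, not the fastest one
```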
  • nullpointerus - Friday, September 01, 2006 - link

    You need a form letter, or something. Maybe you could put up a short page entitled Why We Test this Way and link to it on the front page of each article. Reply
  • nullpointerus - Thursday, August 31, 2006 - link

    Hmm...that last paragraph came out a little too harsh. I apologize in advance if I've offended anyone. I still think the points are valid, though. Reply
  • JarredWalton - Thursday, August 31, 2006 - link

    If you look at the performance difference between an E6400 stock and 3.0 GHz OC in our PC Club system review (http://www.anandtech.com/systems/showdoc.aspx?i=28...), you will see that it makes virtually no difference in performance even with a 7900 GT. All of these GPUs are the bottleneck in gaming, but we use a higher-end (relatively speaking) CPU just to make sure. Reply
  • imaheadcase - Thursday, August 31, 2006 - link

    I disagree; 800x600 is great for sniping. I play on a 9700 Pro and normally switch between 800x600 and 1024x768, and I like 800x600 better on large maps. It makes objects "bigger" to me and lets me get better accuracy.

    Even if I had a 7900GT I would probably not go higher than 1024x768. I don't know why people play at higher resolutions; it makes everything so tiny. Squinting to play a game is annoying and distracting from gameplay :D
    Reply
  • Josh7289 - Thursday, August 31, 2006 - link

    People who have larger monitors have to use higher resolutions to keep things from getting too large, and to make good use of all that real estate, especially when it's an LCD (native resolution).

    For example, a 17" CRT is best run at 1024 x 768 for games, while a 21" or so LCD is best run at 1600 x 1200 or 1680 x 1050, depending on its native resolution.
    Reply
  • Olaf van der Spek - Thursday, August 31, 2006 - link

    What do you mean by 'too large'?
    In games it's not like in Windows, where objects get smaller as you increase the resolution.
    Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    this is correct (except with user interfaces for some reason -- and there the exception is warcraft 3). thanks Olaf.

    lower resolution will give you much less accuracy -- larger pixels in the same screen area decrease detail.

    the extreme example is if you have a 4x3 grid and you need to snipe someone -- his head has to be in the center of one of the 12 blocks you have to aim through to even be able to hit him. The smaller these blocks are, the more pixels fit into the head, the more capable you will be of sniping.
    Reply
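Derek's 4x3 grid example scales up directly: a target occupying a fixed fraction of the screen covers more pixels at a higher resolution, so there is finer detail to aim at. The target size below is an arbitrary made-up fraction, just to show the scaling:

```python
# Illustration of the grid argument above: the same on-screen target covers
# more pixels as resolution rises. The target fractions are arbitrary.

def pixels_on_target(res_w, res_h, frac_w=0.01, frac_h=0.015):
    """Pixels covered by a target occupying a fixed fraction of the screen."""
    return round(res_w * frac_w) * round(res_h * frac_h)

for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
    print(f"{w}x{h}: {pixels_on_target(w, h)} pixels on target")
```

Quadrupling the pixel count roughly quadruples the pixels on the target, which is the extra aiming precision Derek is describing.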
  • imaheadcase - Thursday, August 31, 2006 - link

    I guess to each his own; I play BF2 on a 19-inch CRT monitor at 1024x768. But even if I had a better card I would still prefer a lower res. Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    it's an issue of how games work on the inside ...

    all the objects, shapes, characters, and landscapes are there no matter how you see them. everything is mathematically represented in the software. rendered onto your display is a viewport into the world. this viewport only allows you to see a fixed grid of colors. the color of each pixel is determined by a bunch of factors, but the largest contribution is made by the object that projects onto a particular pixel.

    ... on second thought, this is too hard for me to explain without a lot of math. let's look at it another way.

    when there's a naked person on TV, they decrease the resolution of the area over the person's naughty bits. this makes it harder to see what's really there because there is a smaller number of large pixels that can only represent one color each. it follows, then, that it would also be harder to shoot the person accurately in said bits.

    I think your preference may be based on your experience with performance at higher resolutions. Responsiveness is necessary for a quality experience in games like bf2. If you get a faster card, I would encourage you to at least try a higher resolution.
    Reply
  • blckgrffn - Thursday, August 31, 2006 - link

    When it is in stock at Newegg, it's ~$90, not nearly $140.

    Nat
    Reply
  • mostlyprudent - Thursday, August 31, 2006 - link

    I would be interested to know how much noise (quantitatively) an actively cooled 7600GS or 7600GT contributes to a system built in a relatively quiet case like an Antec P150. I am familiar with some of the leaf blowers attached to the higher-end cards, but wonder how much overall system noise savings you'd get with the mid-range cards. Reply
  • wilburpan - Thursday, August 31, 2006 - link

    One obvious use for silent video cards would be in an HTPC system, where quiet performance would be a priority. Can't have those noisy computer fans intrude on watching Snakes on a Plane, you know. :@) Anyway, it would have been nice to include some video playback benchmarks to see how these cards handle playing back a 1080p HDTV signal, or similar tests. Reply
  • ViRGE - Thursday, August 31, 2006 - link

    Since HDTV is MPEG-2, any modern video card should be able to handle a 1080p signal (since this is an either/or case, it either can or can't). The limitations come in with H.264, where the video decode engine may not be clocked high enough to do higher resolution decoding. Unfortunately, I'm not sure there's any 1080 commercial/usable content that would work with Cyberlink/InterVideo's H.264 decoders (the only ones with GPU acceleration), since QuickTime content doesn't work in those. Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    with nvidia, the video decode engine is clocked off the core -- it actually will run better on a card with fewer pipelines and a higher core speed ... iow, the 7600gt is a better video decode graphics card than a 7900gt at default clock speeds.

    a little counter intuitive, but there it is.

    nvidia 7 series parts with a core clock of >450 MHz should have no problem accelerating 1080p decode on players that support purevideo.
    Reply
  • MontagGG - Thursday, August 31, 2006 - link

    Which of these have HDCP? Reply
  • DerekWilson - Thursday, August 31, 2006 - link

    to my knowledge, none of the cards tested here support hdcp. but I will certainly try to confirm this ... Reply
