Gigabyte


Gigabyte has a lot of experience in the silent graphics market, and we have several ATI and NVIDIA cards from them for this review. An interesting statistic is that 90% of the graphics solutions Gigabyte ships are silent, which says a lot about the demand for silent cards. Gigabyte is well represented here, having provided us with more cards for testing than any other manufacturer. While their cards may not win any awards for good looks (the gold and turquoise coloring and unique heat sink designs aren't very easy on the eyes), they apparently handle silent operation very well, even at higher-than-stock clock speeds.

On the NVIDIA side, we have the Gigabyte 7600 GT, 7600 GS, 7300 GT, and 7300 GS. The Gigabyte 7600 GS ships with its core overclocked to 450MHz, 50MHz higher than the stock version of this card. The Gigabyte 7300 GT is also factory overclocked, with a 450MHz core clock as opposed to the standard 350MHz and an 800MHz memory clock versus the standard 667MHz. On the ATI side, we have the Gigabyte X1600 XT, X1600 Pro, X1300 Pro, and X1300. Aside from the 7600 GS and 7300 GT, all of the Gigabyte cards are clocked at reference speeds.




As we can see, the Gigabyte NVIDIA 7600 GT, ATI X1600 XT, and ATI X1600 Pro all share nearly identical heat sink designs. The heat sinks are unique, with a bulky sink up front toward the DVI connections and heatpipes connecting to additional heat sinks on the front and back of the card. The Gigabyte cards with this style of heat sink are much bulkier than the others, and it's important to note that the silver heat sink occupies a second slot, making these dual-slot solutions.


The Gigabyte X1300 Pro looks almost the same as the previous three Gigabyte cards, but it is missing the extra heat sink on the back and the second heatpipe that would extend to it. The silver heat sink on the front still takes up an extra slot, however, which is a point against this card and the other three.



The Gigabyte 7600 GS and 7300 GT are very similar in appearance. The heat sinks on these cards are much more compact than those on the first four cards we mentioned. These cards don't take up an extra slot and have only a gold heat sink on the front, with a strange-looking heatpipe running out of the main heat sink and back in a "U" shape. The heatpipe is attached to thin metal sheets, all in the name of increasing surface area and pulling heat away from the hotter portions of the card. These thin metal sheets seem a little delicate and might bend if the card is mishandled, though a slightly bent heat sink fin isn't really a major problem.



The Gigabyte 7300 GS and X1300 have heat sinks with a unique gold cover or hood running along the top of the cards, over a silver heat sink on each card's GPU. The Gigabyte 7300 GS doesn't have a bridge connector on the top of the card for SLI operation, but it will still run in SLI (with a second card) without the bridge connection.

Gigabyte seems to have a definite plan behind their custom heat sink designs, and after testing and overclocking these cards, there isn't much doubt that they perform well. The cards did, however, tend to get very hot to the touch after repeated testing, to the point where they became difficult to hold. We'll talk more about how well these cards overclocked in the "Overclocking" section.

Comments

  • yyrkoon - Thursday, August 31, 2006 - link

    If it's silly, why even bother replying... No need to go out of your way to be a jerk.
  • nullpointerus - Friday, September 1, 2006 - link

    Jerks don't take the time to apologize. As for why I apologized, I felt bad for responding in kind. I was belittling people who felt the need to belittle the site without taking the trouble to think their arguments through. Apparently that put some kind of chip on your shoulder such that you felt the need to attack me after I'd already apologized.
  • DerekWilson - Friday, September 1, 2006 - link

    maybe we can take a different angle as the standard reasoning has been rolled out already ...

    if we decide to test with a system that "matches" the graphics card, we are making a decision about what is reasonable for either a specific level of performance or price point. By making such a decision, we limit ourselves -- for instance, in this review we may have chosen a system to match a 7600 GS. But maybe it's too underpowered for a 7600 GT, or perhaps it's too overpriced for a 7300 GS.

    we absolutely can't test every card with every processor and every memory configuration on every chipset for every review.

    in lieu of choosing one system that is supposed to be a "one size fits all", we can remove the system from consideration by choosing the highest end configuration possible.

    when a graphics card performs better in our system, we know it is capable of better performance in any system. this is true in almost every case.

    this does put a burden on the reader to understand the limitations of his or her own system -- i.e., will the fact that the 7600 GT performs better than the 7600 GS expose a CPU limitation on the system the reader is building/upgrading?

    this question can be answered in a couple ways.

    with game tests, if you can borrow a high end graphics card and see where the cpu limitation falls at something like 800x600 without aa and af, you'll know where the upper limit on framerate is based on the CPU. thus a decision can be made about the best fit for a card.

    if you can't borrow a higher end card, you can turn all the graphics settings down as far as possible and run at 640x480 or lower if possible (does anything aside from the chronicles of riddick still support 320x240?). this isn't ideal, but even on a low end card you can get a pretty good idea of whether or not there will be a cpu limitation entering into the mix.

    when you know what the cpu limit of your system is, pick the resolution you want to run, and find a card that gives you a number just over this limit (see the sketch after this comment for a rough illustration). this card is the ideal fit for your system at your resolution. it will deliver the performance your cpu will ask for.

    I know it's complicated, but it's much better than the can of worms we'd open if we went in another direction.

    In GPU reviews meant to demonstrate the capabilities of a graphics card, we will not add unnecessary bottlenecks to the system.
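As a rough illustration of the selection rule described in the comment above -- pick the card whose framerate at your target resolution just clears the CPU-limited framerate -- here is a minimal Python sketch. The card names and framerate numbers are hypothetical placeholders for illustration only, not benchmark results from this review.

```python
# Minimal sketch of the "best fit" card-selection heuristic described above.
# All names and numbers below are made up for illustration.

def pick_card(cpu_limit_fps, card_fps_at_target_res):
    """Return the slowest card whose framerate at the target resolution still
    meets or exceeds the CPU-limited framerate; anything faster would simply
    be held back by the CPU."""
    candidates = [
        (fps, name) for name, fps in card_fps_at_target_res.items()
        if fps >= cpu_limit_fps
    ]
    if not candidates:
        # No card reaches the CPU limit at this resolution: the GPU is the
        # bottleneck, so take the fastest card available.
        return max(card_fps_at_target_res, key=card_fps_at_target_res.get)
    return min(candidates)[1]

# Example: CPU limit measured at ~75 fps (e.g. at 800x600 with no AA/AF),
# and framerates of candidate cards at the resolution you actually play at.
cards_at_1280x1024 = {"7300 GT": 52.0, "7600 GS": 78.0, "7600 GT": 110.0}
print(pick_card(75.0, cards_at_1280x1024))  # -> "7600 GS"
```

With these hypothetical numbers the 7600 GS is the "ideal fit": it just exceeds the CPU limit, while the 7600 GT's extra headroom would go unused on this particular system.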
  • nullpointerus - Friday, September 1, 2006 - link

    You need a form letter or something. Maybe you could put up a short page entitled "Why We Test This Way" and link to it on the front page of each article.
  • nullpointerus - Thursday, August 31, 2006 - link

    Hmm...that last paragraph came out a little too harsh. I apologize in advance if I've offended anyone. I still think the points are valid, though.
  • JarredWalton - Thursday, August 31, 2006 - link

    If you look at the performance difference between an E6400 at stock and overclocked to 3.0 GHz in our PC Club system review (http://www.anandtech.com/systems/showdoc.aspx?i=28...), you will see that it makes virtually no difference in performance even with a 7900 GT. All of these GPUs are the bottleneck in gaming, but we use a higher-end (relatively speaking) CPU just to make sure.
  • imaheadcase - Thursday, August 31, 2006 - link

    I disagree -- 800x600 is great for sniping. I play on a 9700 Pro and normally switch between 800x600 and 1024x768, and I like 800x600 better on large maps. It makes the objects "bigger" to me and lets me get better accuracy.

    Even if I had a 7900GT I would prob not go higher than 1024x768. Don't know why people play at higher rez; it makes everything so tiny. Squinting to play a game is annoying and distracting from gameplay :D
  • Josh7289 - Thursday, August 31, 2006 - link

    People who have larger monitors have to use higher resolutions to keep things from getting too large, and to make good use of all that real estate, especially when it's an LCD (native resolution).

    For example, a 17" CRT is best run at 1024 x 768 for games, while a 21" or so LCD is best run at 1600 x 1200 or 1680 x 1050, depending on its native resolution.
  • Olaf van der Spek - Thursday, August 31, 2006 - link

    What do you mean by 'too large'?
    In games it's not like in Windows, where objects get smaller if you increase the resolution.
  • DerekWilson - Thursday, August 31, 2006 - link

    this is correct (except with user interfaces for some reason -- and there the exception is warcraft 3). thanks Olaf.

    lower resolution will give you much less accuracy -- larger pixels in the same screen area decrease detail.

    the extreme example: if your screen were a 4x3 grid and you needed to snipe someone, his head would have to be in the center of one of the 12 blocks you aim through for you to even be able to hit him. The smaller these blocks are, the more pixels fit into the head, and the more capable you will be of sniping.
