Meet The Gigabyte GeForce GTX 660 Ti OC

Our final GTX 660 Ti of the day is Gigabyte’s entry, the Gigabyte GeForce GTX 660 Ti OC. Unlike the other cards in our review today, this is not a semi-custom card but rather a fully-custom card, which brings with it some interesting performance ramifications.

GeForce GTX 660 Ti Partner Card Specification Comparison
  GeForce GTX 660 Ti (Ref) | EVGA GTX 660 Ti Superclocked | Zotac GTX 660 Ti AMP! | Gigabyte GTX 660 Ti OC
Base Clock 915MHz 980MHz 1033MHz 1033MHz
Boost Clock 980MHz 1059MHz 1111MHz 1111MHz
Memory Clock 6008MHz 6008MHz 6608MHz 6008MHz
Frame Buffer 2GB 2GB 2GB 2GB
TDP 150W 150W 150W ~170W
Width Double Slot Double Slot Double Slot Double Slot
Length N/A 9.5" 7.5" 10.5"
Warranty N/A 3 Year 3 Year + Life 3 Year
Price Point $299 $309 $329 $319

The big difference between a semi-custom and fully-custom card is of course the PCB; fully-custom cards pair a custom cooler with a custom PCB instead of a reference PCB. Partners can go in a few different directions with custom PCBs, using them to reduce the BoM, reduce the size of the card, or even to increase the capabilities of a product. For their GTX 660 Ti OC, Gigabyte has gone in the latter direction, using a custom PCB to improve the card.

On the surface the specs of the Gigabyte GeForce GTX 660 Ti OC are relatively close to our other cards’, particularly the Zotac’s. Like Zotac, Gigabyte is pushing the base clock to 1033MHz and the boost clock to 1111MHz, sizable overclocks of 118MHz (13%) and 131MHz (13%) respectively. Unlike the Zotac, however, there is no memory overclock; Gigabyte ships the card at the standard 6GHz.
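Those percentages are easy to sanity check against NVIDIA's 915MHz base and 980MHz boost reference clocks:

```python
# Verify the quoted overclocks against NVIDIA's reference GTX 660 Ti clocks.
ref = {"base": 915, "boost": 980}    # reference clocks (MHz)
oc = {"base": 1033, "boost": 1111}   # Gigabyte GTX 660 Ti OC clocks (MHz)

for name in ref:
    delta = oc[name] - ref[name]
    print(f"{name}: +{delta}MHz ({delta / ref[name]:.0%})")
# base:  +118MHz (13%)
# boost: +131MHz (13%)
```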

What sets Gigabyte apart in the specs is that they’ve equipped their custom PCB with better VRM circuitry, which means NVIDIA is allowing them to raise their power target from the GTX 660 Ti standard of 134W to an estimated 141W. This may not sound like much (especially since we’re working with an estimate for the Gigabyte board), but as we’ve seen time and time again, GK104 is power-limited in most scenarios. A good GPU can often boost to higher bins than its power budget allows, so raising the power target indirectly raises performance. We’ll see how this works in detail in our benchmarks, but for now suffice it to say that even with the same GPU overclock as Zotac, the Gigabyte card usually clocks higher.
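To see why a modestly higher power target matters, consider a toy model of power-limited boosting: the GPU climbs through clock bins until the estimated board power would exceed the target. The bin size, per-bin power cost, and baseline draw below are hypothetical illustrations, not NVIDIA's actual tables.

```python
# Hypothetical sketch of power-limited GPU Boost behavior: the GPU steps
# up through fixed-size clock bins until the next bin would push the
# estimated board power past the power target.
def max_boost_clock(base_mhz, base_power_w, power_target_w,
                    bin_mhz=13, watts_per_bin=2.0):
    """Return the highest clock whose estimated power fits under the target."""
    clock, power = base_mhz, base_power_w
    while power + watts_per_bin <= power_target_w:
        clock += bin_mhz
        power += watts_per_bin
    return clock

# With otherwise identical GPUs, the 141W target sustains more bins than 134W.
stock = max_boost_clock(1033, 120.0, 134)
gigabyte = max_boost_clock(1033, 120.0, 141)
print(stock, gigabyte)  # the higher target yields a higher sustained clock
```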

Moving on, Gigabyte’s custom PCB measures 8.4” long, and in terms of design it doesn’t bear a great resemblance to either the reference GTX 680 PCB or the reference GTX 670 PCB; as near as we can tell it’s completely custom. The design is nothing fancy – though like the reference GTX 670 the VRMs are located at the front – and as we’ve said before, the real significance is the higher power target it allows. Otherwise the memory layout is the same as the reference GTX 660 Ti’s, with 6 chips on the front and 2 on the back. Given its length we’d normally insist on some kind of stiffener for an open-air card, but since Gigabyte has placed the GPU far enough back, the heatsink mounting alone provides enough rigidity.

Sitting on top of Gigabyte’s PCB is a dual fan version of Gigabyte’s new Windforce cooler. The Windforce 2X cooler on their GTX 660 Ti is a bit of an abnormal dual fan cooler, with a relatively sparse aluminum heatsink attached to unusually large 100mm fans. This makes the card quite large and more fan than heatsink in the process, which is not something we’ve seen before.

The heatsink itself is divided up into three segments over the length of the card, with a pair of copper heatpipes connecting them. The bulk of the heatsink is over the GPU, while a smaller portion is at the rear and an even smaller portion is at the front, which is also attached to the VRMs. The frame holding the 100mm fans is then attached at the top, anchored at either end of the heatsink. Altogether this cooling contraption is both longer and taller than the PCB itself, making the card nearly 10” long overall.

Finishing up the card we find the usual collection of ports and connections. This means 2 PCIe power sockets and 2 SLI connectors on the top, and 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 on the front. Meanwhile toolless case users will be happy to see that the heatsink is well clear of the bracket, so toolless clips are more or less guaranteed to work here.

Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s not included in the box or listed on the box, the Gigabyte GeForce GTX 660 Ti OC works with Gigabyte’s OC Guru II overclocking software, which is available on Gigabyte’s website. Gigabyte has had OC Guru for a number of years now, and with this being the first time we’ve seen OC Guru II we can say it’s greatly improved from the functional and aesthetic mess that defined the previous versions.

While it won’t be winning any gold medals, in our testing OC Guru II gets the job done. Gigabyte offers all of the usual tweaking controls (including the necessary power target control), along with card monitoring/graphing and an OSD. Its only real sin is that Gigabyte hasn’t implemented sliders on its controls, meaning you’ll need to press and hold buttons to dial in a setting. This is less than ideal, especially when you’re trying to crank up the 6000MHz memory clock by an appreciable amount.

Wrapping things up, the Gigabyte GeForce GTX 660 Ti OC comes with Gigabyte’s standard 3 year warranty. Gigabyte will be releasing it at an MSRP of $319, $20 over the price of a reference-clocked GTX 660 Ti and $10 less than the most expensive card in our roundup today.


313 Comments


  • TheJian - Friday, August 24, 2012 - link

    Pity I hadn't dug a bit further and found this also...I just checked 3 sites and used them...LOL.

    Even crysis 2 is a wash ~1fps difference 1920x1080 and above again we see below 30min fps even on 7950B. It takes the 7970 to do 30 and it won't be there all day. It will likely dip under 30. Ryan comments a few times your experience won't be great below 60 as those will dip :)

    The 7950 or B rises with volts and a lot of them have a hard time hitting over 1150 and run 80watts more. Not good if that's how you have to clock your card to keep up (or even win...it's bad either way). The one at guru3d.com was a regular 7950 that those #'s came from so it will have a hard time beating a 1300mhz much on NV's side. Memory can hit 7.71 as shown at hardocp with ONE sample. Must be pretty easy. Memory won't be an issue at 1920x1200 or even less at 1920x1080 and you can OC the mem any time you like :) Interesting article.

    Again, 1322/6.7ghz on mem. Above Zotac Amp in both cases. Easy to hit 1300 I guess ;) and it still won't be as hot/noisy or use as many watts at those levels. Not that I'd run either card at max. They're all great cards, it's a consumers dream right now, but NV just seems to be in better position and Ryan's comments were just out of touch with reality.
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Well as far as the overclocking that's almost all amd people were left with since the 600 series nVidia released.
    All the old whines were gone - except a sort of memory whine. That gets proven absolutely worthless, but it never ends anyway.
    Amd does not support their cards with drivers properly like nvida, that's just a fact they cannot get away from, no matter how many people claim it's a thing of the past it comes up every single launch, and then continues - that INCLUDES this current / latest amd card released.
    So... it's not a thing of the past.
    No matter how many amd liars say so, they're lying.
  • CeriseCogburn - Saturday, August 25, 2012 - link

    I saw this when their article hit but here is a good laugh... after the you know who fans found it so much fun to attack nVidia about " rejected chips" that couldn't make the cut, look what those criticizers got from their mad amd masters !
    " These numbers paint an interesting picture, albeit not one that is particularly rosy. For the 7970 AMD was already working with top bin Tahiti GPUs, so to make a 7970GE they just needed to apply a bit more voltage and call it a day. The 7950 on the other hand is largely composed of salvaged GPUs that failed to meet 7970 specifications. GPUs that failed due to damaged units aren’t such a big problem here, but GPUs that failed to meet clockspeed targets are another matter. As a result of the fact that AMD is working with salvaged GPUs, AMD has to apply a lot more voltage to a 7950 to guarantee that those poorly clocking GPUs will correctly hit the 925MHz boost clock. "
    ha
    ROFLMHO - oh that great, great, great 40% overclocker needs LOTS OF EXTRA VOLTAGE - TO HIT 925 mhz ..
    LOL
    http://www.anandtech.com/show/6152/amd-announces-n...
    Oh man you can't even make this stuff up !
    HAHAHAHAHHAHAHAHAHAAAAAaaaa
  • Ambilogy - Saturday, August 25, 2012 - link

    Oh you were comparing it to the 7950? I was promoting the 7870 :) in the spanish forums they did they own kind of review because they don't trust this kind of page reviews and the OC 7870 of a member performs better than the OC 660TI.

    So if we talk about the 7950

    The winner is clear, the 7950 wins, you are all facts well deal with this:
    Techpowerup did the most quantity of games, also reviewed the 660TI in 4 diferent reviews for each edition, you can talk all you want nvidia fanboys but techpowerup showed that for your 1080p, 7950 is 5% slower than 660TI, but then w1zzard himself has a post in the forum that you have to suppose a 5% increase in performance for 7950 for the boost he did not include. Which yields equal performance at average, not only that but tom's hardware shows something you have forgotten, minimum FPS rendered in the games, which shows 660TI horrible minimum FPS that indicate a very unstable card, my guess is your god card has very high highs for the good GPU core but when things get demanding the memory bandwidth can't keep the pace, inducing some kind of lag segments.

    It's easy, if they render the same performance average in games with almost the same price, the card that wins is the one with the better features: That is GPGPU, frame stability and overclock, which is by far much more important than closed source Physx for 2 games every hundred years. Why? OpenCL is starting to get used more and more, and it's showing awesome results. Why does nvidia cards sell more? well they still tell the reviewers how to review the card to make it look nice, they made a huge hype of their products and they have a huge fanbase that cannot see:

    1- Nvidia is selling chips which only look good today so they have faster obsolescence and therefore they can sell their next series better.
    2- They are completely oblivious to the fact that they see amd cards with a non objective point of view.
    3- Proof of equally performing amd cards with more OC room is usually defended by them talking about the past and attacking the so called amd fanboys as follows:

    "REALLY IS CRAPPY CHIPS from the low end loser harvest they had to OVER VOLT to get to their boost...

    LOL
    LOL\
    OLO
    I mean there it is man - the same JUNK amd fanboys always use to attack nVida talking about rejected chips for lower clocked down the line variants has NOW COME TRUE IN FULL BLOWN REALITY FOR AMD....~!
    HAHHAHAHAHAHAHAHHAHA
    AHHAHAHAHAHAA
    omg !
    hahahahahahhahaha
    ahhahaha
    ahaha
    Holy moly. hahahahahhahahha"

    Telling chips are bad without using them, manipulating info showing reviews that favor nvidia, exaggerating features that are not so important, ignoring some that are.

    Explain to me how what I quoted (in example) changes the fact that I can go and buy a 7950 with pre OC and have same performance in average due to w1zz studies, and even OC more and forget about 660TI. Explain to me, how that overly exaggerated laugh changes the minimum frame rates of the TI and makes it good for no reason. Well It doesn't change anything actually.

    The only cure I see for you fan-guys is get a 7950 and OC it, or buy a good version already, then you would stop complaining the moment you see its not a bad card. And also get the 660TI so you can compare also. You will see no difference that could make you still think AMD cards are crap, you will not see the driver issues, you will notice that physx don't make the difference, and hopefully you will be a more balanced person.

    I'm not a fanboy, I like nvidia cards, I have had a couple, and to me the 670 is a great card, but not this 660TI crap, I'm not a fanboy because I know to see when a company makes a meh release.
  • CeriseCogburn - Sunday, August 26, 2012 - link

    I've already had better, so you assume far too much, and of course, are a fool. YOU need to go get the card and see the driver problems, PERSONALLY, instead of talking about two other people on some forum...
    Get some personal experience.
    NEXT: Check out the Civ 5 COMPUTE Perf above - this site has the 6970 going up 6+fps while the GTX570 goes down 30 fps... from the former bench...
    http://www.anandtech.com/show/4061/amds-radeon-hd-...
    LOLOL
    No bias here.....
    The 580 that was left out of this review for COMPUTE scored EQUIVALENT to the 7970, 265.7 fps December of 2010.
    So you want to explain how the 570 goes down, the 580 is left out, and the amd card rises ?
    Yeah, see.... there ya go and famboy - enjoy the cheatie lies.
  • Cliffro - Saturday, September 1, 2012 - link

    The comment section is filled with delusional fanboys from both camps.

    To the Nvidia fanboys, the 600 series is great when you get a working card, that doesn't just randomly start losing performance and then eventually refuse to work at all. Doesn't Red Screen of Death. or get constant "Driver Stopped Responding" errors etc etc. No review mentions these issues.

    To the AMD Fanboys, the drivers really do suck, the grey screen of death issue is/was a pain, card not responding after turning off the monitors after being idle for however long also sounds like a PITA. Again no review has ever mentioned these issues.

    I've been using Nvidia the majority of my time gaming, and have used ATI/AMD as well though. Neither one is perfect, both have moments where they just plain SUCK ASS!

    I'm currently using 2 GTX 560 Ti's and am currently considering up/sidegrading to a single 670/680 or 7970/7950, and during my research I've read horror stories about both the 600 series and the 7000 series. What's funny is everyone ALWAYS says look at the reviews, none of which mention the failures from both camps. none speak of reliability of the cards, because they have them and test them in what a week's time period at most?

    Here's a good example, one of the fastest 670's was the Asus 670 DCII Top, it got rave reviews, but got horrible user reviews because of reliability issues, got discontinued, and is no longer available at Newegg.

    I can see why EVGA dropped their lifetime warranty.

    All of this said, I'm actually leaning towards AMD this round, sure they have issues and even outright failures but they aren't as prominent as the ones I'm reading about from Nvidia. I don't like feeling like I'm playing the lottery when buying a video card, and with the 600 series from Nvidia that's the feeling I'm getting.
  • Cliffro - Saturday, September 1, 2012 - link

    I forgot to say YMMV at the end there.
  • CeriseCogburn - Monday, September 3, 2012 - link

    Right at the newegg 680 card you cherry picked for problems..

    "Cons: More expensive than the other 670's

    Other Thoughts: This card at stock settings will beat a stock GTX680 at stock settings in most games. I think this is the best deal for a video card at the moment.

    I sold my 7970 and bought this as AMD's drivers are so bad right now. Anytime your computer sleeps it will crash, and I was experiencing blue screens in some games. I switched from 6970's in crossfire to the 7970 and wished I had my 6970's back because of the driver issues. This card however has been perfect so far and runs much much cooler than my 6970's! They would heat my office up 20 degrees!

    I also have a 7770 in my HTPC and am experiencing driver issues with it as well. AMD really needs to get there act together with their driver releases! "

    LOL - and I'm sure there isn't an amd model design that has been broken for a lot of purchasers....
    Sure....
    One card, and "others here are rabid fanboys" - well if so, you're a rabid idiot.
  • mrfunk10 - Thursday, September 6, 2012 - link

    lol you've gotta be one of the most ridiculous, blind, hard-headed fanboy troll noobs i've ever seen on the internet. The amd 7 series atm are great cards and at $300 for the 7950 i'm sure they make nvidia sweat. I myself am running a gigabyte windforce 660ti an am very happy with it but mygod can the 79's oc.
  • Cliffro - Saturday, September 8, 2012 - link

    "One card, and "others here are rabid fanboys" - well if so, you're a rabid idiot. "

    Have you not noticed your own constant posting of Pro Nvidia statements, and at the same time bashing AMD? And I said delusional not rabid. Though you may be on to something with that.....

    EVGA recalled a lot of 670 SC's, gave out FTW models(680 PCB) as replacements. Something about a "bad batch".

    Maybe it's an partner problem, maybe it's an Nvidia problem I don't know. But I know Asus DCII cards have lots of low ratings regardless if it's AMD or Nvidia. The Asus 79xx cards with DCII have 3 eggs or less overall, similar to the 6xx series from them. Gigabyte has better ratings, and less negatives than Asus, MSI and even EVGA on some models. So maybe it is a partner problem.

    I also must be imagining my Nvidia TDR errors or drivers/cards crashing (with no recovery) while playing a simple game (Bejeweled 3, yeah I know...) and other games occasionally as well since Nvidia can do nothing wrong in the driver department right? Just like my AMD friend seemed to think I was imagining my AMD driver issues when I had my HD 2900 Pro.

    It's also funny that I'm being attacked by a "Devoted Nvidia fan", and my friends usually consider me a "Devoted Nvidia fan". Go figure. I've never been totally against any company, never anti-Intel or AMD, or Nvidia or ATI/AMD. The only company I have avoided is Hitachi and their hard drives, and Intel initially because honestly their stuff seemed overpriced during the P4 days.

    Maybe I'm just getting cynical as I get older...but Hard Drives started becoming unreliable the last couple of years, and now video cards are suffering more failures than I'm used to seeing. And SSD's with Sandforce seem to suck ass as well reliability wise, they are almost comparable with the 600 series, high speed and more failures than I'm comfortable with. Though in Nvidia's defense even the 600 series isn't as bad as Sandforce or OCZ or Seagate.
