Meet The Gigabyte GeForce GTX 660 Ti OC

Our final GTX 660 Ti of the day is Gigabyte’s entry, the Gigabyte GeForce GTX 660 Ti OC. Unlike the other cards in our review today, this is not a semi-custom card but rather a fully-custom card, which brings with it some interesting performance ramifications.

GeForce GTX 660 Ti Partner Card Specification Comparison

               GTX 660 Ti (Ref)   EVGA GTX 660 Ti Superclocked   Zotac GTX 660 Ti AMP!   Gigabyte GTX 660 Ti OC
Base Clock     915MHz             980MHz                         1033MHz                 1033MHz
Boost Clock    980MHz             1059MHz                        1111MHz                 1111MHz
Memory Clock   6008MHz            6008MHz                        6608MHz                 6008MHz
Frame Buffer   2GB                2GB                            2GB                     2GB
TDP            150W               150W                           150W                    ~170W
Width          Double Slot        Double Slot                    Double Slot             Double Slot
Length         N/A                9.5"                           7.5"                    10.5"
Warranty       N/A                3 Year                         3 Year + Life           3 Year
Price Point    $299               $309                           $329                    $319

The big difference between a semi-custom and fully-custom card is of course the PCB; fully-custom cards pair a custom cooler with a custom PCB instead of a reference PCB. Partners can go in a few different directions with custom PCBs, using them to reduce the BoM, reduce the size of the card, or even to increase the capabilities of a product. For their GTX 660 Ti OC, Gigabyte has taken the last approach, using a custom PCB to improve the card.

On the surface the specs of the Gigabyte GeForce GTX 660 Ti OC are relatively close to those of our other cards, primarily the Zotac. Like Zotac, Gigabyte is pushing the base clock to 1033MHz and the boost clock to 1111MHz, representing a sizable 118MHz (13%) base overclock and a 131MHz (13%) boost overclock respectively. Unlike the Zotac, however, there is no memory overclock; Gigabyte ships the card at the standard 6GHz.
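Those percentages are easy to verify; here is a quick back-of-the-envelope check (our own throwaway calculation, not part of the review):

```python
# Reference vs. Gigabyte factory clocks in MHz, from the spec table above
ref_base, ref_boost = 915, 980
gb_base, gb_boost = 1033, 1111

base_oc = gb_base - ref_base      # 118MHz base overclock
boost_oc = gb_boost - ref_boost   # 131MHz boost overclock

base_pct = 100 * base_oc / ref_base     # ~12.9%, i.e. 13% rounded
boost_pct = 100 * boost_oc / ref_boost  # ~13.4%, i.e. 13% rounded

print(f"+{base_oc}MHz ({base_pct:.0f}%) base, +{boost_oc}MHz ({boost_pct:.0f}%) boost")
# → +118MHz (13%) base, +131MHz (13%) boost
```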

What sets Gigabyte apart in the specs is that they’ve equipped their custom PCB with better VRM circuitry, which is why NVIDIA is allowing them to raise the power target from the GTX 660 Ti standard of 134W to an estimated 141W. This may not sound like much (especially since we’re working with an estimate for the Gigabyte board), but as we’ve seen time and time again, GK104 is power-limited in most scenarios. A good GPU can often reach higher boost bins than the available power will sustain, so raising the power target increases performance in a roundabout way. We’ll see how this works in detail in our benchmarks, but for now suffice it to say that even with the same GPU overclock as Zotac, the Gigabyte card usually clocks higher.
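To make that mechanism concrete, here is a toy model of power-limited boosting. The bin table, power figures, and function are entirely our own illustrative assumptions; NVIDIA’s actual GPU Boost algorithm is more sophisticated and reacts dynamically to real measured power draw:

```python
def highest_sustainable_clock(base_mhz, bin_table, power_target_w):
    """Return the highest boost bin whose estimated board power fits
    under the power target; fall back to the base clock otherwise.
    Purely illustrative -- not NVIDIA's real algorithm."""
    clock = base_mhz
    for bin_mhz, est_power_w in bin_table:
        if est_power_w <= power_target_w:
            clock = max(clock, bin_mhz)
    return clock

# Hypothetical (clock, estimated power) pairs for successive ~13MHz boost bins
bins = [(1097, 130), (1110, 133), (1123, 136), (1136, 139), (1149, 142)]

print(highest_sustainable_clock(1033, bins, 134))  # reference 134W target → 1110
print(highest_sustainable_clock(1033, bins, 141))  # Gigabyte's ~141W target → 1136
```

Same GPU and same boost table, but the higher power target sustains two bins more in this toy example; that is the roundabout performance gain described above.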

Moving on, Gigabyte’s custom PCB measures 8.4” long, and in terms of design it doesn’t bear a great resemblance to either the reference GTX 680 PCB or the reference GTX 670 PCB; as near as we can tell it’s completely custom. The design is nothing fancy – though like the reference GTX 670, the VRMs are located at the front – and as we’ve said before, the real significance is the higher power target it allows. Otherwise the memory layout is the same as the reference GTX 660 Ti, with 6 chips on the front and 2 on the back. Given the card’s length we’d normally insist on some kind of stiffener for an open air card, but since Gigabyte has placed the GPU far enough back, the heatsink mounting alone provides enough rigidity.

Sitting on top of Gigabyte’s PCB is a dual fan version of Gigabyte’s new Windforce cooler. The Windforce 2X cooler on their GTX 660 Ti is an unusual dual fan design, pairing a relatively sparse aluminum heatsink with uncommonly large 100mm fans. This makes the card quite large and, in the process, more fan than heatsink – not something we’ve seen before.

The heatsink itself is divided into three segments over the length of the card, with a pair of copper heatpipes connecting them. The bulk of the heatsink sits over the GPU, while a smaller portion is at the rear and an even smaller portion is at the front, where it also makes contact with the VRMs. The frame holding the 100mm fans is then attached at the top, anchored at either end of the heatsink. Altogether this cooling contraption is both longer and taller than the PCB itself, bringing the card to nearly 10” long overall.

Finishing up the card we find the usual collection of ports and connections. This means 2 PCIe power sockets and 2 SLI connectors on the top, and 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 on the front. Meanwhile toolless case users will be happy to see that the heatsink is well clear of the bracket, so toolless clips are more or less guaranteed to work here.

Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s not included in the box or listed on the box, the Gigabyte GeForce GTX 660 Ti OC works with Gigabyte’s OC Guru II overclocking software, which is available on Gigabyte’s website. Gigabyte has had OC Guru for a number of years now, and with this being the first time we’ve seen OC Guru II we can say it’s greatly improved from the functional and aesthetic mess that defined the previous versions.

While it won’t be winning any gold medals, in our testing OC Guru II gets the job done. Gigabyte offers all of the usual tweaking controls (including the necessary power target control), along with card monitoring/graphing and an OSD. Its only real sin is that Gigabyte hasn’t implemented sliders on their controls, meaning you’ll need to press and hold buttons to dial in a setting. This is less than ideal, especially when you’re trying to raise the 6GHz memory clock by an appreciable amount.

Wrapping things up, the Gigabyte GeForce GTX 660 Ti OC comes with Gigabyte’s standard 3 year warranty. Gigabyte will be releasing it at an MSRP of $319, $20 over the price of a reference-clocked GTX 660 Ti and $10 less than the most expensive card in our roundup today.


