Meet The Gigabyte GeForce GTX 660 Ti OC

Our final GTX 660 Ti of the day is Gigabyte’s entry, the Gigabyte GeForce GTX 660 Ti OC. Unlike the other cards in our review today, this is not a semi-custom card but rather a fully-custom card, which brings with it some interesting performance ramifications.

GeForce GTX 660 Ti Partner Card Specification Comparison
                  GTX 660 Ti (Ref)   EVGA GTX 660 Ti Superclocked   Zotac GTX 660 Ti AMP!   Gigabyte GTX 660 Ti OC
Base Clock        915MHz             980MHz                         1033MHz                 1033MHz
Boost Clock       980MHz             1059MHz                        1111MHz                 1111MHz
Memory Clock      6008MHz            6008MHz                        6608MHz                 6008MHz
Frame Buffer      2GB                2GB                            2GB                     2GB
TDP               150W               150W                           150W                    ~170W
Width             Double Slot        Double Slot                    Double Slot             Double Slot
Length            N/A                9.5"                           7.5"                    10.5"
Warranty          N/A                3 Year                         3 Year + Life           3 Year
Price Point       $299               $309                           $329                    $319

The big difference between a semi-custom and fully-custom card is of course the PCB; fully-custom cards pair a custom cooler with a custom PCB instead of a reference PCB. Partners can go in a few different directions with custom PCBs, using them to reduce the BoM, reduce the size of the card, or even to increase the capabilities of a product. For their GTX 660 Ti OC, Gigabyte has gone in the latter direction, using a custom PCB to improve the card.

On the surface the specs of the Gigabyte GeForce GTX 660 Ti OC are relatively close to our other cards’, particularly the Zotac’s. Like Zotac, Gigabyte is pushing the base clock to 1033MHz and the boost clock to 1111MHz, representing a sizable 118MHz (13%) base overclock and a 131MHz (13%) boost overclock respectively. Unlike the Zotac, however, there is no memory overclock, with Gigabyte shipping the card at the standard 6GHz.

What sets Gigabyte apart here in the specs is that they’ve equipped their custom PCB with better VRM circuitry, which means NVIDIA is allowing them to increase their power target from the GTX 660 Ti standard of 134W to an estimated 141W. This may not sound like much (especially since we’re working with an estimate on the Gigabyte board), but as we’ve seen time and time again, GK104 is power-limited in most scenarios. A good GPU can often boost to higher bins than its power allotment will allow, which means that raising the power target indirectly raises performance. We’ll see how this works in detail in our benchmarks, but for now suffice it to say that even with the same GPU overclock as Zotac, the Gigabyte card usually clocks higher.
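To illustrate why a small power target bump matters, here is a minimal sketch of power-limited boost selection: the GPU steps through fixed clock bins and settles on the fastest one whose estimated board power fits under the target. The bin clocks and per-bin power figures below are invented for illustration (NVIDIA’s actual boost tables aren’t public); only the 134W and 141W targets come from the article.

```python
# Hypothetical model of GPU Boost under a power cap. Bin clocks use an
# assumed 13MHz spacing above the 1111MHz boost clock; the per-bin power
# draws are made-up illustrative numbers, not NVIDIA's real telemetry.

def highest_boost_bin(bins, power_target_w):
    """Return the fastest (clock_mhz, power_w) bin that fits under the target."""
    eligible = [b for b in bins if b[1] <= power_target_w]
    return max(eligible, key=lambda b: b[0]) if eligible else None

# (clock in MHz, estimated board power in W)
bins = [(1111, 128), (1124, 132), (1137, 136), (1150, 140), (1163, 144)]

stock = highest_boost_bin(bins, 134)    # reference 134W power target
raised = highest_boost_bin(bins, 141)   # Gigabyte's estimated 141W target

print(stock)    # the best bin that fits under 134W
print(raised)   # a higher bin unlocked purely by the larger power budget
```

Even with identical clock tables, the card with the higher power target lands in a higher bin, which is exactly the "roundabout" performance gain described above.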

Moving on, Gigabyte’s custom PCB measures 8.4” long, and it doesn’t bear a great resemblance to either the reference GTX 680 PCB or the reference GTX 670 PCB; as near as we can tell it’s completely custom. In terms of design it’s nothing fancy – though like the reference GTX 670 the VRMs are located at the front – and as we’ve said before, the real significance is the higher power target it allows. Otherwise the memory layout is the same as the reference GTX 660 Ti’s, with 6 chips on the front and 2 on the back. Due to its length we’d normally insist on some kind of stiffener for an open air card, but since Gigabyte has set the GPU back far enough, the heatsink mounting alone provides enough rigidity.

Sitting on top of Gigabyte’s PCB is a dual fan version of Gigabyte’s new Windforce cooler. The Windforce 2X cooler on their GTX 660 Ti is a bit of an abnormal dual fan cooler, pairing a relatively sparse aluminum heatsink with unusually large 100mm fans. The result is a card that is quite large and more fan than heatsink, which is not something we’ve seen before.

The heatsink itself is divided up into three segments over the length of the card, with a pair of copper heatpipes connecting them. The bulk of the heatsink is over the GPU, while a smaller portion is at the rear and an even smaller portion is at the front, which is also attached to the VRMs. The frame holding the 100mm fans is then attached at the top, anchored at either end of the heatsink. Altogether this cooling contraption is both longer and taller than the PCB itself, bringing the final length of the card to nearly 10”.

Finishing up the card we find the usual collection of ports and connections. This means 2 PCIe power sockets and 2 SLI connectors on the top, and 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 on the front. Meanwhile toolless case users will be happy to see that the heatsink is well clear of the bracket, so toolless clips are more or less guaranteed to work here.

Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s not included in the box or listed on the box, the Gigabyte GeForce GTX 660 Ti OC works with Gigabyte’s OC Guru II overclocking software, which is available on Gigabyte’s website. Gigabyte has had OC Guru for a number of years now, and with this being the first time we’ve seen OC Guru II we can say it’s greatly improved from the functional and aesthetic mess that defined the previous versions.

While it won’t be winning any gold medals, in our testing OC Guru II gets the job done. Gigabyte offers all of the usual tweaking controls (including the necessary power target control), along with card monitoring/graphing and an OSD. Its only real sin is that Gigabyte hasn’t implemented sliders on its controls, meaning you’ll need to press and hold buttons in order to dial in a setting. This is less than ideal, especially when you’re trying to raise the 6000MHz memory clock by an appreciable amount.

Wrapping things up, the Gigabyte GeForce GTX 660 Ti OC comes with Gigabyte’s standard 3 year warranty. Gigabyte will be releasing it at an MSRP of $319, $20 over the price of a reference-clocked GTX 660 Ti and $10 less than the most expensive card in our roundup today.

313 Comments

  • Oxford Guy - Thursday, August 16, 2012 - link

    What is with the 285 being included? It's not even a DX 11 card.

    Where is the 480? Why is the 570 included instead of the 580?

    Where is the 680?
  • Ryan Smith - Saturday, August 18, 2012 - link

    The 285 was included because I wanted to quickly throw in a GTX 285 card where applicable, since NVIDIA is promoting the GTX 660 Ti as a GTX 200 series upgrade. Basically there was no harm in including it where we could.

    As for the 480, it's equivalent to the 570 in performance (eerily so), so there's never a need to break it out separately.

    And the 680 is in Bench. It didn't make much sense to include a card $200 more expensive which would just compress the results among the $300 cards.
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So you're saying the 680 is way faster than the 7970 which you included in every chart, since the 7970 won't compress those $300 card results.
    Thanks for admitting that the 7970 is so much slower.
  • Pixelpusher6 - Friday, August 17, 2012 - link

    Thanks Ryan. Great review as always.

    I know one of the differentiating factors for the Radeon 7950 is its 3GB of RAM, but I was curious: are there any current games that will max out 2GB of RAM with high resolution, AA, etc.?

    I think it's interesting how similar AMD's and Nvidia's GPUs are this generation. I believe Nvidia will be releasing the GTX 660 non-Ti based on GK106. Leaked specs seem to be similar to this card but with the texture units reduced to 64. I wonder how much of a performance reduction this will account for. I think it will be hard for Nvidia to match the performance/$ of, say, the GTX 460 / 560 Ti this generation, because GK104 has to fill in more market segments.

    Also, I wasn't aware that Nvidia was still having trouble meeting demand for GK104 chips; I thought those issues were all cleared up. I think when AMD released their 7000 series chips they should have taken advantage of being first to market and been more competitive on price to grow market share rather than increase margins. At that time someone sitting on 8800GT era hardware would be hard pressed to upgrade, knowing that AMD's inflated prices would come down once Nvidia brought their GPUs to market. People who hold on to their cards for a number of years are unlikely to upgrade 6 months later to Nvidia's product. If AMD's cards had been priced lower at the time, a lot more people would have bought them, beating Nvidia before they even had a card on the market. I do give some credit to AMD for preparing for this launch and adjusting prices, but in my opinion this should have been done much earlier. AMD management needs to be more aggressive and catch Nvidia off guard, rather than just reacting to whatever they do. I would "preemptively" strike at the GTX 660 non-Ti by lowering the 7850 to $199. Instead it seems they'll follow the trend and keep it at $240-250 right up until the launch of the GTX 660, then lower it to $199.
  • Ryan Smith - Saturday, August 18, 2012 - link

    Pixelpusher, there are no games we test that max out 2GB of VRAM out of the box. 3GB may one day prove to be advantageous, but right now even at multi-monitor resolutions 2GB is doing the job (since we're seeing these cards run out of compute/render performance before they run out of RAM).
  • Sudarshan_SMD - Friday, August 17, 2012 - link

    Where are naked images of the card?
  • CeriseCogburn - Thursday, August 23, 2012 - link

    You don't undress somebody you don't love.
  • dalearyous - Friday, August 17, 2012 - link

    it seems the biggest disappointment i see in comments is the price point.

    but if this card comes bundled with borderlands 2, and you were already planning on buying borderlands 2 then this puts the card at $240, worth it IMO.
  • rarson - Friday, August 17, 2012 - link

    but it's the middle of freaking August. While Tahiti was unfortunately clocked a bit lower than it probably should have been, and AMD took a bit too long to bring out the GE edition cards, Nvidia is now practically 8 months behind AMD, having only just released a $300 card. (In the 8 months that have gone by since the release of the 7950, its price has dropped from $450 to $320, effectively making it a competitor to the 660 Ti. AMD is able to compete on price with a better-performing card by virtue of the fact that it simply took Nvidia too damn long to get their product to market.) By the time the bottom end appears, AMD will be ready for Canary Islands.

    It's bad enough that Kepler (and Fermi, for that matter) was so late and effectively unavailable for several months, but it's taking forever to simply roll out the lower-tier products (and yes, I know 28nm wafers have been in short supply, but that's partially due to Nvidia's crappy Kepler yields... AMD have not had such supply problems). Can you imagine what would have happened if Nvidia actually tried to release GK110 as a consumer card? We'd have NOTHING. Hot, unmanufacturable nothing.

    Nvidia needs to get their shit together. At the rate they're going, they'll have to skip an entire generation just to get back on track. I liked the 680 because it was a good performer, but that doesn't do consumers any good when it's 4 months late to the party and almost completely unavailable. Perhaps by the end of the year, 28nm will have matured enough and Nvidia will be able to design something that yields decently while still offering the competitiveness that the 680 brought us, because what I'd really like to see is both companies releasing good cards at the same time. Thanks to Fermi and Kepler, that hasn't happened for a while now. Us consumers benefit from healthy competition and Nvidia has been screwing that up for everyone. Get it together, Nvidia!
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So as any wacko fanboy does, you fault nVidia for releasing a card later that drives the very top end tier amd cards down from the $579-plus-shipping I paid to $170 less plus 3 free games.
    Yeah buddy, it's all nVidia's fault, and they need to get their act together, and if they do in fact get their act together, you can buy the very top amd card for $150, because that's likely all it will be worth.
    Good to know it's all nVidia's fault. AMD goes from $579 plus shipping to $409 and 3 free games, and nVidia sucks for not having its act together.
    The FDA as well as the EPA should ban the koolaid you're drinking.
