Meet The Gigabyte GeForce GTX 660 Ti OC

Our final GTX 660 Ti of the day is Gigabyte’s entry, the Gigabyte GeForce GTX 660 Ti OC. Unlike the other cards in our review today, this is not a semi-custom card but rather a fully-custom card, which brings with it some interesting performance ramifications.

GeForce GTX 660 Ti Partner Card Specification Comparison

|              | GeForce GTX 660 Ti (Ref) | EVGA GTX 660 Ti Superclocked | Zotac GTX 660 Ti AMP! | Gigabyte GTX 660 Ti OC |
|--------------|--------------------------|------------------------------|-----------------------|------------------------|
| Base Clock   | 915MHz                   | 980MHz                       | 1033MHz               | 1033MHz                |
| Boost Clock  | 980MHz                   | 1059MHz                      | 1111MHz               | 1111MHz                |
| Memory Clock | 6008MHz                  | 6008MHz                      | 6608MHz               | 6008MHz                |
| Frame Buffer | 2GB                      | 2GB                          | 2GB                   | 2GB                    |
| TDP          | 150W                     | 150W                         | 150W                  | ~170W                  |
| Width        | Double Slot              | Double Slot                  | Double Slot           | Double Slot            |
| Length       | N/A                      | 9.5"                         | 7.5"                  | 10.5"                  |
| Warranty     | N/A                      | 3 Year                       | 3 Year + Life         | 3 Year                 |
| Price Point  | $299                     | $309                         | $329                  | $319                   |

The big difference between a semi-custom and a fully-custom card is of course the PCB; fully-custom cards pair a custom cooler with a custom PCB instead of a reference PCB. Partners can go in a few different directions with a custom PCB, using it to reduce the BoM, reduce the size of the card, or increase the capabilities of the product. For their GTX 660 Ti OC, Gigabyte has gone in the last of these directions, using a custom PCB to improve the card.

On the surface the specs of the Gigabyte GeForce GTX 660 Ti OC are relatively close to those of our other cards, primarily the Zotac. Like Zotac, Gigabyte is pushing the base clock to 1033MHz and the boost clock to 1111MHz, representing a sizable 118MHz (13%) base overclock and a 131MHz (13%) boost overclock respectively. Unlike the Zotac, however, there is no memory overclocking taking place, with Gigabyte shipping the card at the standard 6GHz.
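For reference, here’s the quick bit of arithmetic behind those percentages, using the clocks from the table above (a plain calculation, nothing card-specific):

```python
# Overclock deltas for the Gigabyte GTX 660 Ti OC relative to reference clocks.
ref_base, ref_boost = 915, 980      # reference GTX 660 Ti clocks (MHz)
oc_base, oc_boost = 1033, 1111      # Gigabyte GTX 660 Ti OC clocks (MHz)

base_pct = (oc_base - ref_base) / ref_base * 100       # 118 MHz -> ~12.9%
boost_pct = (oc_boost - ref_boost) / ref_boost * 100   # 131 MHz -> ~13.4%

print(f"Base: +{oc_base - ref_base}MHz ({base_pct:.1f}%)")
print(f"Boost: +{oc_boost - ref_boost}MHz ({boost_pct:.1f}%)")
```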

What sets Gigabyte apart here in the specs is that they’ve equipped their custom PCB with better VRM circuitry, which means NVIDIA is allowing them to increase their power target from the GTX 660 Ti standard of 134W to an estimated 141W. This may not sound like much (especially since we’re working with an estimate for the Gigabyte board), but as we’ve seen time and time again, GK104 is power-limited in most scenarios. A good GPU can boost to higher bins than the available power allows, which means that raising the power target indirectly raises performance. We’ll see how this works in detail in our benchmarks, but for now it’s enough to say that even with the same GPU overclock as the Zotac, the Gigabyte card usually clocks higher.
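To illustrate the mechanism, here is a minimal sketch of a power-limited boost decision, assuming a made-up table of clock bins and power estimates. This is not NVIDIA’s actual GPU Boost algorithm, just the general idea that the card settles at the highest bin whose estimated draw still fits under the power target:

```python
# Hypothetical sketch of power-limited boosting. The clock bins and power
# estimates below are invented for illustration; real GPU Boost behavior also
# depends on temperature, voltage, and workload.
BOOST_BINS = [
    # (clock in MHz, estimated board power in W at that clock)
    (1019, 128), (1032, 131), (1045, 134), (1058, 137),
    (1071, 140), (1084, 143), (1097, 146),
]

def highest_sustainable_clock(power_target_w):
    """Return the highest clock whose estimated power fits under the target."""
    fitting = [clock for clock, power in BOOST_BINS if power <= power_target_w]
    return max(fitting) if fitting else min(clock for clock, _ in BOOST_BINS)

print(highest_sustainable_clock(134))  # 134W target -> 1045MHz in this toy model
print(highest_sustainable_clock(141))  # ~141W target -> 1071MHz: same GPU, higher sustained clock
```

The point is simply that with more power headroom, the same silicon spends more time in higher bins.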

Moving on, Gigabyte’s custom PCB measures 8.4” long, and in terms of design it doesn’t bear a great resemblance to either the reference GTX 680 PCB or the reference GTX 670 PCB; as near as we can tell it’s completely custom. The layout is nothing fancy – though like the reference GTX 670 the VRMs are located in the front – and as we’ve said before the real significance is the higher power target it allows. Otherwise the memory layout is the same as the reference GTX 660 Ti, with 6 chips on the front and 2 on the back. Given its length we’d normally insist on some kind of stiffener for an open air card, but since Gigabyte has placed the GPU far enough back, the heatsink mounting alone provides enough rigidity for the card.

Sitting on top of Gigabyte’s PCB is a dual fan version of Gigabyte’s new Windforce cooler. The Windforce 2X cooler on their GTX 660 Ti is a somewhat unusual dual fan design, pairing a relatively sparse aluminum heatsink with unusually large 100mm fans. This makes the card quite large and more fan than heatsink in the process, which is not something we’ve seen before.

The heatsink itself is divided into three segments along the length of the card, with a pair of copper heatpipes connecting them. The bulk of the heatsink sits over the GPU, while a smaller portion is at the rear and an even smaller portion is at the front, the latter also attached to the VRMs. The frame holding the 100mm fans is then attached at the top, anchored at either end of the heatsink. Altogether this cooling contraption is both longer and taller than the PCB itself, bringing the final length of the card to nearly 10”.

Finishing up the card we find the usual collection of ports and connections. This means 2 PCIe power sockets and 2 SLI connectors on the top, along with 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 on the front. Meanwhile, toolless case users will be happy to see that the heatsink is well clear of the bracket, so toolless clips are more or less guaranteed to work here.

Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s not included in the box or listed on the box, the Gigabyte GeForce GTX 660 Ti OC works with Gigabyte’s OC Guru II overclocking software, which is available on Gigabyte’s website. Gigabyte has had OC Guru for a number of years now, and with this being the first time we’ve seen OC Guru II we can say it’s greatly improved from the functional and aesthetic mess that defined the previous versions.

While it won’t be winning any gold medals, in our testing OC Guru II gets the job done. Gigabyte offers all of the usual tweaking controls (including the necessary power target control), along with card monitoring/graphing and an OSD. Its only real sin is that Gigabyte hasn’t implemented sliders on their controls, meaning that you’ll need to press and hold buttons in order to dial in a setting. This is less than ideal, especially when you’re trying to raise the 6000MHz memory clock by an appreciable amount.

Wrapping things up, the Gigabyte GeForce GTX 660 Ti OC comes with Gigabyte’s standard 3 year warranty. Gigabyte will be releasing it at an MSRP of $319, $20 over the price of a reference-clocked GTX 660 Ti and $10 less than the most expensive card in our roundup today.
