Meet The Gigabyte GeForce GTX 660 Ti OC

Our final GTX 660 Ti of the day is Gigabyte’s entry, the Gigabyte GeForce GTX 660 Ti OC. Unlike the other cards in our review today, this is not a semi-custom card but rather a fully-custom card, which brings with it some interesting performance ramifications.

GeForce GTX 660 Ti Partner Card Specification Comparison
              GeForce GTX 660 Ti (Ref)  EVGA GTX 660 Ti Superclocked  Zotac GTX 660 Ti AMP!  Gigabyte GTX 660 Ti OC
Base Clock    915MHz                    980MHz                        1033MHz                1033MHz
Boost Clock   980MHz                    1059MHz                       1111MHz                1111MHz
Memory Clock  6008MHz                   6008MHz                       6608MHz                6008MHz
Frame Buffer  2GB                       2GB                           2GB                    2GB
TDP           150W                      150W                          150W                   ~170W
Width         Double Slot               Double Slot                   Double Slot            Double Slot
Length        N/A                       9.5"                          7.5"                   10.5"
Warranty      N/A                       3 Year                        3 Year + Life          3 Year
Price Point   $299                      $309                          $329                   $319

The big difference between a semi-custom and fully-custom card is of course the PCB; fully-custom cards pair a custom cooler with a custom PCB instead of a reference PCB. Partners can go in a few different directions with custom PCBs, using them to reduce the BoM, reduce the size of the card, or even to increase the capabilities of a product. For their GTX 660 Ti OC, Gigabyte has gone in the latter direction, using a custom PCB to improve the card.

On the surface the specs of the Gigabyte GeForce GTX 660 Ti OC are quite close to those of our other cards, particularly the Zotac. Like Zotac, Gigabyte is pushing the base clock to 1033MHz and the boost clock to 1111MHz, representing a sizable 118MHz (13%) base overclock and a 131MHz (13%) boost overclock respectively. Unlike the Zotac, however, there is no memory overclocking taking place, with Gigabyte shipping the card at the standard 6GHz.
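
For reference, the arithmetic behind those percentages is simple; here’s a quick Python sketch using the clocks from the specification table above:

# Sanity-checking the factory overclock percentages quoted above.
ref_base, ref_boost = 915, 980      # reference GTX 660 Ti clocks, MHz
oc_base, oc_boost = 1033, 1111      # Gigabyte GTX 660 Ti OC clocks, MHz

print(f"Base:  +{oc_base - ref_base} MHz ({(oc_base - ref_base) / ref_base:.1%})")
print(f"Boost: +{oc_boost - ref_boost} MHz ({(oc_boost - ref_boost) / ref_boost:.1%})")
# Base:  +118 MHz (12.9%)
# Boost: +131 MHz (13.4%)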

What sets Gigabyte apart here in the specs is that they’ve equipped their custom PCB with better VRM circuitry, which means NVIDIA is allowing them to raise their power target from the GTX 660 Ti standard of 134W to an estimated 141W. This may not sound like much (especially since we’re working with an estimate on the Gigabyte board), but as we’ve seen time and time again, GK104 is power-limited in most scenarios. A good GPU can often reach higher boost bins than its power allocation will let it sustain, which means that raising the power target indirectly increases performance. We’ll see how this works in detail in our benchmarks, but for now it suffices to say that even with the same GPU overclock as Zotac, the Gigabyte card usually clocks higher.
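
To illustrate the mechanism, here’s a minimal, hypothetical sketch of how a power-limited boost scheme might pick a clock bin. The bin table and the power model below are invented for illustration; they are not NVIDIA’s actual GPU Boost values.

# Hypothetical power-limited boost bin selection (illustrative only).
BOOST_BINS_MHZ = [980, 993, 1006, 1019, 1032, 1045, 1058, 1071, 1084, 1097, 1110]

def estimated_power(clock_mhz: float) -> float:
    """Toy model: board power rises roughly linearly with clock here.
    The 110W floor and 0.25 W/MHz slope are invented numbers."""
    return 110.0 + (clock_mhz - 980) * 0.25

def pick_bin(power_target_w: float) -> int:
    """Return the highest bin whose estimated draw fits under the target."""
    fitting = [b for b in BOOST_BINS_MHZ if estimated_power(b) <= power_target_w]
    return max(fitting) if fitting else BOOST_BINS_MHZ[0]

print(pick_bin(134.0))  # stock power target  -> 1071, a lower sustained bin
print(pick_bin(141.0))  # raised power target -> 1097, a couple of bins higher

The point is simply that with the same bin table, a higher power ceiling lets the card sit at higher bins more of the time.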

Moving on, Gigabyte’s custom PCB measures 8.4” long, and in terms of design it doesn’t bear a great resemblance to either the reference GTX 680 PCB or the reference GTX 670 PCB; as near as we can tell it’s completely custom. The design is nothing fancy – though like the reference GTX 670 the VRMs are located at the front – and as we’ve said before, the real significance is the higher power target it allows. Otherwise the memory layout is the same as the reference GTX 660 Ti, with 6 chips on the front and 2 on the back. Due to its length we’d normally insist on some kind of stiffener for an open air card, but since Gigabyte has put the GPU back far enough, the heatsink mounting alone provides enough rigidity.

Sitting on top of Gigabyte’s PCB is a dual-fan version of Gigabyte’s new Windforce cooler. The Windforce 2X cooler on their GTX 660 Ti is a bit of an unusual dual-fan design, pairing a relatively sparse aluminum heatsink with uncommonly large 100mm fans. This makes for quite a large card that is more fan than heatsink, which is not something we’ve seen before.

The heatsink itself is divided into three segments over the length of the card, with a pair of copper heatpipes connecting them. The bulk of the heatsink is over the GPU, while a smaller portion is at the rear and an even smaller portion is at the front, the last of which is also attached to the VRMs. The frame holding the 100mm fans is then attached at the top, anchored at either end of the heatsink. Altogether this cooling contraption is both longer and taller than the PCB itself, bringing the final length of the card to nearly 10”.

Finishing up the card we find the usual collection of ports and connections. This means 2 PCIe power sockets and 2 SLI connectors on the top, and 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 on the front. Meanwhile toolless case users will be happy to see that the heatsink is well clear of the bracket, so toolless clips are more or less guaranteed to work here.

Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s neither included in the box nor listed on it, the Gigabyte GeForce GTX 660 Ti OC works with Gigabyte’s OC Guru II overclocking software, which is available on Gigabyte’s website. Gigabyte has had OC Guru for a number of years now, and with this being our first look at OC Guru II, we can say it’s greatly improved from the functional and aesthetic mess that defined the previous versions.

While it won’t be winning any gold medals, in our testing OC Guru II gets the job done. Gigabyte offers all of the usual tweaking controls (including the necessary power target control), along with card monitoring/graphing and an OSD. Its only real sin is that Gigabyte hasn’t implemented sliders on their controls, meaning that you’ll need to press and hold buttons in order to dial in a setting. This is less than ideal, especially when you’re trying to crank up the 6000MHz memory clock by an appreciable amount.

Wrapping things up, the Gigabyte GeForce GTX 660 Ti OC comes with Gigabyte’s standard 3 year warranty. Gigabyte will be releasing it at an MSRP of $319, $20 over the price of a reference-clocked GTX 660 Ti and $10 less than the most expensive card in our roundup today.

Comments

  • TheJian - Sunday, August 19, 2012

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    And it's $350, the only BOOST edition on Newegg two days after this review.

    A full six 660 Tis for $299 (one after rebate). So is it unfair not to include a card that carries what looks like a $50 premium over the Ti? I beg to differ. Also, there are 11 cards available to BUY for the 660 Ti. Nuff said?

    It was rightly picked on.
    Google 7950 boost and you get $349 at the cheapest, with availability next to none. Google 7950b and you don't even get a shopping result. The cheapest Radeon 7950 at Newegg is already $319.99 (most are after rebate). If you're looking at 1920x1200 and below, the 660 Ti is a no-brainer. It is close in the games it loses, and dominates in a few it wins. Not sure why the reference NVIDIA 660 Ti is even in the list; you don't buy that. Zotac's $299 card is basically the bottom of what you'd buy, and it's faster than the reference design at 928MHz/1006MHz boost (not 915/980 boost), so consider the Ti GREEN bar slower than what you'll actually buy for $299. Heck, the 6th card I mentioned at $299 after rebate runs its base at 1019MHz and boost at 1097MHz! So they are clocking regular cards a full 100MHz faster than reference for $299. Another at $309 is like this too (1006/1084 boost). Knowing this, you should be comparing the Zotac AMP (barely faster than the two I mention at $299 and $309) vs. the 7950, which is $320 at minimum!

    Zotac AMP (only 14MHz faster base than the $299/$309 cards) vs. 7950 (again more expensive by $20) @ 1920x1200
    Civ5 <5% slower
    Skyrim >7% faster
    Battlefield3 >25% faster (above 40% or so in FXAA High)
    Portal 2 >54% faster (same in 2560x...even though it's useless IMHO)
    Batman Arkham >6% faster
    Shogun 2 >25% faster
    Dirt3 >6% faster
    Metro 2033 = WASH (Zotac 51.5 vs. 7950 51... margin of error... LOL)
    Crysis Warhead >19% loss.
    Power @ load: 315W Zotac AMP vs. 353W 7950 (vs. 373W for the 7950B)! Not only is the 660 Ti usually faster by a whopping amount, it's also going to cost you less at the register, and far less at the electric bill over the 2-4 years you'll probably have it (assuming you spend $300-350 on a gaming card to GAME on it); rough math on that is sketched below.
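
    As a rough check on the electric-bill argument, the 38W load delta works out as below; the gaming hours and electricity rate are assumed values, not figures from the review.

    # Rough yearly cost of the 353W vs. 315W load-power gap quoted above.
    delta_watts = 353 - 315      # 7950 vs. Zotac AMP at load
    hours_per_day = 3.0          # assumed gaming time per day
    usd_per_kwh = 0.12           # assumed electricity rate

    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * usd_per_kwh:.2f}/year")
    # 42 kWh/year -> $4.99/year at these assumptions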

    For $299 or $309 I'll RUN home with the 660 Ti over the 7950 @ $319. In the games where it loses, you won't notice the difference at those frame rates. At today's BOOST prices ($350) there really isn't a comparison to be made. I believe it will be a while before the 7950B hits $320, let alone the $299 of the 660 Ti.

    NVIDIA did an awesome job here for gamers. I'll wait for Black Friday in a few months, but unless something changes, perf/watt wise I know what I'm upgrading to. I don't play Crysis much :) (ok, none). Seeding higher clocked cards or not, you can BUY them for $299; you can't buy a BOOST for under $350. By your own account, only two makers of the 7950 BOOST. Feel free to retract your comment ;)
  • CeriseCogburn - Sunday, August 19, 2012

    NO ONE plays Crysis anymore, it's merely a placeholder to prop up AMD card stats. It's blatantly sick as Crysis 2 is out.
    It's IMMENSE bias for AMD.
  • Galidou - Sunday, August 19, 2012

    They use Crysis 2 almost everywhere on the internet for one reason: it's heavy. No one plays 3DMark because it's not a game, yet it's always included in reviews because it's relevant to performance.
  • TheJian - Monday, August 20, 2012

    Read it again...He said NOBODY plays CRYSIS. He's confirming what I said.

    The complaint wasn't about Crysis 1... It was about benchmarking a game from 2008 that isn't played, and is based on CryEngine 2, which a total of 7 games have used since 2007: Crysis 1, Warhead, Blue Mars (what? not one Metacritic review), Vigilance (what? no PC version), Merchants of Brooklyn (no reviews), The Day (?), and Entropia (?). Who cares?

    The complaint is that AnandTech should use CRYSIS 2, with the hi-res patch and the DX11 patch, with everything turned on! The CryEngine 3 game engine is used in 23 games, including the coming Crysis 3! Though after a little more homework I still think this would be a victory for AMD; it's far more relevant and not a landslide by any means. But it IS relevant, NV loss or not. Crysis 2 is still being played, and I'm sure Crysis 3 will be for a while soon. 3x the games are made on this engine... Warhead should be tossed and Crysis 2 used, but not without loading the 3 patches that get you all this goodness.
  • Galidou - Monday, August 20, 2012

    Well I meant Crysis, not the 2; I got confused there. Even if no one plays the first one it's still very intensive, but true, they should use Crysis 2 as it's more representative of games played now...
  • CeriseCogburn - Thursday, August 23, 2012

    Yes, we all play 3DMark and upload our scores and compare.
    Not sure about you, you only play one game, which now conveniently got an AMD driver boost.
    Good for AMD, they actually did something for once - although I'll be glad to hear how many times it crashes for you each night @ 1300 WC.
    It will be a LOT. Believe me. 30 mods, not as many as myself, but you'll be going down with CCC often.
  • Galidou - Thursday, August 23, 2012

    Of all the video cards I've had, and I've had a lot, from the GeForce 2 GTS up to my now-retired 6850 Crossfire (just received my Sapphire 7950 OC), I had close to zero problems. How could you know anything about CCC when it's obvious you haven't had an AMD video card in years?

    I have 30 mods because they were already straining my limited video memory, and I had a problem with one of them (realistic sounds of thunder) that turned out to be related to my hi-fi sound card driver (Asus Xonar STX), as I found out lately.

    I had no problems with CCC at all, other than using it to scale my LCD TV so it fits the whole screen, and using my game profiles. I didn't touch it much in the last year. It played Dirt 2 and 3, Skyrim, GTA 4, Fallout 3, Fallout NV, Oblivion!!, and so on without a problem. And yet you try to tell me I'll have problems with a program you don't know a thing about.

    But just so you might appreciate me for my efforts: my wife decided to replace her 4870 for the forthcoming Guild Wars 2, for energy and temperature reasons. So I got her a 660 Ti, as my 6850s were already sold to a friend. She games at 1080p only and I didn't want to overclock her stuff, so it was obvious. At the same time I'll be able to compare both, though I already know I like Nvidia's UI more than AMD's CCC, even if they look quite alike now.

    BTW, just for the sake of it, I searched Google:

    AMD drivers keep crashing:
    3.54 million results

    Nvidia drivers keep crashing:
    3.37 million results
  • CeriseCogburn - Thursday, August 23, 2012

    The reason I say what I do is because I DO HAVE A LOT of AMD cards, you DUMMY.
  • CeriseCogburn - Thursday, August 23, 2012

    you're another idiot that gets everything wrong, attacks others for what they HAVE NOT SAID, gets corrected again and again, makes another crap offshoot lie, then, OF COURSE - HAS A PERFECT DUAL AMD SETUP THAT HAS NEVER HAD A PROBLEM, EVA!
    That means you have very little experience, a freaking teensy tiny tiny bit.
    Look in the mirror dummy.
  • Galidou - Sunday, August 19, 2012

    The 7950B is crap; I don't even want to hear about a reference design with a little boost. On Newegg there are 4 cards out of 18 that are reference, and the others are mainly overclocked models with much better coolers, which will overclock terribly well.

    It's easy for the average user to see the win for Nvidia, considering 20% of the overclock has already been done and there's not much headroom left... Once both are overclocked, the only game where the 660 Ti remains faster is Portal 2.

    The Zotac might only have 14MHz more on the base clock, but the core clock is not the thing here; the Zotac is the best of the pack because it comes with memory overclocked to 6.6GHz, which addresses the 660 Ti's only weakness: memory bandwidth. There's a weird thing here though: I found the minimum fps on another review, but on AnandTech the minimums appeared only in the games where they were less noticeable. Good job again, Nvidia.
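
    The bandwidth claim is easy to quantify: on the GTX 660 Ti's 192-bit memory bus, the stock and Zotac memory clocks translate into throughput as below (a back-of-the-envelope sketch, not figures from the review).

    # Memory bandwidth on the GTX 660 Ti's 192-bit bus, stock vs. Zotac's OC.
    BUS_WIDTH_BYTES = 192 // 8   # a 192-bit bus moves 24 bytes per transfer

    def bandwidth_gb_s(effective_mhz: float) -> float:
        """Effective GDDR5 data rate (MHz) x bus width -> GB/s."""
        return effective_mhz * 1e6 * BUS_WIDTH_BYTES / 1e9

    print(f"Stock 6008MHz: {bandwidth_gb_s(6008):.1f} GB/s")  # 144.2 GB/s
    print(f"Zotac 6608MHz: {bandwidth_gb_s(6608):.1f} GB/s")  # 158.6 GB/s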
