Meet The EVGA GeForce GTX 660 Ti Superclocked

Our first card of the day is EVGA’s entry, the EVGA GeForce GTX 660 Ti Superclocked. Among all of the GTX 670 cards we’ve looked at and all of the GTX 660 Ti cards we’re going to be looking at, this is the card that is most like its older sibling. In fact, with only a couple of cosmetic differences, it’s practically identical in construction.

GeForce GTX 660 Ti Partner Card Specification Comparison

              GeForce GTX 660 Ti (Ref)   EVGA GTX 660 Ti Superclocked   Zotac GTX 660 Ti AMP!   Gigabyte GTX 660 Ti OC
Base Clock    915MHz                     980MHz                         1033MHz                 1033MHz
Boost Clock   980MHz                     1059MHz                        1111MHz                 1111MHz
Memory Clock  6008MHz                    6008MHz                        6608MHz                 6008MHz
Frame Buffer  2GB                        2GB                            2GB                     2GB
TDP           150W                       150W                           150W                    ~170W
Width         Double Slot                Double Slot                    Double Slot             Double Slot
Length        N/A                        9.5"                           7.5"                    10.5"
Warranty      N/A                        3 Year                         3 Year + Life           3 Year
Price Point   $299                       $309                           $329                    $319

EVGA will be clocking the GTX 660 Ti SC at 980MHz for the base clock and 1059MHz for the boost clock, which represents a 65MHz (7%) and 79MHz (8%) overclock respectively. Meanwhile EVGA has left the memory clock untouched at 6GHz, the reference memory clockspeed for all of NVIDIA’s GTX 600 parts thus far.
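
As a quick back-of-the-envelope check on those figures, here is a minimal Python sketch (purely illustrative, not anything from EVGA's tooling):

```python
# Quick check of EVGA's factory overclock versus NVIDIA's reference clocks.
REF_BASE, REF_BOOST = 915, 980   # MHz, reference GTX 660 Ti
SC_BASE, SC_BOOST = 980, 1059    # MHz, EVGA GTX 660 Ti Superclocked

def overclock(ref_mhz, oc_mhz):
    """Return the absolute (MHz) and relative (%) increase over reference."""
    delta = oc_mhz - ref_mhz
    return delta, 100.0 * delta / ref_mhz

base_delta, base_pct = overclock(REF_BASE, SC_BASE)
boost_delta, boost_pct = overclock(REF_BOOST, SC_BOOST)
print(f"Base:  +{base_delta}MHz ({base_pct:.0f}%)")    # +65MHz (~7%)
print(f"Boost: +{boost_delta}MHz ({boost_pct:.0f}%)")  # +79MHz (~8%)
```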

The GTX 660 Ti is otherwise identical to the GTX 670, for all of the benefits that entails. While NVIDIA isn’t shipping a proper reference card for the GTX 660 Ti, they did create a reference design, and this appears to be what EVGA’s card is based on. Both the EVGA and Zotac cards are using identical PCBs derived from the GTX 670’s PCB, which is not unexpected given the power consumption of the GTX 660 Ti. The only difference we can find on this PCB is that instead of solder pads for 16 memory chips there are solder pads for 12, reflecting the fact that the GTX 660 Ti can have at most 12 memory chips attached.
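
The 12-chip ceiling follows directly from the narrower memory bus. Here is a minimal sketch of that arithmetic, assuming 32-bit GDDR5 chips doubled up in clamshell mode (a detail not spelled out above):

```python
# Rough sanity check: maximum GDDR5 chip count as a function of bus width.
# Each GDDR5 chip has a 32-bit interface; clamshell mode lets two chips
# share one 32-bit channel, doubling the maximum chip count.
def max_chips(bus_width_bits: int, clamshell: bool = True) -> int:
    channels = bus_width_bits // 32
    return channels * (2 if clamshell else 1)

print(max_chips(256))  # GTX 670 (256-bit bus): up to 16 chips -> 16 solder pads
print(max_chips(192))  # GTX 660 Ti (192-bit bus): up to 12 chips -> 12 solder pads
```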

With this design the PCB measures only 6.75” long, with the bulk of the VRM components located at the front of the card rather than the rear. Hynix 2Gb 6GHz memory chips are placed on both the front and back of the PCB, with 6 on the front and 2 on the rear. The rear chips sit directly behind a pair of front chips, reflecting the fact that all 4 of these chips are connected to a single memory controller.
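
For the curious, the capacity and bus width work out as follows. This is a minimal sketch; the 4/2/2 split across three 64-bit controllers is inferred from the chip placement described above:

```python
# GTX 660 Ti memory layout sanity check: 8x 2Gb (256MB) Hynix GDDR5 chips
# hanging off three 64-bit memory controllers (a 192-bit bus in total),
# with one controller carrying twice as many chips as the other two.
CHIP_DENSITY_GBIT = 2
chips_per_controller = {"MC0": 4, "MC1": 2, "MC2": 2}  # inferred 4/2/2 split

total_chips = sum(chips_per_controller.values())        # 6 front + 2 rear = 8
total_gb = total_chips * CHIP_DENSITY_GBIT / 8           # Gbit -> GB
bus_width = 64 * len(chips_per_controller)               # 64 bits per controller

print(f"{total_chips} chips, {total_gb:.0f}GB, {bus_width}-bit bus")
# -> 8 chips, 2GB, 192-bit bus
```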

With the effective reuse of the GTX 670 PCB, EVGA is also reusing their GTX 670 cooler. This cooler is a blower, and due to the positioning of the GPU and various electronic components the blower fan must sit entirely off of the PCB. Instead the fan is located behind the card in an enclosed piece of housing, which pushes the total length of the card out to 9.5”. Inside the enclosure is a block-style aluminum heatsink with a copper baseplate that provides cooling for the GPU, while attached to the PCB itself is a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.

Elsewhere, at the top of the card we’ll find the 2 PCIe power sockets and 2 SLI connectors. Meanwhile at the front of the card EVGA is using the same I/O port configuration and bracket that we saw with the GTX 670. This means they’re using the NVIDIA standard: 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means that the card features EVGA’s high-flow bracket, a bracket with less shielding in order to maximize the amount of air that can be exhausted.

Rounding out the package is EVGA’s typical collection of accessories and knick-knacks. In the box you’ll find a pair of Molex power adapters, a quick start guide, and some stickers. The real meat of EVGA’s offering is on their website, where EVGA card owners can download their wonderful video card overclocking utility (Precision X) and their stress test utility (OC Scanner X). The powered-by-RivaTuner Precision X and OC Scanner X still set the gold standard for video card utilities thanks to their functionality and ease of use. Personally I’m not a fan of the new UI – circular UIs and sliders aren’t particularly easy to read – but it gets the job done.

Gallery: EVGA X Tools

As with all EVGA cards, the EVGA GeForce GTX 660 Ti Superclocked comes with EVGA’s standard 3 year transferable warranty, with individual 2 or 7 year extensions available for purchase upon registration; registration also unlocks access to EVGA’s step-up upgrade program. Finally, the EVGA GeForce GTX 660 Ti Superclocked will be hitting retail with an MSRP of $309, $10 over the MSRP for reference cards.

Comments

  • CeriseCogburn - Thursday, August 23, 2012 - link

    I really didn't read your rant just skimmed your crybaby whine.
    So who cares you had an emotional blowout. Take some midol.
  • Galidou - Thursday, August 23, 2012 - link

    Attacking and attacking again, you have so much respect it's almost admirable. Respect is the most important thing in the world; if you can't have some even for people you don't know, I'm sorry but you're missing out on something here.
  • Galidou - Thursday, August 23, 2012 - link

    I love it when people state their disrespectful opinion as a fact. Really drives their point home, yep.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Take a look at your 7950 SKYRIM LOSS in triple monitor to the 660Ti and the 660Ti also beats the 7950 boost and the 7970 !

    5760x1080 4x aa 16x af

    ROFLMAO !
    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    YES, YOU DID YOUR "RESEARCH"... now you've lost every stupid argument you started. Stupid.
  • Galidou - Tuesday, September 4, 2012 - link

    http://www.techpowerup.com/reviews/ASUS/GeForce_GT...

    http://www.hardwarecanucks.com/forum/hardware-canu...

    http://www.tomshardware.com/reviews/geforce-gtx-66...

    http://www.anandtech.com/show/6159/the-geforce-gtx...

    Every review shows the 660 Ti below EVEN the 7870, and your review shows the 660 Ti performing at the level of a 7970, flawed bullscrap. Your website has a problem, the same one you have: it has a chosen side, aka fanboyism.

    I have both right now: my wife uses the 660 Ti in her PC for Guild Wars 2 at 1080p, and I bought the 7950. I overclocked both in my PC to test, and the 7950 hands down tramples the GTX 660 Ti even with both fully overclocked. I tested with Skyrim on 3 monitors at 5760*1080, and that's the only game I play.

    Now don't get MAD, I never said the gtx 660 ti is a bad card, it works wonders. But it gets trampled at 5760*1080 in skyrim end of the line...
  • TheJian - Monday, August 20, 2012 - link

    Actually I think they need to raise the clocks, and charge more, accepting the fact they will run hotter and use more watts. At least they can get more for the product, rather than having people saying you can OC them to 1100. Clock the normals at 900/1000 and the 7970 at 1050/1100 or so. Then charge more. Of course NV is putting pricing pressure on them at the same time, but this move would allow them to be worth more out of the box so it wouldn't be as unreasonable. As it is, out of the box right now you can't charge more because they perform so poorly against what is being sold (and benchmarked) in the stores.

    With NV/Intel chewing them from both ends AMD isn't making money. But I think that's their fault with the mhz/pricing they're doing to themselves. They haven't ripped us off since the Athlon won for 3 years straight. Even then, they weren't getting real rich. Just making the profits they should have deserved. Check their 10yr profit summary and you'll see, they have lost 6bil. So I'd have to say they are NOT pricing/clocking their chips correctly, at least for this generation. These guys need to start making more money or they're going to be in bankruptcy by 2014 xmas.
    Last 12 months= sales 6.38bil = PROFITS= - 629 million! They aren't gouging us...They are losing their collective A$$es :(
    http://investing.money.msn.com/investments/stock-p...
    That's a LOSS of 629 million. Go back 10yrs and it's about a 6.x billion loss.

    While I hate the way Ryan did his review, AMD needs all the help they can get I guess... :) But Ryan needs to redo his recommendation (or lack of one) because he just looks like a buffoon when no monitors sell at 2560x1600 (30inchers? only 11, and less than this res), and steampowered.com shows less than 2% use this res also. He looks foolish at best not recommending based on 1920x1200 results which 98% of us use. He also needs to admit that Warhead is from 2008, and should have used Crysis 2 which is using an engine based on 27 games instead of CryEngine 2 from 2007 and only 7 games based on it. It's useless.
  • Galidou - Tuesday, August 21, 2012 - link

    ''profits they should have deserved''

    You speak as if overcoming Intel's and Nvidia's performance were easy and it's all their fault because they do bad work. AMD has a wonderful team; you speak as if you had worked there and seen that they don't do shit, that they just sit in their chairs and that's the result of their work.

    Well it isn't. If you wanna speak like that about AMD, do it if you work there. No one is better placed to say if a company is really good or bad than the employees themselves. So just stop speaking as if designing these 3-billion-plus transistor chips were as easy as saying ''hello, my name is Nvidia fanboy and AMD is crap''.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    AMD is crap. It's crap man, no getting around it.
  • Galidou - Thursday, August 23, 2012 - link

    Too late Cerise, you lost all credibility by not being able to hold an objective opinion (meaning one undistorted by emotions), and instead you proved you're far too emotional to speak about video card manufacturers.

    You too speak as if you had ever worked at AMD, and surely that's not the case; just visiting their headquarters would make your eyes bleed, because in your world that place is akin to hell, with an ambient temperature averaging 200 degrees Celsius, surrounded by walls of flesh, where torture is a common thing. And in the end, the demons poop out video cards and force you to buy them or kill your family.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Your opinion - " i'm did my research ima getting my 7950 for my triple monitor SKYRIM..."

    Take a look at your 7950 SKYRIM LOSS in triple monitor to the 660Ti and the 660Ti also beats the 7950 boost and the 7970 !

    5760x1080 4x aa 16x af

    ROFLMAO !

    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    There isn't a palm big enough in the world to cover your face.
