Meet The GeForce GTX 660

For virtual launches it’s often difficult for us to acquire reference clocked cards, since NVIDIA doesn’t directly sample the press with reference cards, and today’s GeForce GTX 660 launch is one of those times. The problem stems from the fact that NVIDIA’s partners are hesitant to offer reference clocked cards to the press since they don’t want to lose to factory overclocked cards in benchmarks, which is an odd (but reasonable) concern.

For today’s launch we were able to get a reference clocked card, but in order to do so we had to agree not to show the card or name the partner who supplied it. As it turns out, this isn’t a big deal, since the card we received is for all practical purposes identical to NVIDIA’s reference GTX 660, which NVIDIA has supplied pictures of. So let’s take a look at the “reference” GTX 660.

The reference GTX 660 is in many ways identical to the GTX 670, which comes as no great surprise given the similar size of their PCBs, which in turn allows NVIDIA to reuse the same cooler with little modification. Like the GTX 670, the reference GTX 660 is 9.5” long, with the PCB itself accounting for just 6.75” of that length while the blower and its housing make up the rest. The size of retail cards will vary between these two lengths: partners like EVGA will be implementing their own blowers similar to NVIDIA’s, while other partners like Zotac will be using open air coolers not much larger than the reference PCB itself.

Breaking open one of our factory overclocked GTX 660 cards (specifically, our EVGA GTX 660 SC, which uses the NVIDIA reference PCB), we can see that while the GTX 670 and GTX 660 are superficially similar on the outside, the PCB itself is quite different. The biggest change here is that while the GTX 670 PCB made the unusual move of putting the VRM circuitry towards the front of the card, the GTX 660 PCB once more puts it on the far side. With the GTX 670 this was a design choice to get the PCB down to 6.75”, whereas the GTX 660 requires so little VRM circuitry in the first place that it’s no longer necessary to put that circuitry at the front of the card to find the necessary space.

Looking at the GK106 GPU itself, we can see that not only is the GPU smaller than GK104, but the entire GPU package has been reduced in size as well. Meanwhile, though it makes no functional difference, GK106 is a bit more rectangular than GK104.

Moving on to the GTX 660’s RAM, we find something quite interesting. Up until now NVIDIA and their partners have regularly used Hynix 6GHz GDDR5 memory modules, with that specific RAM showing up on every GTX 680, GTX 670, and GTX 660 Ti we’ve tested. The GTX 660, meanwhile, is the very first card we’ve seen equipped with Samsung’s 6GHz GDDR5 memory modules, marking the first time we’ve seen non-Hynix memory on a GeForce GTX 600 card. Truth be told, though it has no technical implications, we’ve seen so many Hynix-equipped cards from both AMD and NVIDIA that it’s refreshing to see that there is in fact more than one GDDR5 supplier in the marketplace.
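
As a quick back-of-the-envelope illustration of what those 6GHz (6Gbps effective) modules are good for, the sketch below works out peak memory bandwidth. It assumes the GTX 660’s 192-bit memory bus and the roughly 6GHz effective data rate, neither of which is spelled out in this section, so treat the result as an estimate rather than a measured figure.

```
# Rough peak-bandwidth estimate for the reference GTX 660's memory subsystem.
# Assumes a 192-bit memory interface and ~6Gbps effective GDDR5 data rate.

bus_width_bits = 192        # assumed GTX 660 memory interface width
data_rate_gbps = 6.0        # effective per-pin data rate of the 6GHz GDDR5 modules

peak_bandwidth_gbps = bus_width_bits * data_rate_gbps  # gigabits per second
peak_bandwidth_GBps = peak_bandwidth_gbps / 8          # gigabytes per second

print(f"Peak memory bandwidth: ~{peak_bandwidth_GBps:.0f} GB/s")  # ~144 GB/s
```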

For the 2GB GTX 660, NVIDIA has outfitted the card with eight 2Gb memory modules, four on the front and four on the rear. Oddly enough there aren’t any vacant RAM pads on the 2GB reference PCB, so it’s not entirely clear what partners are doing for their 3GB cards; presumably there’s a second reference PCB specifically built to house the 12 memory modules those cards need.
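
To make the capacity math explicit, here’s a minimal sketch of how eight 2Gb modules add up to 2GB, and why a 3GB card would need 12 of them; the 12-module figure is our inference from the chip density rather than anything NVIDIA has detailed.

```
# Memory capacity from module count and per-module density (2Gb GDDR5 chips).
MODULE_DENSITY_GBIT = 2  # each module stores 2 gigabits

def capacity_gb(num_modules: int) -> float:
    """Total capacity in gigabytes for a given number of 2Gb modules."""
    return num_modules * MODULE_DENSITY_GBIT / 8

print(capacity_gb(8))   # 2.0 -> the reference 2GB card (4 modules front, 4 rear)
print(capacity_gb(12))  # 3.0 -> why 3GB cards would need 12 such modules
```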

Elsewhere we can find the GTX 660’s sole PCIe power socket on the rear of the card, responsible for supplying the other 75W the card needs. As for the front of the card, here we can find the card’s single SLI connector, which, like previous generation mainstream video cards, supports up to 2-way SLI.
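
For context on why a single power socket suffices, here’s a simple power-budget sketch. The 75W figures for the PCIe slot and a 6-pin connector come from the PCIe specifications; the 140W TDP is the commonly cited figure for the GTX 660 and isn’t stated in this section, so treat it as an assumption for illustration.

```
# Power budget sketch: PCIe slot plus one 6-pin connector vs. the card's TDP.
SLOT_POWER_W = 75      # a PCIe x16 slot can deliver up to 75W
SIX_PIN_POWER_W = 75   # a 6-pin PCIe power connector is rated for 75W
GTX_660_TDP_W = 140    # assumed TDP for the reference GTX 660

available_w = SLOT_POWER_W + SIX_PIN_POWER_W
headroom_w = available_w - GTX_660_TDP_W

print(f"Available: {available_w}W, TDP: {GTX_660_TDP_W}W, headroom: {headroom_w}W")
# Available: 150W, TDP: 140W, headroom: 10W
```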

Finally, looking at display connectivity we once more see NVIDIA’s standard GTX 600 series display configuration. The reference GTX 660 is equipped with one DL-DVI-D port, one DL-DVI-I port, one full size HDMI 1.4 port, and one full size DisplayPort 1.2. Like GK104 and GK107, GK106 can drive up to 4 displays, meaning all 4 ports can be put into use simultaneously.

Comments

  • Amgal - Friday, September 14, 2012

    A little off topic, but does AnandTech have an article explaining TUs, SMXes, ROPs, shader clocks, etc. - basically explaining the new graphics card architectures? I really enjoy their informative articles, and am having some trouble finding one on that area that isn't littered with incomprehensible computer science macros. Thanks.
  • pattycake0147 - Friday, September 14, 2012

    If the majority of cards available for sale have custom coolers, why are noise measurements taken for only the reference card? Especially when you've stated that you have custom cards in the lab.
  • Jad77 - Friday, September 14, 2012

    But shouldn't AMD be releasing their next generation sometime soon?
  • Patflute - Friday, September 14, 2012

    Months from now.
  • rarson - Friday, September 14, 2012

    Can we please stop pretending that Nvidia's supply issues are anybody's fault but their own? Is it just a coincidence that Fermi and Kepler both were huge, horrible misfires or is it possible that Nvidia has struggled to design things that actually yield decently? Can we stop ignoring the fact that AMD has had an entire lineup of 28nm parts since March (you know, like 2 months before Kepler ever appeared in reasonable quantities)? Yeah, 28nm IS constrained, but other companies are still putting out parts. Nvidia can't put out parts because they have to throw them away. They're eating the wafers (they must be eating a lot of them if it took them this long to bring out a $300 part).

    I hope Nvidia can pull it together because at this rate, AMD's going to start launching a generation ahead of them (they already have all of the console business).
  • CeriseCogburn - Thursday, November 29, 2012

    nVidia dropped its production purchased spots, so you amd fanboys could blow giant dollars on nearly unavailable amd crap overpriced crashing non pci-e3 gen compliant video card trash
    you did so
    Well not you, but you know what I mean
    Then nVidia released and 2 days before amd "magically" had supply in the channels.
    If you're too stupid to know that - well - sorry since it's obvious
    Then amd crashed its prices 4 times, and amd fanboys were left raped
    Then amd fired 10% more and now 15% more
    I hope the amd golden parachutes for the criminal executives pleased you
    What's your guess on the amd buyout rumors ?
    My guess is that 3G of ram you fools tried to lie about having an advantage with the totaled and incapable gpu choking on dirt below it at frame rates no Skyrim player could possibly stand, won't be receiving "driver updates" for that "glorious future" when "new games" that "can make use of it" "become available" !
    right fan boy ?
    RIGHT
    LOL
    Have a nice cry, err I meant day.
  • Lepton87 - Friday, September 14, 2012

    This card is obviously slower than 7870.

    http://tpucdn.com/reviews/MSI/GTX_660_Twin_Frozr_I...

    Just look at performance summaries from other sites. But the most glaring flaw of this review is NOT comparing it to OC'ed AMD cards. After OC, even the 7850 is going to obliterate this overpriced card, which has almost no clock headroom.
  • Lepton87 - Friday, September 14, 2012

    Unfortunately AnandTech is playing favourites. It's the only site I know of with a somewhat decent reputation that just couldn't admit that the 7970GE is simply a faster card than the GTX 680, and now this....
  • CeriseCogburn - Thursday, November 29, 2012

    Oh come on quarky, Crysis Warhead and Metro first on every review doesn't do it for you ?
    The alphabet here goes A for amd first, then C, then jumps to M, for amd, again and again.
    Why so sour, because amd is almost toast ?
  • CeriseCogburn - Thursday, November 29, 2012

    100%, vs 103%, at a single resolution, the 1920x1200, when 1920x1080 shows another story, and the 7850 is down low at 85%.

    LOL - yeah amd fanboy, you sure are telling this amd fanboy site..

    Can we count how CRAPPY amd drivers are ? Can we count no adaptive v-sync on amd crap cards, can we count no 4 monitors out of the box on amd cards, can we count no auto overclocking, can we count amd slashing its staff and driver writers aka Catalyst maker issues ?
    Can we count any of that, or should we just count 3% ? LOL
    Oh wait fair and above it all amd fanboy, I know the answer...
    We will just count 3 more frames per 100 frame rate, at a single resolution, at your single link, and ignore everything else.
    LOL
    Thank you for your support.
