Meet The GeForce GTX 660

For virtual launches it’s often difficult for us to acquire reference clocked cards since NVIDIA doesn’t directly sample the press with reference cards, and today’s GeForce GTX 660 launch is one of those times. The problem stems from the fact that NVIDIA’s partners are hesitant to offer reference clocked cards to the press, since they don’t want to lose to factory overclocked cards in benchmarks, which is an odd (but reasonable) concern.

For today’s launch we were able to get a reference clocked card, but in order to do so we had to agree not to show the card or name the partner who supplied it. As it turns out this isn’t a big deal, since the card we received is for all practical purposes identical to NVIDIA’s reference GTX 660, which NVIDIA has supplied pictures of. So let’s take a look at the “reference” GTX 660.

The reference GTX 660 is in many ways identical to the GTX 670, which comes as no great surprise given the similar size of their PCBs, which in turn allows NVIDIA to reuse the same cooler with little modification. Like the GTX 670, the reference GTX 660 is 9.5” long, with the PCB itself accounting for just 6.75” of that length while the blower and its housing make up the rest. The length of retail cards will vary between these two figures, as partners like EVGA will be implementing their own blowers similar to NVIDIA’s, while other partners like Zotac will be using open air coolers not much larger than the reference PCB itself.

Breaking open one of our factory overclocked GTX 660 cards (specifically, our EVGA GTX 660 SC, which uses the NVIDIA reference PCB), we can see that while the GTX 670 and GTX 660 are superficially similar on the outside, the PCB itself is quite different. The biggest change is that while the GTX 670 PCB made the unusual move of putting the VRM circuitry towards the front of the card, the GTX 660 PCB once more puts it at the rear. With the GTX 670 this was a design choice to get the PCB down to 6.75”, whereas the GTX 660 requires so little VRM circuitry in the first place that it’s no longer necessary to move that circuitry to the front of the card to find the necessary space.

Looking at the GK106 GPU itself, we can see that not only is the GPU smaller than GK104, but the entire GPU package has been reduced in size as well. Meanwhile, though it makes no functional difference, GK106 is a bit more rectangular than GK104.

Moving on to the GTX 660’s RAM, we find something quite interesting. Up until now NVIDIA and their partners have regularly used Hynix 6GHz GDDR5 memory modules, with that specific RAM showing up on every GTX 680, GTX 670, and GTX 660 Ti we’ve tested. The GTX 660, meanwhile, is the very first card we’ve seen equipped with Samsung’s 6GHz GDDR5 memory modules, marking the first time we’ve seen non-Hynix memory on a GeForce GTX 600 card. Truth be told, though it has no technical implications, we’ve seen so many Hynix-equipped cards from both AMD and NVIDIA that it’s refreshing to see that there is in fact more than one GDDR5 supplier in the marketplace.

For the 2GB GTX 660, NVIDIA has outfitted the card with 8 2Gb memory modules, 4 on the front and 4 on the rear. Oddly enough there aren’t any vacant RAM pads on the 2GB reference PCB, so it’s not entirely clear what partners are doing for their 3GB cards; presumably there’s a second reference PCB specifically built to house the 12 memory modules a 3GB card needs.
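As a quick sanity check on that capacity math (GDDR5 modules are rated in gigabits, while card capacity is quoted in gigabytes), here is a minimal illustrative sketch; the helper function is purely our own and not anything from NVIDIA or its partners:

```python
def card_capacity_gb(module_count: int, density_gbit: float) -> float:
    """Total card memory in GB for a given module count and per-module density in Gb."""
    return module_count * density_gbit / 8  # 8 bits per byte

# Reference 2GB GTX 660: 8 modules x 2Gb each (each 2Gb module is 256MB)
print(card_capacity_gb(8, 2))   # -> 2.0

# A presumed 3GB layout would need 12 of the same 2Gb modules
print(card_capacity_gb(12, 2))  # -> 3.0
```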

Elsewhere we can find the GTX 660’s sole PCIe power socket on the rear of the card, responsible for supplying the other 75W the card needs. As for the front of the card, here we can find the card’s single SLI connector, which, like previous-generation mainstream video cards, supports up to 2-way SLI.
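For context on that power budget, here is a minimal sketch using the standard PCIe power delivery limits (spec-level figures, not measurements from this board): the x16 slot can deliver up to 75W and a single 6-pin connector another 75W, which comfortably covers the GTX 660’s 140W TDP.

```python
# Standard PCIe power delivery limits (per the PCIe spec, not board-specific data)
PCIE_X16_SLOT_W = 75   # maximum draw from the x16 slot
SIX_PIN_W = 75         # maximum draw from a single 6-pin power connector

GTX_660_TDP_W = 140    # NVIDIA's official TDP for the GTX 660

available = PCIE_X16_SLOT_W + SIX_PIN_W
print(f"Available: {available}W, TDP: {GTX_660_TDP_W}W, headroom: {available - GTX_660_TDP_W}W")
# -> Available: 150W, TDP: 140W, headroom: 10W
```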

Finally, looking at display connectivity, we once again find NVIDIA’s standard GTX 600 series display configuration. The reference GTX 660 is equipped with 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full-size HDMI 1.4 port, and 1 full-size DisplayPort 1.2. Like GK104 and GK107, GK106 can drive up to 4 displays, meaning all 4 ports can be put into use simultaneously.

Comments

  • yeeeeman - Saturday, September 15, 2012 - link

    Really, G80 was a revolution on its own. Spectacular jump in performance compared to the previous generation, and combined with the 65nm process technology it gave birth to some of the finest video cards.
    The real setback here is the fact that the gaming industry is driven by the lowest common denominator, and we all know that consoles are the most important. They are sold in the largest quantities, and most games are designed for their power, not higher.
    For PCs, games receive a DX11 treatment, with some fancy features that enhance the quality a little bit, but it can never make up for the fact that the textures and the game are designed for a much slower platform.
    So given these facts, why change my 9600GT, when it can handle pretty much everything?
  • steelnewfie - Saturday, September 15, 2012 - link

    "For the 2GB GTX 660, NVIDIA has outfit the card with 8 2Gb memory modules"

    Should read outfitted.

    Also 8 2Gb memory modules? Did you mean 2GB? Either is incorrect by my math.

    If there are 8 banks should not each module be 256 MB?

    Otherwise, great articles, keep up the good work!
  • Ryan Smith - Saturday, September 15, 2012 - link

    Individual memory modules are labeled by their capacity in bits, not bytes. So each module is 2 gigabits (Gb), which is 256MB. 8x2Gb is how the card ends up with 2 gigabytes (GB) of RAM.
  • MrBubbles - Saturday, September 15, 2012 - link

    Cool, I have a GTX 260 and since NVidia is deliberately breaking their driver support for games like Civ 5 I guess this is the card to get.
  • saturn85 - Saturday, September 15, 2012 - link

    nice folding@home benchmark.
  • JWill97 - Thursday, September 27, 2012 - link

    For me, I really think it's the best card you can buy at this price. I'm not a fan of either NVidia or AMD (neutral), but really, in the $200+ segment nvidia takes it. But I'm still wondering, why aren't all reviewers using Max Payne 3 as one of the game benchmarks? A lot of cards would struggle playing it.
  • Grawbad - Friday, March 1, 2013 - link

    "NVIDIA has spent a lot of time in the past couple of years worrying about the 8800GT/9800GT in particular. “The only card that matters” was a massive hit for the company straight up through 2010, which has made it difficult to get users to upgrade even 4 years later."

    I am one of those. I purchased a 9800 GTX and that sucker runs everything. Mind you, all my other components were quality too, so I didn't bottleneck myself. But this card has run everything I have ever thrown at it. Only recently have I had to start watching the AA a bit. Which is why I am now, 5 years later, in the market for a new card. 5 years.

    Indeed, those cards were astounding.

    Mine was an EVGA 9800 GTX with a lifetime warranty. Thank goodness for that as it finally went out on me this year and I had to RMA it. And now that I am looking into getting a new card it seems EVGA has dropped their lifetime warranty. That makes me sad.

    Anyways, yeah, those were and still are great cards. I mean, if you picked up a 9800 GTX today, you'd still be able to run even the newest games. Albeit you'll need to turn down AA and such, but you can still get GREAT graphics out of most anything even today.
