As our regular readers are well aware, NVIDIA’s 28nm supply constraints have proven to be a constant thorn in the side of the company. Since Q2 the message in financial statements has been clear: NVIDIA could be selling more GPUs if they had access to more 28nm capacity. As a result of this capacity constraint they have had to prioritize the high-profit mainstream mobile and high-end desktop markets above other consumer markets, leaving holes in their product lineups. In the intervening time they have launched products like the GK104-based GeForce GTX 660 Ti to help bridge that gap, but even that still left a hole between $100 and $300.

Now nearly 6 months after the launch of the first Kepler GPUs – and 9 months after the launch of the first 28nm GPUs – NVIDIA’s situation has finally improved to the point where they can finish filling out the first iteration of the Kepler GPU family. With GK104 at the high-end and GK107 at the low-end, the task of filling out the middle falls to NVIDIA’s latest GPU: GK106.

As given away by the model number, GK106 is designed to fit in between GK104 and GK107. GK106 offers a more modest collection of functional blocks in exchange for a smaller die size and lower power consumption, making it a perfect fit for NVIDIA’s mainstream desktop products. Even so, we have to admit that until a month ago we weren’t quite sure whether there would even be a GK106, since NVIDIA had covered so much of their typical product lineup with GK104 and GK107, leaving open the possibility of using those two GPUs to cover the rest of it as well. So the arrival of GK106 comes as a pleasant surprise amidst what for the last 6 months has been a very small GPU family.

GK106’s launch vehicle will be the GeForce GTX 660, the central member of NVIDIA’s mainstream video card lineup. GTX 660 is designed to come in between GTX 660 Ti and GTX 650 (also launching today), bringing Kepler and its improved performance down to the same $230 price range that the GTX 460 launched at nearly two years ago. NVIDIA has had a tremendous amount of success with the GTX 560 and GTX 460 families, so they’re looking to maintain this momentum with the GTX 660.

| | GTX 660 Ti | GTX 660 | GTX 650 | GT 640 |
|---|---|---|---|---|
| Stream Processors | 1344 | 960 | 384 | 384 |
| Texture Units | 112 | 80 | 32 | 32 |
| ROPs | 24 | 24 | 16 | 16 |
| Core Clock | 915MHz | 980MHz | 1058MHz | 900MHz |
| Shader Clock | N/A | N/A | N/A | N/A |
| Boost Clock | 980MHz | 1033MHz | N/A | N/A |
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 5GHz GDDR5 | 1.782GHz DDR3 |
| Memory Bus Width | 192-bit | 192-bit | 128-bit | 128-bit |
| VRAM | 2GB | 2GB | 1GB/2GB | 2GB |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/24 FP32 | 1/24 FP32 |
| TDP | 150W | 140W | 64W | 65W |
| GPU | GK104 | GK106 | GK107 | GK107 |
| Transistor Count | 3.5B | 2.54B | 1.3B | 1.3B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm |
| Launch Price | $299 | $229 | $109 | $99 |

Diving right into the guts of things, the GeForce GTX 660 will be utilizing a fully enabled GK106 GPU. A fully enabled GK106 is in turn composed of 5 SMXes – arranged in an asymmetric 3 GPC configuration – along with 24 ROPs, three 64-bit memory controllers, and 384KB of L2 cache. Design-wise this basically splits the difference between the 8 SMX + 32 ROP GK104 and the 2 SMX + 16 ROP GK107. This also means that the GTX 660 ends up looking a great deal like a GTX 660 Ti with fewer SMXes.
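Those block counts map directly onto the spec table above; the per-SMX and per-memory-controller figures used below are the standard Kepler ratios (192 CUDA cores and 16 texture units per SMX; 8 ROPs and 128KB of L2 per 64-bit controller), so treat this as our arithmetic rather than NVIDIA’s official breakdown:

```python
# GK106 unit counts derived from its block configuration, using standard
# Kepler per-block ratios (192 cores / 16 TMUs per SMX; 8 ROPs / 128KB L2
# per 64-bit memory controller).

SMX_COUNT = 5
MEM_CTRL_COUNT = 3

cuda_cores = SMX_COUNT * 192        # matches the 960 SPs in the spec table
tex_units  = SMX_COUNT * 16         # 80 texture units
bus_width  = MEM_CTRL_COUNT * 64    # 192-bit memory bus
rops       = MEM_CTRL_COUNT * 8     # 24 ROPs
l2_kb      = MEM_CTRL_COUNT * 128   # 384KB of L2 cache

print(cuda_cores, tex_units, bus_width, rops, l2_kb)
```

The same ratios reproduce GK104 (8 SMX, 4 controllers) and GK107 (2 SMX, 2 controllers), which is why GK106 so cleanly splits the difference.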

Meanwhile the reduction in functional units has had the expected impact on die size and transistor count, with GK106 packing 2.54B transistors into 214mm2. This also means that GK106 is only 2mm2 larger than AMD’s Pitcairn GPU, which sets up a very obvious product showdown.

In breaking down GK106, it’s interesting to note that this is the first time since 2008’s G9x family of GPUs that NVIDIA’s consumer GPU has had this level of consistency. The 200 series was split between 3 different architectures (G9x, GT200, and GT21x), and the 400/500 series was split between Big Fermi (GF1x0) and Little Fermi (GF1x4/1x6/1x8). The 600 series on the other hand is architecturally consistent from top to bottom in all respects, which is why NVIDIA’s split of the GTX 660 series between GK104 and GK106 makes no practical difference. As a result GK104, GK106, and GK107 all offer the same Kepler family features – such as the NVENC hardware H.264 encoder, VP5 video decoder, FastHDMI support, TXAA anti-aliasing, and PCIe 3.0 connectivity – with only the number of functional units differing.

As GK106’s launch vehicle, GTX 660 will be the highest performing implementation of GK106 that we expect to see. NVIDIA is setting the reference clocks for the GTX 660 at 980MHz for the core and 6GHz for the memory, making it second only to the GTX 680 in core clockspeed while retaining the common 6GHz memory clockspeed we’ve seen across all of NVIDIA’s GDDR5 desktop Kepler parts thus far. Compared to the GTX 660 Ti this means that on paper the GTX 660 has around 76% of the shading and texturing performance of the GTX 660 Ti, 80% of the rasterization performance, 100% of the memory bandwidth, and a full 107% of the ROP performance.
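Those on-paper figures fall straight out of the spec table, since theoretical throughput scales with unit count times clockspeed. The quick sketch below reproduces them; note that the GPC counts used for the rasterization figure (3 for GTX 660 per the block diagram, 4 assumed for GTX 660 Ti) are our assumption, as NVIDIA’s spec sheets don’t list them:

```python
# On-paper GTX 660 vs. GTX 660 Ti throughput, from the spec table.
# Throughput scales as (unit count x clockspeed); the GPC counts for
# rasterization (3 vs. 4) are our assumption, not an NVIDIA-published spec.

def ratio(units_a, clock_a, units_b, clock_b):
    """Relative throughput of card A versus card B."""
    return (units_a * clock_a) / (units_b * clock_b)

shading   = ratio(960, 980, 1344, 915)   # CUDA cores x base clock
texturing = ratio(80, 980, 112, 915)     # texture units x base clock
raster    = ratio(3, 980, 4, 915)        # GPCs x base clock (counts assumed)
rops      = ratio(24, 980, 24, 915)      # ROPs x base clock
mem_bw    = ratio(192, 6008, 192, 6008)  # bus width x memory clock (MHz)

for name, r in [("shading", shading), ("texturing", texturing),
                ("raster", raster), ("ROPs", rops), ("memory BW", mem_bw)]:
    print(f"{name}: {r:.1%}")
```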

These figures mean that the performance of the GTX 660 relative to the GTX 660 Ti is going to be heavily dependent on shading and rasterization. Shader-heavy games will suffer the most, while memory bandwidth-bound and ROP-bound games are likely to perform very similarly between the two video cards. Interestingly enough, this is effectively the opposite of the difference between the GTX 670 and the GTX 660 Ti, where the differences between those two cards were entirely in memory bandwidth and ROPs. So in scenarios where the GTX 660 Ti’s configuration exacerbated GK104’s memory bandwidth limitations, the GTX 660 should emerge relatively unscathed.

On the power front, the GTX 660 has a power target of 115W with a TDP of 140W. Once again drawing a GTX 660 Ti comparison, this puts the TDP of the GTX 660 at only 10W lower than its larger sibling, but the power target is a full 19W lower. In practice, power consumption on the GTX 600 series has tracked the power target much more closely than the TDP, so as we’ll see the GTX 660 often pulls 20W+ less than the GTX 660 Ti. This lower level of power consumption also means that the GTX 660 is the first GTX 600 series product to require only one supplementary PCIe power connector.

Moving on, for today’s launch NVIDIA is once again going all virtual, with partners left to their own designs. However, given that this is the first GK106 part and that partners have had relatively little time with the GPU, in practice partners are using NVIDIA’s PCB designs with their own coolers – many of which have been lifted from their GTX 660 Ti designs – meaning that all of the cards being launched today are merely semi-custom, as opposed to the fully custom designs we saw with the GTX 660 Ti. This means that though there’s going to be a wide range of designs with respect to cooling, all of today’s launch cards will be extremely consistent with regard to clockspeeds and power delivery.

Like the GTX 660 Ti launch, partners have the option of going with either 2GB or 3GB of RAM, with the former once more taking advantage of NVIDIA’s asymmetrical memory controller functionality. For partners that do offer cards in both memory capacities we’re expecting most partners to charge $30-$40 more for the extra 1GB of RAM.
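As a rough sketch of how that asymmetry works: a 2GB card cannot split its memory evenly across three 64-bit controllers, so one controller carries twice the capacity of the other two. The 2/2/4 module split below is our assumption based on how the 2GB GTX 660 Ti was configured, not an NVIDIA-confirmed layout:

```python
# Hypothetical mapping of 2GB onto a 192-bit (3 x 64-bit) bus. The 2/2/4
# module split is assumed from the 2GB GTX 660 Ti, not confirmed by NVIDIA.

MODULE_MB = 256                  # one 2Gb GDDR5 module = 256MB
modules_per_ctrl = [2, 2, 4]     # modules behind each 64-bit controller

per_ctrl_mb = [n * MODULE_MB for n in modules_per_ctrl]   # [512, 512, 1024]
total_mb = sum(per_ctrl_mb)                               # 2048MB = 2GB

# The region all three controllers can serve interleaves at the full 192-bit
# width; the remainder lives on the oversized controller alone at 64-bit.
interleaved_mb = min(per_ctrl_mb) * len(per_ctrl_mb)      # 1536MB @ 192-bit
remainder_mb = total_mb - interleaved_mb                  # 512MB @ 64-bit

print(f"{total_mb}MB total: {interleaved_mb}MB @ 192-bit, "
      f"{remainder_mb}MB @ 64-bit")
```

The practical upshot is that the last chunk of a 2GB card’s memory is slower to access than the rest, whereas a 3GB card can populate all three controllers evenly.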

NVIDIA has set the MSRP on the GTX 660 at $229, a price NVIDIA’s partners will be adhering to almost to a fault. Of the 3 cards we’re looking at in our upcoming companion GTX 660 launch roundup article, every last card is going for $229 despite the fact that every last card is also factory overclocked. Because NVIDIA does not provide an exhaustive list of cards and prices it’s not possible to say for sure just what the retail market will look like ahead of time, but at this point it looks like most $229 cards will be shipping with some kind of factory overclock. This is very similar to how the GTX 560 launch played out, and if it parallels the GTX 560 launch closely enough then reference-clocked cards will still be plentiful in time.

At $229 the GTX 660 is going to be coming in just under AMD’s Radeon HD 7870. AMD’s official MSRP on the 7870 is $249, but at this point in time the 7870 is commonly available for $10 cheaper at $239 after rebate. Meanwhile the 2GB 7850 will be boxing the GTX 660 in from the other side, with the 7850 regularly found at $199. As we saw with the GTX 660 Ti launch, these prices are no mistake by AMD, who have once again preemptively cut prices so that NVIDIA doesn’t undercut them at launch. It’s also worth noting that NVIDIA will not be extending their Borderlands 2 promotion to the GTX 660, so this is $229 without any bundled games, whereas AMD’s Sleeping Dogs promotion is still active for the 7870.

Finally, along with the GTX 660 the GK107-based GTX 650 is also launching today at $109. For the full details of that launch please see our GTX 650 companion article. Supplies of both cards are expected to be plentiful.

Summer 2012 GPU Pricing Comparison
| AMD | Price | NVIDIA |
|---|---|---|
| Radeon HD 7950 | $329 | |
| | $299 | GeForce GTX 660 Ti |
| Radeon HD 7870 | $239 | |
| | $229 | GeForce GTX 660 |
| Radeon HD 7850 | $199 | |
| Radeon HD 7770 | $109 | GeForce GTX 650 |
| Radeon HD 7750 | $99 | GeForce GT 640 |


Meet The GeForce GTX 660
147 Comments

  • yeeeeman - Saturday, September 15, 2012 - link

    Really, G80 was a revolution on its own. Spectacular jump in performance compared to the previous generation, and combined with 65nm process technology gave birth to some of the finest video cards.
    The real setback here is the fact that the gaming industry is driven by the lowest common denominator, and we all know that consoles are the most important. They are sold in the largest quantities, and most games are designed for their power, not higher.
    For PCs, games receive a DX11 treatment, with some fancy features that enhance the quality a little bit, but it can never make up for the fact that the textures and the game are designed for a much slower platform.
    So given these facts, why change my 9600GT, when it can handle pretty much everything?
    Reply
  • steelnewfie - Saturday, September 15, 2012 - link

    "For the 2GB GTX 660, NVIDIA has outfit the card with 8 2Gb memory modules"

    Should read outfitted.

    Also 8 2Gb memory modules? Did you mean 2GB? Either is incorrect by my math.

    If there are 8 banks should not each module be 256 MB?

    Otherwise, great articles, keep up the good work!
    Reply
  • Ryan Smith - Saturday, September 15, 2012 - link

    Individual memory modules are labeled by their capacity in bits, not bytes. So each module is 2 gigabits (Gb), which is 256MB. 8x2Gb is how the card ends up with 2 gigabytes (GB) of RAM. Reply
  • MrBubbles - Saturday, September 15, 2012 - link

    Cool, I have a GTX 260 and since NVidia is deliberately breaking their driver support for games like Civ 5 I guess this is the card to get. Reply
  • saturn85 - Saturday, September 15, 2012 - link

    nice folding@home benchmark. Reply
  • JWill97 - Thursday, September 27, 2012 - link

    For me, I really think it's the best card you can buy at this price. Not a fan (neutral) of either NVidia or AMD, but really, in the $200+ segment nvidia takes it. But I'm still wondering: why aren't any reviewers using Max Payne 3 as one of the game benchmarks? A lot of cards would struggle playing it. Reply
  • Grawbad - Friday, March 01, 2013 - link

    "NVIDIA has spent a lot of time in the past couple of years worrying about the 8800GT/9800GT in particular. “The only card that matters” was a massive hit for the company straight up through 2010, which has made it difficult to get users to upgrade even 4 years later."

    I am one of those. I purchased a 9800 GTX and that sucker runs everything. Mind you, all my other components were quality too so I didn't bottleneck myself. But this card has run everything I have ever thrown at it.. Only recently have I had to start watching the AA a bit. Which is why I am now, 5 years later in the market for a new card. 5 Years.

    Indeed, those cards were astounding.

    Mine was an EVGA 9800 GTX with a lifetime warranty. Thank goodness for that as it finally went out on me this year and I had to RMA it. And now that I am looking into getting a new card it seems EVGA has dropped their lifetime warranty. That makes me sad.

    Anyways, yeah, those were and still are great cards. I mean, if you picked up a 9800 GTX today, you would be able to run even the newest games. Albeit you'll need to turn down AA and such, but you can still get GREAT graphics out of most anything even today.
    Reply
