It’s hard not to notice that NVIDIA has a bit of a problem right now. In the months since the launch of their first Kepler product, the GeForce GTX 680, the company has introduced several other Kepler products into the desktop 600 series. With the exception of the GeForce GT 640 – their only budget part – all of those 600 series parts have been targeted at the high end, where they became popular, well-received products that significantly tilted the market in NVIDIA’s favor.

The problem with this is almost paradoxical: these products are too popular. Between the GK104-heavy desktop GeForce lineup, the GK104 based Tesla K10, and the GK107-heavy mobile GeForce lineup, NVIDIA is selling every 28nm chip they can make. For a business prone to boom and bust cycles this is not a bad problem to have, but it means NVIDIA has been unable to expand their market presence as quickly as customers would like. For the desktop in particular this means NVIDIA has a very large, very noticeable hole in their product lineup between $100 and $400, which encompasses the mainstream and performance market segments. These market segments aren’t quite the high margin markets NVIDIA is currently servicing, but they are important to fill because they’re where product volumes increase and where most of their regular customers reside.

Long-term NVIDIA needs more production capacity and a wider selection of GPUs to fill this hole, but in the meantime they can at least begin to fill it with what they have to work with. This brings us to today’s product launch: the GeForce GTX 660 Ti. With nothing between GK104 and GK107 at the moment, NVIDIA is pushing out one more desktop product based on GK104 in order to bring Kepler to the performance market. Serving as an outlet for further binned GK104 GPUs, the GTX 660 Ti will be launching today as NVIDIA’s $300 performance part.

| | GTX 680 | GTX 670 | GTX 660 Ti | GTX 570 |
|---|---|---|---|---|
| Stream Processors | 1536 | 1344 | 1344 | 480 |
| Texture Units | 128 | 112 | 112 | 60 |
| ROPs | 32 | 32 | 24 | 40 |
| Core Clock | 1006MHz | 915MHz | 915MHz | 732MHz |
| Shader Clock | N/A | N/A | N/A | 1464MHz |
| Boost Clock | 1058MHz | 980MHz | 980MHz | N/A |
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 3.8GHz GDDR5 |
| Memory Bus Width | 256-bit | 256-bit | 192-bit | 320-bit |
| VRAM | 2GB | 2GB | 2GB | 1.25GB |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 |
| TDP | 195W | 170W | 150W | 219W |
| Transistor Count | 3.5B | 3.5B | 3.5B | 3B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
| Launch Price | $499 | $399 | $299 | $349 |

In the Fermi generation, NVIDIA filled the performance market with GF104 and GF114, the backbone of the very successful GTX 460 and GTX 560 series of video cards. Given Fermi’s 4 chip product stack – specifically the existence of the GF100/GF110 powerhouse – this was a move that made perfect sense. However, it’s not a move that works quite as well for NVIDIA’s (so far) 2 chip product stack. In a move very reminiscent of the GeForce GTX 200 series, GK104 – already serving the GTX 690, GTX 680, and GTX 670 – is also being called upon to fill out the GTX 660 Ti.

All things considered the GTX 660 Ti is extremely similar to the GTX 670. The base clock is the same, the boost clock is the same, the memory clock is the same, and even the number of shaders is the same. In fact there’s only a single significant difference between the GTX 670 and GTX 660 Ti: the GTX 660 Ti surrenders one of GK104’s four ROP/L2/Memory clusters, reducing it from a 32 ROP, 512KB L2, 4 memory channel part to a 24 ROP, 384KB L2, 3 memory channel part. With NVIDIA already binning chips for assignment to the GTX 680 and GTX 670, this allows NVIDIA to further bin those GTX 670 parts without much additional effort. Though given the relatively small size of a ROP/L2/Memory cluster, it’s a bit surprising that they have all that many chips that don’t meet GTX 670 standards.

In any case, as a result of these design choices the GTX 660 Ti is a fairly straightforward part. The 915MHz base clock and 980MHz boost clock of the chip along with the 7 SMXes means that GTX 660 Ti has the same theoretical compute, geometry, and texturing performance as GTX 670. The real difference between the two is on the render operation and memory bandwidth side of things, where the loss of the ROP/L2/Memory cluster means that GTX 660 Ti surrenders a full 25% of its render performance and its memory bandwidth. Interestingly NVIDIA has kept their memory clocks at 6GHz – in previous generations they would lower them to enable the use of cheaper memory – which is significant for performance since it keeps the memory bandwidth loss at just 25%.
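The 25% figure is easy to sanity-check with a back-of-the-envelope calculation (our own sketch, not something from NVIDIA's spec sheets): theoretical peak memory bandwidth is the effective per-pin data rate multiplied by the bus width in bytes.

```python
# Quick sanity check of the 25% bandwidth cut: peak memory bandwidth is the
# effective per-pin data rate (Gbps) times the bus width in bytes.
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

gtx670 = memory_bandwidth_gbs(6.008, 256)    # 4 memory channels, 256-bit
gtx660ti = memory_bandwidth_gbs(6.008, 192)  # 3 memory channels, 192-bit
print(f"GTX 670:    {gtx670:.1f} GB/s")
print(f"GTX 660 Ti: {gtx660ti:.1f} GB/s ({gtx660ti / gtx670:.0%} of GTX 670)")
```

At the same 6GHz memory clock the 192-bit bus works out to roughly 144GB/sec versus the GTX 670's 192GB/sec – exactly the 25% loss described above.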

How this loss of render operation performance and memory bandwidth will play out is going to depend heavily on the task at hand. We’ve already seen GK104 struggle with a lack of memory bandwidth in games like Crysis, so coming from GTX 670 this is only going to exacerbate that problem; a full 25% drop in performance is not out of the question here. However in games that are shader heavy (but not necessarily memory bandwidth heavy) like Portal 2, this means that GTX 660 Ti can hang very close to its more powerful sibling. There’s also the question of how NVIDIA’s nebulous asymmetrical memory bank design will impact performance, since 2GB of RAM doesn’t fit cleanly into 3 memory banks. All of these are issues where we’ll have to turn to benchmarking to better understand.
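For the curious, here is one hypothetical way the asymmetry could work – since the arrangement really is nebulous, this layout is our assumption and not a confirmed NVIDIA design: hang extra DRAM off one of the three 64-bit channels.

```python
# Hypothetical layout (an assumption, not confirmed by NVIDIA): 2GB spread
# across three 64-bit channels by giving one channel double the DRAM density.
channels_gb = [0.5, 0.5, 1.0]  # assumed per-channel capacity; totals 2GB

# The region present on all channels can interleave at the full 192-bit width;
# whatever is left over lives on the oversized channel at only 64-bit width.
interleaved_gb = 3 * min(channels_gb)        # 1.5GB at full speed
tail_gb = sum(channels_gb) - interleaved_gb  # 0.5GB at one-third speed
print(f"{interleaved_gb}GB full-speed, {tail_gb}GB single-channel")
```

If something like this is in play, performance could dip once a game's working set spills past the interleaved region – one more thing the benchmarks should help reveal.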

The impact on power consumption, on the other hand, is relatively straightforward. With clocks identical to the GTX 670, power consumption has only been reduced marginally due to the disabling of the ROP cluster. NVIDIA’s official TDP is 150W, with a power target of 134W. This compares to a TDP of 170W and a power target of 141W for the GTX 670. Given the mechanisms at work in NVIDIA’s GPU Boost technology, it’s the power target that is a far better reflection of what to expect relative to the GTX 670. On paper this means that GK104 could probably be stuffed into a sub-150W card with some further functional units disabled, but in practice desktop GK104 GPUs are probably a bit too power hungry for that.

Moving on, this launch will be what NVIDIA calls a “virtual” launch, which is to say that there aren’t any reference cards being shipped to partners to sell or to press to sample. Instead all of NVIDIA’s partners will be launching with semi-custom and fully-custom cards right away. This means we’re going to see a wide variety of cards right off the bat, however it also means that there will be less consistency between partners since no two cards are going to be quite alike. For that reason we’ll be looking at a slightly wider selection of partner designs today, with cards from EVGA, Zotac, and Gigabyte occupying our charts.

As for the launch supply, with NVIDIA having licked their GK104 supply problems a couple of months ago the supply of GTX 660 Ti cards looks like it should be plentiful. Some cards are going to be more popular than others and for that reason we expect we’ll see some cards sell out, but at the end of the day there shouldn’t be any problem grabbing a GTX 660 Ti on today’s launch day.

Pricing for GTX 660 Ti cards will start at $299, continuing NVIDIA’s tidy hierarchy of a GeForce 600 at every $100 price point. With the launch of the GTX 660 Ti NVIDIA will finally be able to start clearing out the GTX 570, a not-unwelcome thing as the GTX 660 Ti brings with it the Kepler family features (NVENC, TXAA, GPU Boost, and D3D 11.1) along with nearly twice as much RAM and much lower power consumption. However this also means that despite the name, the GTX 660 Ti is a de facto replacement for the GTX 570 rather than the GTX 560 Ti. The sub-$250 market the GTX 560 Ti launched at will continue to be served by Fermi parts for the time being. NVIDIA will no doubt see quite a bit of success even at $300, but it probably won’t be quite the hot item that the GTX 560 Ti was.

Meanwhile for a limited period of time NVIDIA will be sweetening the deal by throwing in a copy of Borderlands 2 with all GTX 600 series cards as a GTX 660 Ti launch promotion. Borderlands 2 is the sequel to Gearbox’s 2009 FPS/RPG hybrid, and is a TWIMTBP game that will have PhysX support along with planned support for TXAA. Like their prior promotions this is being done through retailers in North America, so you will need to check and ensure your retailer is throwing in Borderlands 2 vouchers with any GTX 600 card you purchase.

On the marketing front, as a performance part NVIDIA is looking to not only sell the GTX 660 Ti as an upgrade to 400/500 series owners, but to also entice existing GTX 200 series owners to upgrade. The GTX 660 Ti will be quite a bit faster than any GTX 200 series part (and cooler/quieter than all of them), with the question being whether it’s going to be enough to spur those owners to upgrade. NVIDIA did see a lot of success last year with the GTX 560 driving the retirement of the 8800 GT/9800 GT, so we’ll see how that goes.

Anyhow, as with the launch of the GTX 670, virtually every partner is also launching one or more factory overclocked models, so the entire lineup of launch cards will fall between $299 and $339 or so. This price range will put NVIDIA and its partners smack-dab between AMD’s existing 7000 series cards, which have already been shifting in price somewhat due to the GTX 670 and the impending launch of the GTX 660 Ti. Reference-clocked cards will sit right between the $279 Radeon HD 7870 and $329 Radeon HD 7950, which means that factory overclocked cards will be going head-to-head with the 7950.

On that note, with the launch of the GTX 660 Ti we can finally shed some further light on this week’s unexpected announcement of a new Radeon HD 7950 revision from AMD. As you’ll see in our benchmarks the existing 7950 maintains an uncomfortably slight lead over the GTX 660 Ti, which has spurred on AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to avoid having it end up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance.

For this review we’re going to include both the 7950 and 7950B in our results. We’re not at all happy with how AMD is handling this – it’s the kind of slimy thing that has already gotten NVIDIA in trouble in the past – and while we don’t want to reward such actions it would be remiss of us not to include it since it is a new reference part. And if AMD’s credibility is worth anything it will be on the shelves tomorrow anyhow.

Summer 2012 GPU Pricing Comparison
| AMD | Price | NVIDIA |
|---|---|---|
| Radeon HD 7970 GHz Edition | $469/$499 | GeForce GTX 680 |
| Radeon HD 7970 | $419/$399 | GeForce GTX 670 |
| Radeon HD 7950 | $329 | |
| | $299 | GeForce GTX 660 Ti |
| Radeon HD 7870 | $279 | |
| | $279 | GeForce GTX 570 |
| Radeon HD 7850 | $239 | |

 


  • Galidou - Saturday, August 18, 2012 - link

That's because Tom didn't use Portal 2 in benches and Nvidia is so gooood at it! Plus, instead of Dirt 3, he used Dirt Showdown and AMD is soooo good at it. So if you don't play Battlefield 3, Dirt 3, and Portal 2, there's a good chance that the 7870 might be better for you, considering it will perform equally/very close to the higher priced GTX 660.

    But again, if I'd be a heavy battlefield 3/portal 2 player, the choice is obvious...
  • Galidou - Saturday, August 18, 2012 - link

Correcting myself: higher priced GTX 660 Ti. But gotta remember at the same time, there's a limited quantity of Borderlands 2 they give away if you buy an Nvidia video card, which should be a testament that their cards perform well with this game, and it's worth $60 so you save that if you ever planned to buy it anyway....
  • CeriseCogburn - Sunday, August 19, 2012 - link

    Said the fella at the site that has been milking crysis one for amd fanboys for how long now, even as Crysis 2 has been out for almost a year now...
    Yeah, sure is all nVidia here (gag)(rolls eyes)(sees the way every review is worded)
  • TheJian - Monday, August 20, 2012 - link

    Obvious for everyone else too. Quit looking at ryans comments and 2560x1600 where 98% of us don't run. He bases most of his junk comments and conclusion on 2560x1600...WHAT FOR?
    68 24in monitors on newegg...NOT ONE over 1920x1200.
52 27in monitors on newegg. 41 are 1920x1200 or less, only 11 at 2560x1440 (NOT 1600). His recommendations and crap comments are about a res you can't even get until you use multimonitor or a 30incher!

    Already proved all the games run better at 1920x1200. See my other post to you...Its a lot more than battlefield and portal2, dirt 3..shogun2, Skyrim, Batman AC, Witcher2, Battlefield 3 Multiplayer, max payne 3, Civ5 (landslide again at 1920x1200 here anandtech). How many more you need? Don't point me to wins at 2560x1600 either... :) Unless we all are getting 30inchers for free soon (obama handouts?), it doesn't matter.

    What games are OK to test without you calling them biased towards NV?
  • CeriseCogburn - Thursday, August 23, 2012 - link

    I thank you for telling the truth and putting up with the amd fanboys who can't find the truth with the swirling retarded amd fanboy redeye goggles sandblasted to their skulls.
    I really appreciate it as I don't feel like using the truth to refute the loons and taking so many hours to do so and having to rinse and repeat over and over again since nothing sinks into their insane skulls and manning up is something they never can do.
    I do have hope though since a few obviously kept responding to you, and hence likely reading some (ADD and dyslexia may still be a problem by the looks of it though) so maybe in 20 years with a lot of global warming and hence more blood flow to their brains as they run around screaming the end is nigh, the facts everywhere presented over and over again will perhaps start just a tiny bit to begin sinking in to the mush.
    LOL
    I have to say, you definitely deserve a medal for putting up with them, and doing that large a favor pointing out the facts. I appreciate it, as the amd lies are really not friendly to us gamers and especially amd fanboys who always claim they pinch every penny (which is really sad, too).
  • Galidou - Thursday, August 23, 2012 - link

It wins at 1920*1080 often because the games are cpu limited and Nvidia has an advantage in using fewer cpu resources. It does mean something else too: if it's cpu limited, it means the graphics don't push the system enough, and at the same time means that when graphically intensive new games come out, the cpu will be less in the way. What's bad in buying a video card that already maxes everything at 1080p and will keep doing this for you in the future because these games are just not pushing it enough?

I remember the gtx 580 when it came out, it was running everything in 1920*1080 while the gtx 570 and radeon 6970 were already doing this; still people bought the gtx 580, and now that games are more taxing it's useful at 1080p. But it's obvious the gtx 660 ti is superior in many ways and many games. What I want you two (Cerise and Jian) to understand, well I should just say Jian, since I understood a long time ago that Cerise has a closed mind on the subject, is that AMD has strengths too. It loses overall at 1080p with stock clocked cards, but someone can be happy with a card like that anyway..... While all along I've been discussing, I never said Nvidia was bad, I never dismissed their older gen cards either as amazing parts too, while you just continued and tried to make people believe that you'll see AMAZING differences, HUMONGOUS GAINS by buying Nvidia and that AMD is cancer (or at least it looks like that in your eyes).

It's quite hard for anyone right now who's running a 7950 like I now do and my friend does, and like my 6850 crossfire did and my 4870 did, like my 8800gt and the gtx 460s I bought building computers for many of my friends, to just understand what rabble you might say about such a difference when 90% of their games are pegged at 60fps from high to ultra details. All these graphs, reviews and everything else, they're not reflecting what the average user feels when they play their game.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    " It's quite hard for anyone right now who's running a 7950 like I now do and my friend do "

    Wow, you didn't listen at all amd fanboy. So you have a report on triple mon Skyrim, or ... your mons not working in triple - you need some adapter or something else you have to buy ?

    Let's have your SkyRim loss and crash numbers.... LET'S SEE unplayable 29-34 fps if you're sporting a sandy 2500k

    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    Wow, cool, you lost.
  • TheJian - Monday, August 20, 2012 - link

    I debunked all this already (see other posts). Besides they ran all cards at ref speeds...LOL. Bandwidth is NOT an issue where 98% of us run. 1920x1200 or below even on 27in monitors (only 11 27in at newegg have above 1920x1200 and it's less than tested here at 2560x1440). Ryan misleads over an over in this review as if we run on 30in monitors. WE DON'T. @1920x1200 you won't have memory problems from either side. Not even at msaa/af/fxaa etc. ALL of the 24in monitors at newegg have 1920x1200/1080. NOT ONE at 2560x anything. Only 11 27in that are 2560x1440 all 41 others are 1920x1080 (even less taxing than 1920x1200!). Ryan is just trying to stop people from buying Nvidia I guess. I'm not sure why he thinks 2560x1600 is important as I've already shown <2% use it in steampowered's hardware survey and you basically have to have a special 27 or 30in to run above 1920x1200 native. Raise your hand if you are running in that 2% user group? I don't see many hands...LOL. Also note of that 2% most are running multi-monitor & usually multi card setups. But ryan can't make a recommendation...LOL. He could if he would quit pretending we all use 2560x1600...ROFLMAO. I admit, at that res you MAY run into memory issues (for the 2% that do it).
  • saturn85 - Thursday, August 16, 2012 - link

    nice folding@home benchmark.
  • martyrant - Thursday, August 16, 2012 - link

Is there anyone out there trying to mod this thing to a 670 yet, if it's all the identical parts with one of the four rop/mem buses disabled? I'd imagine with some of these things, even if binned as failed 670s, a few would most likely have all 4 rop/mem buses functional.

    This would be a pretty sweet upgrade path if so :) Would be the Radeon 6950 all over again (and all the previous generations that were able to either do softmods or if anyone remembers the pencil graphite trick back in the day).

    Thanks for the review, I've been waiting for this one...even though I'm pretty disappointed. The 7970 I've seen on sale for $360 lately and right now it's looking like it's going to be the best bang for your buck. That's cheaper than a 670/680, only slightly more than a 660 Ti, and it's pretty much the performance crown single GPU for the most part, though AMD's drivers lately are scaring me.
