It’s hard not to notice that NVIDIA has a bit of a problem right now. In the months since the launch of their first Kepler product, the GeForce GTX 680, the company has introduced several other Kepler products into the desktop 600 series. With the exception of the GeForce GT 640 – their only budget part – all of those 600 series parts have been targeted at the high end, where they became popular, well received products that significantly tilted the market in NVIDIA’s favor.

The problem with this is almost paradoxical: these products are too popular. Between the GK104-heavy desktop GeForce lineup, the GK104-based Tesla K10, and the GK107-heavy mobile GeForce lineup, NVIDIA is selling every 28nm chip they can make. For a business prone to boom and bust cycles this is not a bad problem to have, but it means NVIDIA has been unable to expand their market presence as quickly as customers would like. For the desktop in particular this means NVIDIA has a very large, very noticeable hole in their product lineup between $100 and $400, which comprises the mainstream and performance market segments. These market segments aren’t quite the high margin markets NVIDIA is currently servicing, but they are important to fill because they’re where product volumes increase and where most of their regular customers reside.

Long-term NVIDIA needs more production capacity and a wider selection of GPUs to fill this hole, but in the meantime they can at least begin to fill it with what they have to work with. This brings us to today’s product launch: the GeForce GTX 660 Ti. With nothing between GK104 and GK107 at the moment, NVIDIA is pushing out one more desktop product based on GK104 in order to bring Kepler to the performance market. Serving as an outlet for further binned GK104 GPUs, the GTX 660 Ti will be launching today as NVIDIA’s $300 performance part.

                       GTX 680         GTX 670         GTX 660 Ti      GTX 570
Stream Processors      1536            1344            1344            480
Texture Units          128             112             112             60
ROPs                   32              32              24              40
Core Clock             1006MHz         915MHz          915MHz          732MHz
Shader Clock           N/A             N/A             N/A             1464MHz
Boost Clock            1058MHz         980MHz          980MHz          N/A
Memory Clock           6.008GHz GDDR5  6.008GHz GDDR5  6.008GHz GDDR5  3.8GHz GDDR5
Memory Bus Width       256-bit         256-bit         192-bit         320-bit
VRAM                   2GB             2GB             2GB             1.25GB
FP64                   1/24 FP32       1/24 FP32       1/24 FP32       1/8 FP32
TDP                    195W            170W            150W            219W
Transistor Count       3.5B            3.5B            3.5B            3B
Manufacturing Process  TSMC 28nm       TSMC 28nm       TSMC 28nm       TSMC 40nm
Launch Price           $499            $399            $299            $349

In the Fermi generation, NVIDIA filled the performance market with GF104 and GF114, the backbone of the very successful GTX 460 and GTX 560 series of video cards. Given Fermi’s 4 chip product stack – specifically the existence of the GF100/GF110 powerhouse – this was a move that made perfect sense. However it’s not a move that works quite as well for NVIDIA’s (so far) 2 chip product stack. In a move very reminiscent of the GeForce GTX 200 series, GK104, which already serves the GTX 690, GTX 680, and GTX 670, is also being called upon to fill out the GTX 660 Ti.

All things considered the GTX 660 Ti is extremely similar to the GTX 670.  The base clock is the same, the boost clock is the same, the memory clock is the same, and even the number of shaders is the same. In fact there’s only a single significant difference between the GTX 670 and GTX 660 Ti: the GTX 660 Ti surrenders one of GK104’s four ROP/L2/Memory clusters, reducing it from a 32 ROP, 512KB L2, 4 memory channel part to a 24 ROP, 384KB L2, 3 memory channel part. With NVIDIA already binning chips for assignment to GTX 680 and GTX 670, this allows NVIDIA to further bin those GTX 670 parts without much additional effort. Though given the relatively small size of a ROP/L2/Memory cluster, it’s a bit surprising they have all that many chips that don’t meet GTX 670 standards.
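The effect of losing one cluster follows directly from the numbers quoted above. As a minimal sketch (the per-cluster breakdown is inferred from the published specs rather than from NVIDIA's documentation), each of GK104's four clusters contributes 8 ROPs, 128KB of L2, and one 64-bit memory channel:

```python
# Sketch of how disabling one of GK104's four ROP/L2/memory clusters derives
# the GTX 660 Ti's back end from the GTX 670's. Per-cluster figures are
# inferred from the specs quoted above (32 ROPs / 512KB L2 / 256-bit bus
# divided by 4 clusters), not from official NVIDIA documentation.

ROPS_PER_CLUSTER = 8
L2_KB_PER_CLUSTER = 128
BUS_BITS_PER_CLUSTER = 64

def back_end(active_clusters):
    """Return (ROPs, L2 in KB, memory bus width in bits) for a GK104 part."""
    return (active_clusters * ROPS_PER_CLUSTER,
            active_clusters * L2_KB_PER_CLUSTER,
            active_clusters * BUS_BITS_PER_CLUSTER)

print(back_end(4))  # GTX 670:    (32, 512, 256)
print(back_end(3))  # GTX 660 Ti: (24, 384, 192)
```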

In any case, as a result of these design choices the GTX 660 Ti is a fairly straightforward part. The 915MHz base clock and 980MHz boost clock of the chip along with the 7 SMXes means that GTX 660 Ti has the same theoretical compute, geometry, and texturing performance as GTX 670. The real difference between the two is on the render operation and memory bandwidth side of things, where the loss of the ROP/L2/Memory cluster means that GTX 660 Ti surrenders a full 25% of its render performance and its memory bandwidth. Interestingly NVIDIA has kept their memory clocks at 6GHz – in previous generations they would lower them to enable the use of cheaper memory – which is significant for performance since it keeps the memory bandwidth loss at just 25%.
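The bandwidth arithmetic behind that 25% figure is straightforward; a quick sketch using the clocks and bus widths from the table above:

```python
# Peak GDDR5 bandwidth: effective data rate times bus width in bytes.
# Both cards run 6.008GHz-effective GDDR5; only the bus width differs.

def gddr5_bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_clock_ghz * (bus_width_bits / 8)

gtx670 = gddr5_bandwidth_gbps(6.008, 256)    # 256-bit bus
gtx660ti = gddr5_bandwidth_gbps(6.008, 192)  # 192-bit bus
print(f"GTX 670:    {gtx670:.1f} GB/s")
print(f"GTX 660 Ti: {gtx660ti:.1f} GB/s ({100 * (1 - gtx660ti / gtx670):.0f}% less)")
```

Keeping the memory clock at 6GHz is what pins the loss at exactly 25%; dropping to cheaper, slower memory would have compounded the narrower bus.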

How this loss of render operation performance and memory bandwidth will play out is going to depend heavily on the task at hand. We’ve already seen GK104 struggle with a lack of memory bandwidth in games like Crysis, so coming from the GTX 670 this is only going to exacerbate that problem; a full 25% drop in performance is not out of the question here. However in games that are shader heavy (but not necessarily memory bandwidth heavy) like Portal 2, the GTX 660 Ti can hang very close to its more powerful sibling. There’s also the question of how NVIDIA’s nebulous asymmetrical memory bank design will impact performance, since 2GB of RAM doesn’t divide cleanly across 3 memory banks. These are all issues we’ll have to turn to benchmarking to better understand.
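To make the asymmetry concrete, here is a speculative model of the arrangement; NVIDIA has not documented the interleaving, so the per-channel densities below are an assumption, not a confirmed layout. One plausible configuration is 512MB on two channels and 1GB on the third, so the first 1.5GB stripes across all three 64-bit channels at full speed while the final 512MB is reachable over only a single channel:

```python
# Speculative model of a 2GB card on a 192-bit (3 x 64-bit) bus. The
# per-channel densities are an ASSUMPTION for illustration; NVIDIA hasn't
# disclosed how the asymmetric region is actually interleaved.

CHANNEL_MB = [512, 512, 1024]    # assumed memory per 64-bit channel
PER_CHANNEL_GBPS = 6.008 * 8     # one 64-bit channel at 6.008GHz effective

symmetric_mb = 3 * min(CHANNEL_MB)        # region striped across all channels
tail_mb = sum(CHANNEL_MB) - symmetric_mb  # remainder on the largest channel

print(f"{symmetric_mb}MB at {3 * PER_CHANNEL_GBPS:.1f} GB/s (full bus)")
print(f"{tail_mb}MB at {PER_CHANNEL_GBPS:.1f} GB/s (single channel)")
```

If a model like this is right, workloads whose working set spills into that last 512MB would see far less than the card's headline bandwidth, which is exactly the kind of behavior benchmarking should expose.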

The impact on power consumption on the other hand is relatively straightforward. With clocks identical to the GTX 670, power consumption has only been reduced marginally due to the disabling of the ROP cluster. NVIDIA’s official TDP is 150W, with a power target of 134W. This compares to a TDP of 170W and a power target of 141W for the GTX 670. Given the mechanisms at work in NVIDIA’s GPU Boost technology, it’s the power target that is a far better reflection of what to expect relative to the GTX 670. On paper this means that GK104 could probably be stuffed into a sub-150W card with some further functional units disabled, but in practice desktop GK104 GPUs are probably a bit too power hungry for that.
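A toy model helps explain why the power target matters more than the TDP here. This is an illustration only, not NVIDIA's actual boost algorithm, and the step size, per-step power cost, and load figures are assumptions: GPU Boost steps the clock up in fixed bins only while the card's estimated board power stays under the target, so two cards with different TDPs but similar power targets will boost similarly.

```python
# Toy illustration (NOT NVIDIA's actual algorithm) of power-target-limited
# boosting: raise the clock one bin at a time until the next bin would push
# estimated board power past the target.

def boost_clock_mhz(base_mhz, power_at_base_w, power_target_w,
                    step_mhz=13, watts_per_step=1.5):
    """Step the clock up in fixed bins until the power target is reached.
    step_mhz and watts_per_step are illustrative assumptions."""
    clock, power = base_mhz, power_at_base_w
    while power + watts_per_step <= power_target_w:
        clock += step_mhz
        power += watts_per_step
    return clock

# 134W is the GTX 660 Ti's quoted power target; the 120W load is assumed.
print(boost_clock_mhz(base_mhz=915, power_at_base_w=120, power_target_w=134))
```

Under this model, a lighter workload (more headroom below the target) boosts higher, which matches the observed behavior of boost clocks varying with the game being run.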

Moving on, this launch will be what NVIDIA calls a “virtual” launch, which is to say that there aren’t any reference cards being shipped to partners to sell or to press to sample. Instead all of NVIDIA’s partners will be launching with semi-custom and fully-custom cards right away. This means we’re going to see a wide variety of cards right off the bat; however, it also means that there will be less consistency between partners, since no two cards are going to be quite alike. For that reason we’ll be looking at a slightly wider selection of partner designs today, with cards from EVGA, Zotac, and Gigabyte occupying our charts.

As for the launch supply, with NVIDIA having licked their GK104 supply problems a couple of months ago the supply of GTX 660 Ti cards looks like it should be plentiful. Some cards are going to be more popular than others and for that reason we expect we’ll see some cards sell out, but at the end of the day there shouldn’t be any problem grabbing a GTX 660 Ti on today’s launch day.

Pricing for GTX 660 Ti cards will start at $299, continuing NVIDIA’s tidy hierarchy of a GeForce 600 part at every $100 price point. With the launch of the GTX 660 Ti NVIDIA will finally be able to start clearing out the GTX 570, a not-unwelcome thing as the GTX 660 Ti brings with it the Kepler family features (NVENC, TXAA, GPU Boost, and D3D 11.1) along with nearly twice as much RAM and much lower power consumption. However this also means that despite the name, the GTX 660 Ti is a de facto replacement for the GTX 570 rather than the GTX 560 Ti. The sub-$250 market the GTX 560 Ti launched into will continue to be served by Fermi parts for the time being. NVIDIA will no doubt see quite a bit of success even at $300, but it probably won’t be quite the hot item that the GTX 560 Ti was.

Meanwhile for a limited period of time NVIDIA will be sweetening the deal by throwing in a copy of Borderlands 2 with all GTX 600 series cards as a GTX 660 Ti launch promotion. Borderlands 2 is the sequel to Gearbox’s 2009 FPS/RPG hybrid, and is a TWIMTBP game that will have PhysX support along with planned support for TXAA. Like their prior promotions this is being done through retailers in North America, so you will need to check and ensure your retailer is throwing in Borderlands 2 vouchers with any GTX 600 card you purchase.

On the marketing front, as a performance part NVIDIA is looking not only to sell the GTX 660 Ti as an upgrade to 400/500 series owners, but also to entice existing GTX 200 series owners to upgrade. The GTX 660 Ti will be quite a bit faster than any GTX 200 series part (and cooler/quieter than all of them), the question being whether that’s going to be enough to spur those owners to upgrade. NVIDIA did see a lot of success last year with the GTX 560 driving the retirement of the 8800GT/9800GT, so we’ll see how that goes.

Anyhow, as with the launch of the GTX 670, virtually every partner is also launching one or more factory overclocked models, so the entire lineup of launch cards will be between $299 and $339 or so. This price range will put NVIDIA and its partners smack-dab between AMD’s existing 7000 series cards, which have already been shifting in price somewhat due to the GTX 670 and the impending launch of the GTX 660 Ti. Reference-clocked cards will sit right between the $279 Radeon HD 7870 and $329 Radeon HD 7950, which means that factory overclocked cards will be going head-to-head with the 7950.

On that note, with the launch of the GTX 660 Ti we can finally shed some further light on this week’s unexpected announcement of a new Radeon HD 7950 revision from AMD. As you’ll see in our benchmarks the existing 7950 holds an uncomfortably slim lead over the GTX 660 Ti, which has spurred AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to avoid having it end up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance.

For this review we’re going to include both the 7950 and 7950B in our results. We’re not at all happy with how AMD is handling this – it’s the kind of slimy thing that has already gotten NVIDIA in trouble in the past – and while we don’t want to reward such actions it would be remiss of us not to include it since it is a new reference part. And if AMD’s credibility is worth anything it will be on the shelves tomorrow anyhow.

Summer 2012 GPU Pricing Comparison
AMD                           Price       NVIDIA
Radeon HD 7970 GHz Edition    $469/$499   GeForce GTX 680
Radeon HD 7970                $419/$399   GeForce GTX 670
Radeon HD 7950                $329
                              $299        GeForce GTX 660 Ti
Radeon HD 7870                $279
                              $279        GeForce GTX 570
Radeon HD 7850                $239

 

313 Comments
  • Galidou - Monday, August 20, 2012 - link

I play on either my 1080p TV or my three 1920*1200 monitors (mainly on the monitors now). I only had the TV before, so it was ok with my 6850 crossfire, but now I'll need more memory, and the main game I play, almost the only one, is Skyrim. I can't use the texture pack and some high details because in some caves and some places the memory is limiting me badly. I plan to change the TV for one of those new high resolution ones when they come out. In crossfire you don't add the memory, you have the same video memory as only one card, so no it's not 2gb it's 1gb, thanks.

    I want the 7950 OC with 950mhz core I already did my research prior to the 660 ti review. They've been out for many months, I just work too much during the summer so I wasn't in a hurry and I wanted to see the competition.

There you go, they even have thermal pictures of the whole system/card, which is something I was looking for. It's not a performance review against the competition, it's only on overclocking and power usage. I don't care about the 80 watts extra too much because I was ready to buy a GTX 580 at a low enough price, which has stock clock power usage like the 7950's overclocked power usage.

    http://www.behardware.com/articles/853-18/roundup-...

Page 11 is the one I'm looking for. I just want to get 1100 mhz; it seems everyone gets 1150 with this card.

Before the patch in AMD drivers, Nvidia had a clear advantage, so the 660 ti seemed like the obvious choice when it would come out (I thought so). Then these drivers and optimized games are mixing all the stuff up... If only they would care about us and not only their product performance, thus their profit, but I know it will never happen. Money makes wars everywhere and it will NEVER stop.
  • TheJian - Monday, August 20, 2012 - link

In SLI/Crossfire this card will give you 2GB (double your 1GB now; in either single or sli/cross you'll get 2GB), that's what I meant. I was talking about the 660TI, or whatever upgrade you do. You get the size of 2GB (one card's worth - SLI same thing). 2GBx2 should be fine for skyrim at 1920x1200 (no card here was punished at 2GB 1920x1200). I already pointed to an article that shows NO difference from 2gb to 4gb in ANY game they tested:
    http://www.guru3d.com/article/palit-geforce-gtx-68...
    "But 2GB really covers 98% of the games in the highest resolutions." It even worked at 5760x1200 below :) Look @ hardocp who tested on 3 monitors@5760x1200. 30fps min, 58.6avg 104max. On a single gtx680. Two 660's should smoke this single 680.

    Not quite sure I understand your gaming on 3 monitors comment...You mean 5760x1200? Or are you only gaming on ONE of them 1920x1200? or you mean something else?
    So you are planning on buying 2 video cards again? I thought you were just replacing with one at 2GB, but if you're gaming at 5760x1200 that's another story. I'd just buy a GTX680 and OC.
    http://www.hardocp.com/article/2012/03/22/nvidia_k...
"For the apples-to-apples test we lowered the AA setting down to 4X AA, just in case there were some hidden bottlenecks. Lowering the AA setting to 4X AA only made things worse for the Radeon HD 7970. The GeForce GTX 680 increased its advantage to 29% performance advantage over the HD 7970!"
Granted as shown below with anand 12.7 they got better, but still lost, so don't expect NV to fall behind in the above test even if they changed drivers. It didn't help below. NV won both tests anyway in anandtech testing... :)
    Anand 12.7 drivers skyrim, 7970ghz edition, slaughtered at 1920x1200
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
All NV cards are cpu limited at that res, gtx580/670/680. But they all win by 10fps with 4xmsaa/16af. Even at the useless 2560x1600 res the 680 still wins... That's a REF model they're comparing it to also. The 680 does much better than this with what you BUY, but this is vs. the ghz edition, so don't expect much more from your overclock, and heat/noise will get worse. You'll only get another 15% at 1150 than they are already benching here (if that, scaling isn't perfect).

From your own quoted article voltages vary, and as I pointed out to another guy, it needed 1.25v to hit 1150.
    "Secondly, PowerTune doesn’t register increases in the GPU voltage and the big energy consumption increases that come with it. The technology is therefore incapable of fully protecting the GPU and the card. AMD says that OVP (Over Voltage Protection) and OCP (Over Current Protection) are still in place but, as we were able to observe, these technologies cut all power to the card when limits are exceeded."
    YOU can damage your card as I stated before this can't happen on 600's.
    http://www.guru3d.com/article/radeon-hd-7950-overc...
    1.25v 1150. Not all cards are the same.
    Look at your own chart on your link. Scroll down to where they show the chart and 1.20+ being REQUIRED to hit this statement:
    "The maximum clock on the Radeon HD 7900s generally seems to be between 1125 and 1200 MHz when the GPU voltage is adjusted.". Look at the chart above that line...1.20 is REQUIRED in most cases to hit anything over 1100. That is the FIRST voltage where they all meet 1100. @1.174 two couldn't even hit 1100 (the HIS and the Reference card). ONLY 3 cards hit 1125@1.25, only ONE went above. RUSSIAN, are you listening?...LOL. There are 11 cards in the list, be careful assuming all Sapphire OC's are the same. They are not. From page 19:
    "When the GPU voltage is changed this goes up to an increase of between 21 and 78%, which is enough to put the power stages of these cards under stress."

That 78% more power draw is a lot of watts, and they'll produce a lot of heat/noise. Thanks for the link, it's a good article, I look forward to the 660 TI article there to see the comparison (hopefully he'll make one, there were only radeons in this article). Also note his page 14 comments and the charts showing heat stuck inside the pc frying other components as they don't expel the heat out the rear well: "The reference HIS, MSI Twin Frozr III and Sapphire OverClock Radeon HD 7950s and Radeon HD 7970 Lightning however tend to direct more hot air towards the hard drives.". Your card is in there... Only two cards didn't do that. Another downer for the 7950 cards in my mind. The CPU in your case (your OC card) would be 5C higher as shown in the chart vs. the reference model. That's a lot of C added to your CPU. Paying attention russian? All of this affects your glorious 1150 speeds.

    Galidou has a better article than I used, as it is more complete regarding the "OTHER" effects a 1150mhz overclock on your radeon will get you (cpu raises, HD's, mem sticks etc), not to mention they don't expel the heat. I'm wondering if his 660TI article will show case temps sucking also now...ROFL. Thanks again Galidou. GOOD INFO. Right now my decision is the same, but after he does a 660 article I hope the issue doesn't become confused :)
  • Galidou - Monday, August 20, 2012 - link

Lol you're so fun, people overclocked the gtx 580 and it gave out more heat and still ate more current than anything this gen can ever imagine, and no system has died of this yet... I gave you an example: it can easily hit 1100-1150 without breaking 80w more, and still you have to speak and speak and always say the same thing and try to make me believe that if I ever buy a radeon card I will be deceived, I will eventually catch cancer and die of it.... thanks, your fanaticism is appreciated. The more I speak with fanboys of both sides, the more I think I'll have to stop playing computer for the risk of becoming like them.

Skyrim, 3 monitors, is clearly ahead. I really thought you'd have sense after telling you the games I play, but it seems you're as stubborn as someone can be. My wallet is speaking; my radeon 6850 crossfire made more heat in my system than this 7950 overclocked alone will, and yet I'll gain in performance.

Skyrim with mods isn't shown anywhere; it fills over 2gb of ram as soon as you ramp up the mods in there. I'm playing with 30 and my 1gb of ram is crying at me to stop. If you don't know the game you're speaking of, just don't comment please.

You can't damage your 660 or 670? lol fun stuff. I'm a fan of reading 1 egg reviews on newegg and there's plenty on both sides (AMD and Nvidia) claiming that their card died; for ati they died in the first week. The 670 is newer than the 7950 and still there are numerous cards that have 10 to 30% of 1egg-2eggs because of fried cards, with and without overclock, ranging from 1 day of ownership to 3 months of ownership.

I guess 28nm isn't at its peak and that is reflected. One guy claimed he overclocked his 670 and when the boost would go past 1290 it would black screen and shut down in battlefield 3 (verified owner in the forums). Why would someone who owns the card say stuff like that?

The articles you've shown me on skyrim are from before the big patch in AMD drivers. You don't get what I say: look at the anandtech review of the gtx 680 (4 months ago I think) and the review of the gtx 660 ti and watch the difference. You obviously didn't do the research I did. The driver is about a month old and it gave VERY big improvements in skyrim; just watch any new review dating from the end of july to now and you will understand.

Seeing you had to speak again so much, and seeing the lack of information you have, this is ending now. Thanks anyway, but your stubbornness dragged me out of myself and I'm tired of speaking with someone who already has a chosen side that blinds him to the point he cannot bring arguments that are valid on every front. Been nice tho.
  • Galidou - Monday, August 20, 2012 - link

I meant over 2gb of Vram with the texture pack. I'm an overclocker, I've been overclocking things for the past 15 years, and still you try to explain heat to me while you can't even find articles relative to the real performance of AMD with the newer drivers.

I'm the kind of enthusiast you can find on the overclock.net forums; you've argued the wrong way with the wrong person. Sorry if my english wasn't perfect all along, but I'm a french canadian.

Still, temperature raises on components, like you said, have been experienced for years. My radeon 4870 is super hot and is still working in my wife's system and nothing has died; it leaves more heat in my whole system than the radeon 7950 alone will, and I overclocked the darn thing.

You're speaking like you're trying to make a show for everyone reading you, but no one is reading because it's too long for them to bear and it isn't even current. So cut the show and the ''Galidou has a better article than I used'' like you're speaking to someone; we're arguing together, and you got lost in your information and didn't even know of the BIG perf improvements as of the 12.8 catalyst drivers, which shows your inexperience. GG
  • Galidou - Monday, August 20, 2012 - link

Oh and I forgot, one last thing, then you'll NEVER ever hear from me again on this forum, so feel free to speak into the emptiness; the more we add, the less people will get to read it.

2560*1600 = 4 million pixels

1920*1200 x 3 monitors = 7 million pixels

2560*1600 is cpu limited on anandtech, meaning all the video cards in the review that are stuck at 84-86fps will go higher, so the 7950 will be ahead of the 7970 and ahead of the gtx 670, which is $100 over the price I will pay, because I'll wait for the specials to come back and there are never any specials on Nvidia cards because they are too good.
  • Galidou - Monday, August 20, 2012 - link

And stop showing overclocked results and damaging things about the lousy stupid reference board; my system is watercooled and the video card surely will be in the end. And stop showing me gtx 680 results, it's above what I want to pay. I'm replacing my crossfire with one 7950, and watercooled they EASILY get to 1300 core clock, which demolishes even the stock clocked gtx 680 ANYWHERE.
  • Galidou - Monday, August 20, 2012 - link

    http://www.techpowerup.com/reviews/Gigabyte/HD_797...

now look at 5760*1080 (a lower resolution than I'll use) and look at where the 7950 (not overclocked, lousy reference board) gets with the new drivers now. I leave it up to you to find any 5760*1080 results from before the driver release; in the future, learn to find results by yourself.
  • CeriseCogburn - Thursday, August 23, 2012 - link

enjoy nVidia 660Ti's sweet victory over your planned-to-buy 7950 at your triple monitor rez there, bucky
    LOL
    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    HAHHAHAHAHAHAHAHA
    OMG ! HAHAHA
  • Galidou - Monday, August 20, 2012 - link

I know you don't want any AMD cards, it's pretty obvious, while I don't care about the brand; it's only the fanboys from each side that seem convinced the gap is SO IMMENSE, while I don't see much of a gap when you consider price/performance, and as always it depends on the game.

Considering I'll play Skyrim on 3 monitors, unless you're a freaking blind fanboy, it would be hard to recommend the 660 ti... The sapphire OC at 950mhz isn't even on this site; it's simply a reference 7970 board with a 6+8 pin PEG connector, which supports the overclock with the reference voltage at 1000mhz core easily.

If I didn't have the knowledge and desire about overclocking I have now, the choice would be freaking obvious: 1080p gaming without overclocking, welcome 660 ti, the card would be on its way. But I want to overclock, and everyone in the forums at overclock.net already knows it, and however big your doubts are, 90% of them report super overclocks.

Now don't bring me some of your comparisons with the 7950b and reference coolers or I just won't answer back to you; AMD has as many non-reference boards as Nvidia has. WTF wake up.... guru3d overclocking a reference board with the lousy worthless not-selling reference fan when 75% of their cards have way better coolers, good way of representing reality..... COMON.

    BTW, I live in Canada, province of quebec, and where I live, the electricity is quite cheap, like really cheap.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    " Considering I'll play Skyrim on 3 monitors, unless you're a freaking blind fanboy, it would be hard to recommend the 660 ti."

    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    EVEN THE 7970 LOSES LMAO !!!

So, you're saying you're a totally freaking blind fanboy...
    Cool.
