As is typical for NVIDIA’s OEM products, the company has once again quietly released its newest OEM video card. The latest addition is the GeForce GTX 660 OEM, which comes hot on the heels of last week’s launch of the retail GeForce GTX 660 Ti.

                          GTX 680          GTX 670          GTX 660 Ti       GTX 660 OEM
Stream Processors         1536             1344             1344             1152
Texture Units             128              112              112              96
ROPs                      32               32               24               24
Core Clock                1006MHz          915MHz           915MHz           823MHz
Boost Clock               1058MHz          980MHz           980MHz           888MHz
Memory Clock              6.008GHz GDDR5   6.008GHz GDDR5   6.008GHz GDDR5   5.8GHz GDDR5
Memory Bus Width          256-bit          256-bit          192-bit          192-bit
VRAM                      2GB              2GB              2GB              1.5GB/3GB
FP64                      1/24 FP32        1/24 FP32        1/24 FP32        1/24 FP32
TDP                       195W             170W             150W             130W
Transistor Count          3.5B             3.5B             3.5B             3.5B
Manufacturing Process     TSMC 28nm        TSMC 28nm        TSMC 28nm        TSMC 28nm
Launch Price              $499             $399             $299             N/A

The GTX 660 OEM is another GK104-based video card, and like NVIDIA’s other OEM Kepler parts it’s a fairly conservative configuration. NVIDIA is shipping this card with only 6 of 8 SMXes enabled, the first time we’ve seen a desktop GK104 part with fewer than 7 SMXes. This further reduction in SMXes brings the GTX 660 down to 1152 CUDA cores and 96 texture units, while on the raster side of things it’s unknown whether NVIDIA has disabled a whole GPC or is disabling SMXes in two separate GPCs. Meanwhile, like the retail GTX 660 Ti, this part has also had a ROP/L2/memory cluster disabled, giving it the same combination of 24 ROPs, 384KB of L2 cache, and a 192-bit memory bus.
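For the curious, Kepler’s per-unit arithmetic makes those figures easy to verify: each GK104 SMX carries 192 CUDA cores and 16 texture units, while each ROP/L2/memory cluster contributes 8 ROPs, 128KB of L2, and a 64-bit memory controller. A minimal sketch of that math in Python (the helper function and field names are our own, for illustration only):

```python
# GK104 configuration arithmetic. The per-SMX and per-cluster figures come
# from NVIDIA's published Kepler specifications; the helper is illustrative.
CORES_PER_SMX = 192        # CUDA cores per SMX
TEX_PER_SMX = 16           # texture units per SMX
ROPS_PER_CLUSTER = 8       # ROPs per ROP/L2/memory cluster
L2_KB_PER_CLUSTER = 128    # L2 cache per cluster, in KB
BUS_BITS_PER_CLUSTER = 64  # memory controller width per cluster, in bits

def gk104_config(smxes: int, clusters: int) -> dict:
    """Derive headline specs from enabled SMX and memory cluster counts."""
    return {
        "cuda_cores": smxes * CORES_PER_SMX,
        "texture_units": smxes * TEX_PER_SMX,
        "rops": clusters * ROPS_PER_CLUSTER,
        "l2_kb": clusters * L2_KB_PER_CLUSTER,
        "bus_width_bits": clusters * BUS_BITS_PER_CLUSTER,
    }

print(gk104_config(smxes=6, clusters=3))  # GTX 660 OEM: 1152 cores, 96 tex, 24 ROPs, 384KB, 192-bit
print(gk104_config(smxes=7, clusters=3))  # GTX 660 Ti:  1344 cores, 112 tex, 24 ROPs, 384KB, 192-bit
```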

Looking at its specs, NVIDIA seems to be particularly interested in getting a sub-150W Kepler card out – the GTX 660 OEM is rated for 130W and requires only one PCIe power connector – so compared to the other desktop GK104 parts the clockspeeds have also taken a hit. The GTX 660 OEM is clocked at just 823MHz core with an 888MHz boost clock, about 10% lower than the GTX 660 Ti. The memory clock is also a hair lower at 5.8GHz, an odd configuration since there is no standard 5.8GHz GDDR5 speed bin, meaning NVIDIA still has to equip the card with 6GHz GDDR5 and simply downclock it. At the same time, NVIDIA is equipping the GTX 660 OEM with a fully symmetrical 1.5GB or 3GB of RAM, which, coming from the asymmetrical 2GB GTX 660 Ti, is probably a combination of cost-cutting and recognition of the fact that the OEM market isn’t quite as finicky about memory capacity as the retail market is.
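To put the lower memory clock in perspective, the theoretical bandwidth loss is only a few percent. A quick back-of-the-envelope calculation (our arithmetic, not an NVIDIA figure):

```python
# Theoretical memory bandwidth: effective data rate (GT/s) x bus width (bits) / 8 bits per byte.
def mem_bandwidth_gbs(data_rate_gts: float, bus_width_bits: int) -> float:
    return data_rate_gts * bus_width_bits / 8

oem = mem_bandwidth_gbs(5.8, 192)   # GTX 660 OEM: ~139.2 GB/s
ti = mem_bandwidth_gbs(6.008, 192)  # GTX 660 Ti:  ~144.2 GB/s
print(f"{oem:.1f} GB/s vs {ti:.1f} GB/s ({oem / ti:.1%} of the GTX 660 Ti)")
```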

Taken altogether, this puts the theoretical compute/rendering performance of the GTX 660 OEM at around 75% of the GTX 660 Ti’s, with the wildcard once again being the impact of memory bandwidth, which is almost unchanged. This is a larger step between cards than what we saw in the past generation of products (e.g. GTX 560 vs GTX 560 Ti), but at the same time NVIDIA’s OEM products are usually underspecced compared to their retail counterparts. For that same reason, however, we’d caution against reading too much into the GTX 660 OEM as a sign of what the eventual retail GTX 660 will be like. NVIDIA has been known to use different clockspeeds, different core configurations, and even different GPUs entirely, so everything is still on the table for now.
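That ~75% figure can be sanity-checked with a simple FP32 throughput estimate using the specs from the table above; this is back-of-the-envelope math on theoretical peaks, not measured performance:

```python
# Peak FP32 throughput: 2 FLOPs per CUDA core per clock (fused multiply-add).
def gflops(cores: int, clock_mhz: int) -> float:
    return 2 * cores * clock_mhz / 1000

oem = gflops(1152, 888)  # GTX 660 OEM at its boost clock: ~2046 GFLOPS
ti = gflops(1344, 980)   # GTX 660 Ti at its boost clock:  ~2634 GFLOPS
print(f"{oem:.0f} vs {ti:.0f} GFLOPS -> {oem / ti:.0%} of the GTX 660 Ti")
```

At boost clocks the ratio works out to just under 78%, and at base clocks to roughly 77%; texture throughput scales identically since texture units are tied to SMX count. So the ~75% figure holds up within rounding.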

Source: Fudzilla


  • HisDivineOrder - Wednesday, August 22, 2012 - link

A $200 price point might be a huge loss or it might not. It depends on whether the 680 core that wound up being reused across the 670, 660 Ti, and now the 660 OEM is truly the 560/560 Ti/460 replacement it seems to be.

If so, then it's not a huge loss for nVidia to use these flawed chips for lower-end cards, because they were built from the ground up to be mainstream GPUs to begin with.

Sure, they're "losing" money by not selling them as 670s or 680s, but the fact is you can only sell so many $400 and $500 GPUs. The market for that's only so big. Lots of consumers want to buy 560/560 Ti/460-class GPUs. The market for THAT is so much bigger, and if nVidia has a good supply of chips just waiting on consumers to buy them, they might chop some performance off and throw 'em out there rather than let chips collect dust waiting on consumers to decide to buy $400+ video cards. Especially when said chips are so much smaller than the competition's GPUs, and they have basically one product line at 28nm while their competitor is busy churning out several 28nm products, each one being squeezed by nVidia's solitary line.

I suspect these chips are a lot cheaper than you're giving them credit for. You imagine this like it's a 580-sized GPU when in fact it's a 560-sized GPU that they're upcharging simply because their competitor's chip lacked in performance and couldn't keep up. I suspect nVidia makes a lot of money off selling these chips at $200, $300, $400, and $500, because they were built for cards in the $200-$300 range to begin with.

    Great news for nVidia. Not so great for AMD or the consumers, but AMD's slacking off is what got us here.
  • Malphas - Thursday, August 23, 2012 - link

    "A $200 price point might be a huge loss or it might not. "

    "I suspect these chips are a lot cheaper than you're giving them credit for."

I don't think people realise how cheap chips actually are to manufacture; the sale price mostly goes towards R&D, marketing, building fabs, etc. rather than the cost of the chip itself. Chips are usually sold for at least ten times what it costs to manufacture them. So even though putting a binned chip that sells for $500 in a high-end card into a $200 low-end card might seem close to selling at a loss, it's almost certainly still well above cost price (which is probably under $50).
  • CeriseCogburn - Thursday, August 23, 2012 - link

    It seems that way because A M *'in D always manages to lose money on every chip they put in every card.
    LOL
    Oh dat....
    While nVidia always makes that sweet and large profit margin...
    Oh doggone dat ! (angry face)
    The commies no like.
  • Malphas - Friday, August 24, 2012 - link

Your posts are embarrassing and childish. Being a fanboy of a corporation (be it nVidia or AMD, or anyone else) is imbecilic.
  • CeriseCogburn - Saturday, August 25, 2012 - link

    No, my points are 100% CORRECT.
The fanboys and FOOLS are the idiots telling each other, with exactly ZERO data, what a card's cost to market is.
Here, I'll help the crybaby idiots like you.
Go to Charlie's and dig up his cost analysis (with DOLLAR DATA) on the two card wars. It's Fermi vs amd's junk, I believe.
At least there, one could start to talk like they had a SINGLE FACT to use.
Without that, dig up the wafer cost, manufacturing cost, good parts per wafer, AND THEN YOU NEED THE PRICE CHARGED BY AMD OR NVIDIA ...
Good luck without some big bucks to buy up the facts, because they ARE NOT AROUND.
If you really want to do the work: buy the sales data, then extract discrete, then go to the same Q and get the profit/loss chart from amd or nvidia, then good luck separating out discrete cards from that... including a SINGLE release.
    No, YOU'LL BE CALLING WIKILEAKS FOR ANY ANSWER AT ALL.
  • Galidou - Wednesday, August 29, 2012 - link

"The fanboys and FOOLS are the idiots"

You just said it yourself; you're the biggest Nvidia fanboy I've ever seen in my whole life. So in the end, from what I've said and the above statement, we can only deduce one thing, and I leave it up to you...

He never said your points were not correct. He only said they are displayed childishly, and for that, they've got to be the most childish things I've ever read about video cards. I've been arguing with you just to point out how embarrassing you're making things since we had that first discussion about the 7970. To this day, you haven't realized a freaking thing. You're either really dumb or a kid in his teenage hormonal crisis.

He doesn't wanna go anywhere to check your comments because they're surrounded by so much hate that no one cares about you. Speak like a normal person and maybe you'll get some credit.
  • Malphas - Thursday, August 23, 2012 - link

And yes, in addition to that, like you said, the GK104 was originally intended as a midrange part before Nvidia realised how much they'd overestimated the competition and decided to rebrand it as high end instead. The GTX 680 was originally intended as a 560 Ti replacement and thus meant to sell at that kind of price point. So in this instance the profit margin is no doubt even larger than it normally is with chips.
  • CeriseCogburn - Saturday, August 25, 2012 - link

I can't believe how stupid you people are, and how incorrect rumors drive your gourds like cowboys whipping pigs into the corral.
  • Galidou - Wednesday, August 29, 2012 - link

I'll give you one freaking clear example of what you could do to get some credit, so people will read you without saying you're crazy. If you don't care about my example NOW, then you're just lost.

Your comment above would have had the same result if you had just written:

This is just speculation, Malphas, rumors; we can't say what the GTX 680 was supposed to replace, as it's a totally different part from the last gen. But based on how it performs, they surely knew it wasn't meant to replace the midrange GTX 560 Ti.

There you go: polite, still as effective, readable, and respectful. Your point gets across, and people won't get mad at you for telling them:

Stupid gourds, cowboys whipping pigs... we got it, your insult vocabulary has been exposed, we know what you're capable of. Now write things with more sense, please, would you?
  • leliel - Wednesday, August 22, 2012 - link

The 550 Ti/660 Ti asymmetric VRAM setup makes my skin crawl. I'm sure it's a great marketing point, but I'd rather not pay for extra memory when there's a performance hit involved, especially when I doubt I do anything that uses more than 1.5GB in the first place. Very interested in a retail product and its OCability!
