Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a fairly standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. The GTX 670, meanwhile, takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75”, a full 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising, and was arguably to be expected.
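As a quick sanity check of those dimensions, here’s a minimal sketch. The GTX 680 PCB length is our assumption, inferred from the 680’s quoted 10” card length and the 3.25” difference above; everything else comes straight from the measurements in this section.

```python
# Dimension check for the GTX 670 reference design (all lengths in inches).
GTX670_CARD_LENGTH_IN = 9.5    # total card length, cooler included
GTX670_PCB_LENGTH_IN = 6.75    # actual PCB length
GTX680_PCB_LENGTH_IN = 10.0    # assumption: the 680's PCB spans its full 10" length

cooler_overhang = GTX670_CARD_LENGTH_IN - GTX670_PCB_LENGTH_IN  # 2.75
pcb_savings = GTX680_PCB_LENGTH_IN - GTX670_PCB_LENGTH_IN       # 3.25

print(f'Cooler overhang beyond the PCB: {cooler_overhang:.2f} in')
print(f'PCB length saved vs. GTX 680:   {pcb_savings:.2f} in')
```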

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower they did need a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners that step in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16 or 24 chips, as 8 or 12 will easily fit on the same side.
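For a sense of how that 8-chip layout maps to the card’s memory, here’s a back-of-the-envelope sketch. The 2GB/256-bit figures are the GTX 670’s published specifications rather than anything stated in this section, so treat them as an assumption.

```python
# Memory layout arithmetic for the reference GTX 670, assuming the card's
# standard 2GB / 256-bit configuration.
TOTAL_MEMORY_MB = 2048   # 2GB total
BUS_WIDTH_BITS = 256     # GK104 memory bus width
CHIP_COUNT = 8           # 4 front-mounted + 4 rear-mounted, per the teardown

per_chip_mb = TOTAL_MEMORY_MB // CHIP_COUNT       # 256MB (2Gbit) per chip
per_chip_bus_bits = BUS_WIDTH_BITS // CHIP_COUNT  # 32-bit interface per chip

print(f'{CHIP_COUNT} chips x {per_chip_mb}MB, {per_chip_bus_bits} bits each')
```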

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.
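As a rough illustration of why this power configuration is comfortable for a 170W card, the sketch below totals the in-spec power available, assuming the reference card’s two 6-pin sockets. The 75W slot and 75W-per-6-pin limits come from the PCIe specification, not from NVIDIA or this article.

```python
# In-spec power budget for the reference GTX 670 (assumed 2x 6-pin design).
SLOT_POWER_W = 75      # PCIe x16 slot limit, per the PCIe spec
SIX_PIN_POWER_W = 75   # per 6-pin connector, per the PCIe spec
SIX_PIN_COUNT = 2      # the side-by-side sockets described above
CARD_TDP_W = 170

power_budget = SLOT_POWER_W + SIX_PIN_COUNT * SIX_PIN_POWER_W  # 225W
headroom = power_budget - CARD_TDP_W                           # 55W

print(f'In-spec power budget: {power_budget}W ({headroom}W over the {CARD_TDP_W}W TDP)')
```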

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.

Comments

  • chizow - Sunday, May 13, 2012 - link



    For shame... once again you fail horribly. Once you're done cherry-picking individual benchmarks you'll see Fermi does outclass Tesla by close to 100%, especially once AA is enabled:

    http://www.anandtech.com/bench/Product/317?vs=305

    Feel free to also compare to techpowerup or computerbase compilations to see similar results, since you obviously haven't been following the industry long enough to experience these revolutionary upgrades first-hand.

    This is all common knowledge to GPU enthusiasts though, including the author of this article. Please update your frame of reference before you comment further. Thanks.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    Cherry picking? This very site is using the first bench at every release page: CRYSIS, your favorite amd game, right now?

    Cherry picking, when the SAME VENDOR CARD (nVidia) is used?

    Now nVidia is against itself in Crysis, and that's cherry picking? I used the first game to actually look, because all we've had otherwise is you and your amd fanboys flapping lips... with ZERO evidence...

    Now I'll look at your stupid link, which is ZERO EVIDENCE for now, since you are beyond lazy and cannot even copy a bit of text from it or claim what you think it proves.

    LOL, so it proves nothing, right? You've got nothing. You have failed.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    " ince you obviously haven't been following the industry long enough to experience these revolutionary upgrades first-hand. "

    Dude, I have been following daily for likely more years than your age.
    Good luck with that stupidity.
    80% and 75% is crap.

    33% down to 20% or so is fine, but we have another problem: the GTX 580 was an enormous movement in total, as it was a tacked-on improvement after the 480, and being the card just prior it is distorting your tiny moaning brain, as you add on your ten-years blabbering talking point with your long-time amd fanboyism to bolster it.

    It took me a few minutes to prove you lied. :)

    You've got a bit of a point, but not much, and the added features of this new nVidia release ( adaptive v-sync, frame rate target (precisionX) ) negates all of that anyway - not to mention the driver add going all the way back to the 8 series.

    You're just moaning for no reason and lying too much while doing it.
  • Iketh - Thursday, May 10, 2012 - link

    your logic is flawed beyond belief
  • chizow - Thursday, May 10, 2012 - link

    Really? How so? If it was so flawed it should be simple for someone as clever as you to poke it full of holes.

    I'm waiting. :/
  • CeriseCogburn - Saturday, May 12, 2012 - link

    Here's a hole so large you'll be moaning when I'm done, and not about pricing or performance increase.

    A lot of people may wait a couple generations to upgrade, or go from a top tier card 2 or 3 gens back to a new secondary card, or vice versa, cross over between the big two, etc. etc.

    In your retentive and specialized moaning, you've restricted end user reality to a single specific instance you've handily outlined as your only metric, and have declared your single path to be the only qualifying upgrade doctrine to use.
    Now there's a zeal of rectal tightness one can easily surmise no end user gaming enthusiast has ever adhered to in their purchasing history, in the entire world, not once, ever.

    So what we really have is a much varied user base in the card(s) they currently run, and a quite varied distance and jump: node, architecture, two cards to one, one card to two, using a current card as a PhysX boost for a hot and cheap upgrade, etc.

    Thus, a person can wait out the $499 nVidia flagship launch, or one or two, or some in-between node shrink (G80 to G92b, 280 to 285, 470 or 480 to 580, etc.), and make the jump NOT when your choice-choking and frankly stupid single-choice-only, stroke-my-moaning-firebrand demanded purchase scenario rears its stupid dead head.

    In other words, the $499 you complain about is not the second $499 the real gamer and end user customer spent; they've been sitting around, and are only spending once, not on your minuscule single-purchase, own-only-before-and-after upgrade rant line....

    So people figure it out in spite of your complaining, and make an enormous jump in their upgrade, or sell off an SLI or CF set and barely spend a dime for a good "reset" for a future dual card perf bump on the cheap, or take the second or third or prior tier for a spin with a healthy discount from the release you hate with such passion.

    You see, you've become a one trick pony, the one trick an amd fanboy can rage about and pretend to have a point - now I wouldn't mind so much if your 75% and 80% crap wasn't so obviously a doubly inflated lie - but on the other hand the initial constraint you introduce is near worthless for any current end user your hoped for perfectly having a fit scenario would apply to !. -

    NOTE: I'm so close to current performance because in the last ten years of those wonderful and enormous increases chizow has so adeptly been gassing the entire room about in hyper ventilation, that I think I'll keep my recently purchased flagship(s) that enjoyed not long ago that great and gigantic leap of power chizow loves in his tiny red heart so much ! Thanks chizow ! I can sit here a big fat winner with all my money in my pocket and it's such a poor increase I am win still for zero dollars !

    See how that works, genius? :-)
  • SlyNine - Saturday, May 12, 2012 - link

    Your constant use of twisted circular logic is amusing.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    That's no rebuttal at all. We'll go with you and your chizow pal's upgrade path right - the one and only you and he allow for your argument ... that's not twisted..

    (rolls eyes)
  • CeriseCogburn - Sunday, May 13, 2012 - link

    You're actually a person who did exactly what I said SlyNine, you're perfect personal proof, as you have more than once stated you went from 5870 and jumped 2 flagship releases and bought the 680.

    Now, even after personally doing this, you attack my explanation calling it circular logic.

    Look in the mirror, amd fanboy. I am sorry your amd fanboy lifestyle took an upset this round, and you personally decided the 7970 sucked compared to the 680, and jumped from your 5870.

    You obviously couldn't bring yourself to move to the small performance increase the just prior 6970 was, slapping chizow with that brick unconsciously, you attack me, the person who correctly outlined what actually occurs, that you actually did, by your own words, elsewhere more than once, in these posts.

    ROFL - you really, really, really blew it badly that time, SlyNine.
  • BulletSpongeRTR - Thursday, May 10, 2012 - link

    Exactly, I'm a lowly line cook for a large restaurant chain making $10/hr. But I have SAVED my pennies for this card and will be ordering one today. If an individual cannot rein in their expenses and put away a little here and there to buy what they want (and let's be honest, a 670 is a want, NOT a NEED) then they should not be complaining. I'm nearly done acquiring parts for my first build and will be glad when it's done. One more "Summer of Ramen" is all I can do.
