Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long, but the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising; if anything it was to be expected.

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower they needed a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM in cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


  • Galidou - Sunday, May 13, 2012 - link

LoL you're still so much into it, it's almost like you wanna make me feel that you're gonna save the world with your knowledge. Pollution will be eradicated by the light of your measurements (in millimeters, that is).

Sorry if my English ain't at your level, maybe that's why you believe I'm young, but it is in fact my third language. The only fanboy here is you, I know that this 670 is AMAZING, if I were in the market for a $300 video card with what I see now, I'd dish an extra 100 to get it without hesitation.

    OMG sonny boy you had to mention this like it was all the supernatural 6mm difference... who gives a darn but you... If you're that old and ''responsible'' commenting like you do about video cards, WOW, it's even worse than I think, the word responsible might even have to change definition just because of you.

    I have never seen someone so irresponsible in forums when speaking about video cards. And I'm not the only one who might think that way for sure.

You've been doing personal attacks on this forum on a regular basis and all that because of what, because of your knowledge in video cards. I'd prefer to be the most stupid man on earth instead of using any form of knowledge the way you do.
    Reply
  • CeriseCogburn - Sunday, May 13, 2012 - link

    Another gigantic wall of text with nothing on the topic, and nothing but attacking. Goodbye, you missed all the discussions, and all you've got to add now is your hatred. Reply
  • Galidou - Monday, May 14, 2012 - link

I never said your discussions were full of lies or anything, it's just the way you bring up your argumentation. It makes me feel like not everyone will listen to you because of that. You do it in a harsh way and bring everyone down with you at the same time.

    You think calling everyone an ignorant is really necessary to explain anything related to the topic even if their argumentation is flawed? I don't think so. I already know Nvidia won for this gen, and they won big time. It's nice to see some nice bang for your bucks at the top, and I mean, if it wasn't for the fact that I only run in 1080p and the most anticipated game I'll play is diablo 3, I'd get one right now. But darn it dude, calm down...
    Reply
  • CeriseCogburn - Thursday, May 31, 2012 - link

    You calm down amd fanboy, you LOST, amd LOST, and is LOSING, and amd is near broke.
    And you're very upset over it.
    Reply
  • Gastec - Tuesday, November 13, 2012 - link

    Aha! Busted! You are looking forward for AMD to get broke, closed so we would all be forced to buy video cards from only one corporation: nVidia. Because that way your share of money will get bigger. You greedy bastard! Reply
  • jamyryals - Friday, May 11, 2012 - link

    While highly entertaining to me, your comments are actually a bit disturbing when one thinks about what you are like in real life. Reply
  • CeriseCogburn - Friday, May 11, 2012 - link

    Yes, don't forget to personally attack me, use the usual blabber the most ignorant and clueless fools on the net use to do so, it's a very short list, so it won't be too hard for you unless you try to remember how to spell basement.
    Now back to the topic I brought up.
    It's a sad time for amd fanboys and no amount of lies can help.
    After the reviewer smacked down all 79xx CF setups as not able to recommend, we have this very next follow up review - and it's easy to say everyone is absolutely amazed by the massive performance of this next step nVidia GTX670 - beating amd and their fanboys at the very heart and I do mean their dark little love it to death fanboy talking point amd heart - die size / power use / price perf / fan noise...

    It's a total and complete smackdown, exceeding even in muliti monitor with the nVidia 3+1surf - a complete smackdown - no area left for the and fanboy to grab onto - extreme sadness the era has ended.

    Consolation prize is vehemently claiming amd "OC's better", but it's a very difficult and voltage increasing road of heat and instability to that little nugget - while the nVidia fan comfortably uses a new technology for OC, and "enjoys some OC anyway" even without touching a thing, if we are to believe the angry and defeated amd fans protestations..
    Reply
  • SlyNine - Saturday, May 12, 2012 - link

    You need to take a step back and read your comments. You seem to think you're the only person with a valid opinion. Reply
  • CeriseCogburn - Sunday, May 13, 2012 - link

    Oh more personal attacking ?
    You can have your opinion, even if it's ridiculous and stupid, and rest assured, I may use the facts to prove that it is, and thus, you may whine I seem to think only my opinion is valid.

    I see now you went on a half page excuse rant for amd in the prior page trying to justify it's terrible loss with much spinning and distortions - look buddy, why don't you take some of your own advice ?

    It's obviously very important to you, given your actions - so it would be better if you made more attempts like the one I just referred to, and also had the guts to correct those spinning and lying for amd, as then I wouldn't be so busy.

    We can thank snakefist for pointing out past page 25 here that the nVidia core size is 294mm on a side not 300mm - so there we go again, another lie by an amd fan corrected...

    ( to be fair there was some misinformation concerning that comporting to the error the poster made)

    Thanks for telling me to read my own posts, I assure you I do, although now I've skipped reading your entire rant on the prior page. It's laughable BTW.

    Reply
  • haakon_k - Saturday, May 12, 2012 - link

    While a bit entertaining to me, your comments are actually highly disturbing when one thinks about what you are like in real life.

    *my version
    Reply
