Meet The GeForce GTX 670

Because of GK104's relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a fairly standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. The GTX 670, meanwhile, takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long, but the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising.

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower-style cooler they did need a long shroud. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners that will be stepping in with open air coolers of their own designs.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, due to the GTX 670’s lower TDP NVIDIA has been able to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM in cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.

414 Comments

  • CeriseCogburn - Sunday, May 13, 2012 - link

    28 isn't playable and yes, the nVidia card really wins that game, as we see in the 680 test, which I had to point out as you, the amd fanboy despite your claim to own a 680, never noticed like all the rest, including the author to a large degree, in the 680 release review here.

    So take your temporary Gaming Evolved amd game driver hack that disabled nVidia's winning sweep across all resolutions and celebrate, a fool of course needs to do so, you're welcome for pointing it out.

    (roll eyes at the immense ignorance, again)

    Now enjoy the video http://www.youtube.com/watch?v=J0eZEdpsgjk

    I know amd told us many, many times, as did so many little named posters here for so many years, that nVidia was evil for TWIMTBP work and what they did to the amd cards performance in those efforts.

    Maybe they should note this little problem that developed ?

    ROFL
  • Galidou - Sunday, May 13, 2012 - link

    ''(roll eyes at the immense ignorance, again)''

    Such a troll again, such a lack of respect, indirect attacks, most useless comment on earth... immense ignorance, comon we're speaking about video cards, someone not knowing that you can change a buck for 4 quarters might be an ignorant... unless he's a tribal that lived in africa all his life... and then the ''again'' omg the inflammatory stuff you're able to say in 7 words sentence... I'm unsure you realize what you do... you're being really mean...

    All that and I could only say you're mean... I guess respect ain't give to everyone, sad it can't be bought, because it must be the most important value, ALL AROUND, a man can have. Everything starts with respect, real wisdom is acquired through respect.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    All you do is attack, this is the last response you get from me unless you're on topic with a point, and as respectful as you demand others be, which you are not, you're the worst so far, a pure troll with no points at all.

    The other posters are trying to make points, not you. Attention for you is over.
  • Galidou - Sunday, May 13, 2012 - link

    He mentions the puny 1,25gb because the card CAN'T run it and is usually a good performer against the competition at that resolution. You say it beats the 7870 in the next page, by 1-2 fps, I don't even call that a beating. Plus in a game that favors Nvidia.

    ''This is the kind of crap we have to put up with here, at least we who have a brain and can see what's going on.''

    I think you meant ''we who are Nvidia's fanboys''

    It may not be the most neutral of comments but it's not the worst, you're just looking to find things against Nvidia and enumerate them because that's what Nvidia's fanboys do. What do they do, get mad as soon as there's a little reason to.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    No other card can run it with gaming frame rates in his test.
    Since he didn't point that out, I DID.

    I guess he'll have to work harder to find a valid reason to dis the card since he has claimed nVidia is keeping it on, and the egg sure looks like that is correct - a lot of stock present.

    Now, you validated my point, but want to call it petty, but a similar thing happens on nearly every gaming page.

    At least what I point out is some pathetic grammar nazi problem, huh, which all of the rest of you seem to love to do so much, in every review posting it appears to be a contest for that, and I agree with the reviewer that PM'ing him to offer a correction is actually adult like and responsible.

    That of course is different than what bothers me, and we shall see, a valid complaint is usually responded to in a good way, so there may be some thought ahead, I certainly expect positive results for my efforts.
    As is so often claimed here by those in charge they respond to readers and what they want, so this fits that case fine.

    On that note along those lines I already advocated a single gaming chart with the collated data of the various cards in their overclocked performance states, as it seems to me that would be a nice added feature to reviews and would settle some of the rancor on the reviewed cards sometimes having OC'ed versions added in their release.
  • SamsungAppleFan - Thursday, May 10, 2012 - link

    first of all, thanks for the article, but you guys (anandtech) take wayyyyyy too long between new articles. get on it guys, seriously. and i'm still waiting on my gs3 full review lol.
  • GlItCh017 - Thursday, May 10, 2012 - link

    This card can really shine if it likes what you like. I'm a huge FPS fan, so in scenario's such as BF3 the GTX 670 vs. Radeon HD 6970 is a no brainer.
  • Morg. - Thursday, May 10, 2012 - link

    Sure, like most FPS's won't be on Unreal4 instead of frostbite ;)

    That engine, for some reason, favors nVidia and I don't think it's a good GPU performance metric, although if you're going to play frostbite content, it's clearly important.
  • Morg. - Thursday, May 10, 2012 - link

    Nevermind, I knew why but I hadn't seen it mentioned yet.
    http://www.geforce.com/whats-new/articles/johan-an...
    So .. buying a graphics board because it is favored by a botched graphical engine which is temporary - meh. If you plan on keeping your pc 2 or 3 years, fck the marketing, get raw power instead ;)
  • antef - Thursday, May 10, 2012 - link

    Are you saying AMD has the better GPU for most FPS titles outside ones running Frostbite?
