Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75”, a full 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA has strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising.

In any case, this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower for a cooler they needed a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top: as we previously mentioned, the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike on the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – as with the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.

Comments

  • Gastec - Tuesday, November 13, 2012 - link

    Every comment of yours is an attack on ATi/AMD video cards or the people who seem to be using them (maybe). Why?
    Do you get paid to do negative publicity for AMD on the review sites? Because having an ATi card die on you in the middle of some important event in your gaming life (like raiding in WoW, am I close or am I close? ;-) could not be the only reason.
  • shin0bi272 - Friday, May 11, 2012 - link

    I think the reason for the missing memory chips is that they will be releasing the 685 in August or September, which is supposed to be 4GB and run on a 512-bit bus. It could be possible to increase the size of the GPU core and double the amount of RAM and still have it on a card this length.

    30% faster than the 670 (the 685 is supposed to be 25% faster than the 680, and the 670 is 5% slower than the 680) on the same size card, but using 2x8-pin connectors instead of 2x6-pin. Now imagine an aftermarket or water cooler on it... yeah.

    You'll get great FPS on all those brand new console ports.
  • KivBlue - Friday, May 11, 2012 - link

    $400 for a graphics card is just too much.
  • medi01 - Saturday, May 12, 2012 - link

    For me too. In the $200-ish range it looks like the AMD 7850 / 7870 are the only reasonable options.

    PS
    Honestly I don't get all the hype about the 680/670. The cards are only marginally better than AMD's offerings (losing in some games, winning in others).

    The power consumption difference according to TechPowerUp is only 2 watts at idle and about 9 watts at full load. Not a big deal either.

    Basically a slight price drop by AMD on the 7950/7970 (for whoever really wants those) once these cards actually become available, and that's it.

    I also wonder how many "enthusiasts" with multi-monitor setups in need of a faster card are out there.

    PPS
    The worst part of it would be nVidia releasing a confusing mix of completely different lower-end cards under the same name, to confuse consumers.
  • CeriseCogburn - Saturday, May 12, 2012 - link

    I guess considering you think $200 equals $335 and that also equals $250, we can say your comment equals a big fat lie, and when a big fat lie is what one immediately starts off with, everyone knows something is WRONG.
  • Gastec - Tuesday, November 13, 2012 - link

    Again you attack someone who posted a comment about AMD cards, just because. You are obviously a troll, and someone from this still-respected computer magazine should ban you.
  • Gastec - Tuesday, November 13, 2012 - link

    Yes, but people who buy these have enough money to buy even the $3000-4000 Tesla K20 ones. Many of them have money from their parents, if you catch my drift.
  • RegEDDIT - Sunday, May 13, 2012 - link

    I managed to buy one from Amazon before they went out of stock, and I must say, I am pleased. BF3 plays like a champ, Skyrim is smooth as butter, and Adobe Premiere edits like a champ now with Nvidia hardware acceleration. This is on a 1920x1080 monitor with an old Q6700 quad core @ 2.666GHz and 800MHz RAM. I do not expect to buy another card for a long while.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    Here's COMPUTE SOFTWARE BASE in action.

    " Adobe Premiere edits like a champ now with Nvidia hardware acceleration "

    Nvidia wins. amd loses in compute.
  • Zebo - Sunday, May 13, 2012 - link

    The 7950 has 40-50% OC potential, being severely down-tuned @ 800MHz.

    If AMD is smart they will release an 1100MHz version and wreck the 670's party.

    If you're an overclocker you'd be dumb to buy the 670, with its limited controls, over the potential of the 7950. Let alone if you're on water.
