Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising.
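As a quick sanity check, the dimensions quoted above are internally consistent (a trivial arithmetic sketch using only the figures given in the text):

```python
# Sanity check of the card dimensions quoted in the text.
gtx680_pcb_in = 10.0    # GTX 680 PCB length (a standard 10" card)
gtx670_card_in = 9.5    # GTX 670 total length, cooler included
gtx670_pcb_in = 6.75    # GTX 670 PCB length

# The GTX 670 PCB is quoted as 3.25" shorter than the GTX 680's:
print(gtx680_pcb_in - gtx670_pcb_in)   # 3.25
# ...which leaves this much cooler hanging off the back of the PCB:
print(gtx670_card_in - gtx670_pcb_in)  # 2.75
```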

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted to use a blower they needed a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top, the reference GTX 670 is outfitted with a 9.5” long fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.



As for the PCB, as we mentioned previously, the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


414 Comments


  • CeriseCogburn - Saturday, May 12, 2012 - link

    Yes, too true for the ranters to respond to, and every gaming card is a want not a need.

    The node shrinking is beginning to reach its limit anyway - we're getting down to literally not very many rows of molecules in between paths to buffer electrical disturbances inside the chips, which is beginning to cause uncontainable tunneling damage.

    28nm is at about a 51 atom thick lattice, with 22nm, 16nm, and 14nm reaching down to so delicate a buffer - about 26 atoms thick beside the electron flow - that dual layering, "tri-gate", high-K materials, or some quantum mechanics plasma containment breakthrough is soon needed.

    Given that level of amazing sophistication, and a coming near limit on node shrink due to reaching into counting atoms on both hands and feet that make up "the thickness of the wire protection", perhaps the pricing complainers need a bit more humility and thankfulness, instead of a constant demand of moar and faster and better for less.
  • Spunjji - Thursday, May 10, 2012 - link

    Congratulations on missing the point.
    -slow clap-

    I for one will be holding off on a purchase, because I don't feel like pissing my cash away on something that is not offering good value for money. That != expensive.
  • CeriseCogburn - Friday, May 11, 2012 - link

    That person never missed any point.
    You for one appear to be the delay artist. You run around moaning and bitching about prices, when in fact what you really want is others to purchase now so you can suck up the discounts at a later date reaping the future competition crunch and resultant price drops available because the early purchasers you attacked made it all possible.
    I don't have a problem with a person who bides their time and tries to score a deal when prices eventually drop. It does bother me though, when they go around forums biting the hand that is actually feeding them.
    Enjoy your dried up second or third tier penny pincher second class when you can finally afford it.
  • SlyNine - Saturday, May 12, 2012 - link

    I can complain about anything I want. But you're COMPLAINING about complainers. Which makes you kinda a hypocrite.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    And you're complaining about a person complaining about complainers....

    See why having some info and a real data point - or a point on the cards reviewed relating to the current discussion - becomes important in a post?

    This one has none.
  • Pantsu - Thursday, May 10, 2012 - link

    At least in Germany it seems like the cheapest 7970s go for ~400€, and the same goes for the GTX 670, so it's not such a cut-and-dried win for the GTX 670, at least in Europe.

    It's a great card from Nvidia, and sure to drop the current prices to a more reasonable level. I almost feel like I should've waited, but then again, I've already played with my 7970 for five months; I suppose that could be worth the 100€ it has lost from its value so far.
  • Morg. - Thursday, May 10, 2012 - link

    My poor man, read the benchmarks. Even today the 7970 is a much better card, if only because it doesn't have the 2GB bottleneck that's gonna make you hurt one year from now.
  • Spunjji - Thursday, May 10, 2012 - link

    I'm still not sure I believe this RAM "bottleneck" argument. No way to know until a year from now, unfortunately.
  • CeriseCogburn - Thursday, May 10, 2012 - link

    It's already clear the cores are maxed out - so no future game will be bringing some 2GB vs 3GB gaming advantage with these cards.
    Trying to crank it up enough and hack 50 high-rez textures into Skyrim in order to cause a frame rate drop on the 680 is as ephemeral as stable AMD drivers, and just doesn't happen.
    Common sense and the plain facts already make the answer absolutely clear to anyone who hasn't got a deranged mind scrubbed into fantasy by rampant fanboyism.
    If lack of personal experience is the problem, a hundred reviews already have solved the riddle - the answer is the cores cannot push more; they puke out before the memory does.
    "Enjoy" the "memory advantage" at "unplayable frame rates" and "0 minimums" in a crawling, stuttering, frustrating "proof".
    I'm certain a thousand morons will join the crusade and like religious zealots claim victory as blood shoots from their eyes and they fall down in epileptic seizure from the blinking monitor and zero to fifteen fps frames.
    LOL
    A victory indeed.
  • Pantsu - Thursday, May 10, 2012 - link

    I've read and done benchmarks myself quite enough, thank you. I doubt there's going to be a bad bottleneck a year from now, and when 2 GB is not enough, a 7970 won't be enough either.

    In any case I wasn't referring to the 670 being a better card; I would still buy a 7970 (a custom one) since I play at 5760x1080, where it's faster than the 680 when overclocked. But if I had waited these five months, I could've saved maybe 150€. Oh well, it's not like consumer electronics is ever a good investment...
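For what it's worth, the "atoms thick" figures quoted in the comment thread above roughly check out if you divide the process node by silicon's lattice constant (~0.543 nm). This is a crude back-of-the-envelope sketch, not a device-physics model - real gate and fin dimensions are smaller than the node name implies:

```python
# Rough check of the comment thread's "atoms thick" node figures,
# assuming one atomic layer per silicon lattice constant (~0.543 nm).
# Back-of-the-envelope approximation only.
SI_LATTICE_NM = 0.543  # silicon lattice constant in nanometers

for node_nm in (28, 22, 16, 14):
    layers = node_nm / SI_LATTICE_NM
    print(f"{node_nm} nm ~= {layers:.0f} lattice constants")
```

28 nm works out to roughly 52 lattice constants and 14 nm to roughly 26, in line with the "about 51" and "about 26 atom" figures in the comment.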
