Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising, if not outright expected.
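For anyone who wants to double-check those figures, here’s a quick illustrative sketch in Python; note that the 10” length for the GTX 680’s PCB is not quoted directly above, but is inferred from the 3.25” difference:

```python
# Sanity check of the quoted dimensions (all in inches).
card_length = 9.5          # complete GTX 670 card, shroud included
pcb_length = 6.75          # the actual GTX 670 PCB
gtx680_pcb_length = 10.0   # inferred: 6.75" plus the quoted 3.25" difference

cooler_overhang = card_length - pcb_length       # cooler hanging past the PCB
pcb_difference = gtx680_pcb_length - pcb_length  # vs. the GTX 680's PCB

print(f"Cooler overhang past the PCB: {cooler_overhang:.2f} in")  # 2.75 in
print(f"Shorter than the GTX 680 PCB: {pcb_difference:.2f} in")   # 3.25 in
```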

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower as the cooler they needed a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners that step in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680, the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.
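As an aside, that chip count lines up neatly with the GTX 670’s 2GB of GDDR5 on a 256-bit bus. A quick illustrative sketch (the capacity and bus width are the card’s published specifications, not something visible on the PCB):

```python
# Illustrative breakdown of how 8 GDDR5 chips form the GTX 670's
# 2GB / 256-bit memory configuration (published specs, assumed here).
total_capacity_gb = 2
bus_width_bits = 256
chip_count = 8

bits_per_chip = bus_width_bits // chip_count         # 32-bit interface per chip
gbit_per_chip = total_capacity_gb * 8 // chip_count  # 2Gb (256MB) per chip

print(f"{bits_per_chip}-bit memory interface per chip")  # standard x32 GDDR5
print(f"{gbit_per_chip}Gb (256MB) per chip")
```

Doubling to 16 chips (two per 32-bit channel, run in clamshell mode) is where rear mounting normally becomes a necessity rather than a choice.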

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


  • chizow - Thursday, May 10, 2012 - link

    Hehe ya exactly.

    It seems as if many of the apologists willing to give AMD and Nvidia a pass on 28nm pricing are new to the GPU game, or the tech toy game for that matter. They just have no historical perspective at all, which I'm sure thrills the marketing/finance guys over in Silicon Valley... they can't sink their meathooks into these guys fast enough.

    But yeah, it's not about being able to afford it, it's about being able to buy them and actually feel good about the purchase looking back a week, a month, a year from now. Most people only need to be burned once to learn their lesson; hopefully those early adopters who bought 7970/7950 and GTX 680/690 have learned theirs.
  • CeriseCogburn - Friday, May 11, 2012 - link

    Hehe, exactly. Read above, you're wrong. Prove otherwise or shut up. Calling everyone else stupid when you have ZERO EVIDENCE presented doesn't cut it.
  • chizow - Friday, May 11, 2012 - link

    Zero evidence? Try 10 years of GPU benchmarks. Seriously, try looking at some before commenting, because it's obvious you haven't paid close enough attention in the past....
  • SlyNine - Saturday, May 12, 2012 - link

    You never provide any valid evidence. But this topic has been debated and historical data is all the proof you need.
  • SlyNine - Saturday, May 12, 2012 - link

    I didn't want to spend $500, but I did want something 2x as fast as my 5870. So the GTX 680 fit the bill.

    But honestly I wouldn't expect cards to keep evolving at the same rate. Cards used more slots and more power to keep doubling and tripling in performance. That trend cannot go on for long because there aren't enough slots and power to do so.

    I fully expect all performance increases now to be from architecture improvements and node changes.
  • CeriseCogburn - Friday, May 11, 2012 - link

    You guys can claim anything you want with your bland, data-absent talking point, so let's examine just how far out of sane bounds you two are (you and chizow) - and BTW I'd appreciate the reviewer's talking point as well. A full quote will be fine.

    Let's skip any insane retentiveness with fancy specific wording you've used as a ruse taken absolutely literally in the hopes that those not noticing a perfectly literal and absolutely strict translation would be fooled by the idea presented, and do a cursory examination:

    We can start with the G80 - it morphed into the G92 and G92b which all you slam artists screamed was a rebranded absolute clone.

    So we'll take the 9800GTX+ vs. the next released card, the GTX280.
    GTX280 morphed into GTX285
    We can move from the GTX285 to the GTX480 - the GTX480 morphed into the GTX580.
    So we move from GTX580 to GTX680.

    Although I have not strictly gone insane talking point ruse literal and used a sort of CHEAT you people espouse with your lousy nm + new die move talking point, what I have is what people actually EXPERIENCED AS CARD RELEASES - so we'll have to go with those.

    9800GTX+ to GTX280 (wow that gigantic upgrade)

    GTX285 to GTX480 (wow that gigantic upgrade)

    GTX580 to GTX680 (wow that gigantic upgrade)

    Yes, you people are full of it. That's why you keep AVOIDING any real information and figured that if you could spew just the talking point, no one would notice what lying crap it is.
  • chizow - Friday, May 11, 2012 - link

    Once again, your arguments are full of fail or you simply don't know how to read simple benchmarks. Using your own, flawed comparisons, you would see:

    9800GTX+ to GTX280 (wow that gigantic upgrade) +70% OR MORE

    GTX285 to GTX480 (wow that gigantic upgrade) +60% OR MORE

    GTX580 to GTX680 (wow that gigantic upgrade) +30%......

    The reason your comparison is flawed, however, is that you are comparing half-generations when you compare a refresh to a new generation, so the gap in both time and performance is diminished, which decreases the value for your $$$.

    Correct comparisons are as follows, and when you look at it that way, the GTX 680 and all other 28nm parts look EVEN worse in retrospect:

    8800GTX to GTX 280: +75% OR MORE
    GTX 280 to GTX 480: +80% OR MORE
    GTX 480 to GTX 680: +40%.....

    or if you prefer refresh to refresh but a full generation between them:

    9800GTX+ to GTX 285: +75% or MORE
    GTX 285 to GTX 580: +80% or MORE
    GTX 580 to GTX 685???: ???

    Seriously, just read some benchmarks and then come back, because it seems you're the only one who doesn't get it.
  • CeriseCogburn - Saturday, May 12, 2012 - link

    For shame, for shame - more lies... no wonder you're yelling, and you NEVER used benchmarks....
    Let's use Anand's historical data!

    And let's do it correctly. We go from the card we have now, to the card they release. People now have the GTX580 - and that's what they see in the charts as you whine, bitch and moan and spread your Charlie D FUD. Likewise in former tier jumps/releases.
    So we will use the TRUTH, not some anal retentive abject cheat worded just so, as you two fools suggest we should, to spin your lies ever more "in your favor".

    9800GTX+ to GTX280, Crysis, 25fps to 34fps

    http://www.anandtech.com/show/2549/11

    There it is chizow, and it ain't no 75%! NOT EVEN CLOSE

    GTX285 to GTX480, Crysis, 30fps to 44fps

    http://www.anandtech.com/show/2977/nvidia-s-geforc...

    Guess you lost on that 80% lie there, too.

    GTX580 to GTX680, 41fps to 48fps

    http://www.anandtech.com/show/5699/nvidia-geforce-...

    NOPE. Certainly not half of the former two moves, with NONE at any 80%, let alone 75%, not even 50%, can't even say 33% increase, EVER.

    Sorry chizow, your lies, and big ones at that, won't do.
  • SlyNine - Saturday, May 12, 2012 - link

    You're cherry picking. A huge fallacy. Some benchmarks do show 75%+

    Plus we are talking about 8800GTX to GTX280. We are not talking about rebadged products with very minor changes.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    ROFL - I used what was first in line, I provided the information - I BROKE THE GIANT LIE you, the AMD fanboy, have used with ZERO EVIDENCE.

    Let's face it, I'm 100% more accurate than you ever attempted to be, merely spewing your talking point in as big a fat fib fashion as you could muster.

    Of course that's the usual crap from the liars.
    Now you'll just whine that the facts I presented, versus the no facts you ever presented or even linked to, "don't count".

    R O F L - loser (what else do you expect me to do man - you're making it very difficult to support your opinion, guy)
