Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising, if not outright expected.

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower for the cooler they needed a longer card. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU already sits fairly close to the rear of the card. Meanwhile the choice of a blower in the first place seems largely driven by the fact that this is an x70 card rather than by any technical necessity – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so from a technical standpoint the choice is effectively arbitrary (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners that will be stepping in with open air coolers of their own designs.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally, at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full-size HDMI 1.4 port, and 1 full-size DisplayPort 1.2 port. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.

Comments

  • eachus - Friday, May 11, 2012 - link

    "I would like to see a 7970OC comparison? I was trying to find a 680 but gave up and got an 7970oc for $499 it's running at 1025Mhz and should be faster than a 680."

    No need really--unless you want to use the latest and greatest benchmarks in an on-line pissing contest. Let's face it. ANY high-end card, including now what AMD calls mid-range (the 7800 series), will run a 1920x1200 display with no trouble. Upgrade to three displays (5760x1200) or one 2560x1600 display, and now the high-end cards make sense. But keep track of the evolving drivers...the original 7970 benchmarks are now dead letters given the 12.4 drivers. Fixing lots of little gotchas in the drivers means that now you don't need two cards to drive a three-screen display. This is true of both nVidia and AMD. (Compare the benchmarks here with the launch benchmarks for the 680. The 7970 wins several benchmarks now, which will last until nVidia has some non-launch drivers. Since you are getting more performance than you originally paid for, just sit back and enjoy it.)

    Oh, and one other point which is starting to become an issue: 3d graphics. When I find a 3d display that doesn't give me headaches with a few hours' use, I'll buy one. Right now though, it is already clear that 120 fps displays will be needed for that, and AMD has been considering 90+ fps out of the ROPs as sufficient--at least at 2560x1600, non-Crossfire. :-(
  • CeriseCogburn - Friday, May 11, 2012 - link

    Well eachus, amd fanboy friend, instead of calling him an idiot owner of the 7970 in a useless pissing contest, why don't you just tell him the truth?
    The 7970 LOSES @ 1,025 core clock to the stock 680.
    And yes, amd sucks when it comes to 3d gaming, while nVidia is the king and has the market absolutely cornered.
  • CeriseCogburn - Friday, May 11, 2012 - link

    Here's a comparison dunce.
    http://hexus.net/tech/reviews/graphics/37209-gefor...
    The amd card loses even when it alone is overclocked.
    Enjoy that amd fail.
  • Chris Simmo - Thursday, May 10, 2012 - link

    Excellent review. Excellent card! One thing I didn't see was image quality, both in-game and video/HT capability. I know it's not a common thing amongst gamers and self-builders, but we do complete systems and have always found that AMD does a better job with video than NVIDIA. Most noticeable are old TV show rips or youtube videos. Do you have any opinion? Just want to know if it's a good all-rounder or only good at games.
  • CeriseCogburn - Friday, May 11, 2012 - link

    Well, you must have enjoyed the years-long amd youtube issues with constantly crashing adobe flash, having to turn off the HW accel, etc.
    I sure hope your customers didn't mind.
  • Chris Simmo - Friday, May 11, 2012 - link

    Actually we have had few reports, and they have been pretty even on both sides. The only time I have had to turn off hardware acceleration is with the TMT5 SIMHD engine with any HD7000 card, and I have seen the flash problems you are talking about, but strangely only on notebooks, not desktops. I do believe a fix is in the works for TMT though. I really hope so. Anyway, I am not looking at it from a fanboy point of view. I will give my customers what they want, but will push the most capable product for their application, and for a long while now this has generally been AMD graphics with Intel CPUs. This has worked out well, as we have had an incredibly low failure rate.

    Support is important, and in the last 6 months I have worked with both ASUS and AMD support to correct motherboard issues and driver bugs; as long as they get fixed I am happy, providing it’s not a really big issue to start with.

    I’m sure you have worked with 1200 graphics cards in the last 4 years and know exactly what you are talking about, and offer support to match the warranty, which in our case is 2 years.

    I am excited about the 600 series though, and will look to stock the GTX670 when the prices settle.
  • CeriseCogburn - Saturday, May 12, 2012 - link

    Hundreds of graphics cards but not 1200 graphics cards, so yes I do know what I'm talking about, and I'm not stuck in a corporate environment where information and standard responses are a required line to be toed, so it's likely my experience this past year is actually far wider than yours.

    In any case, you allude to "pretty even on both sides" with flash player problems, but before you claimed youtube was great with amd, or at least claimed some sort of superior image quality - and mentioned old movie clips, but by your own comparison you must be servicing nVidia cards as well, or perhaps you are not, and the choice was made and now...

    So to me that appears to be your corporate environment talking, as you'll go with whatever your masters say is happening, and it is said to be better then, by default necessity - as whoever decides in the power structure will in effect be demanding employees toe that line; it is, after all, foolish to do otherwise, especially in a forum where you could be easily discovered doing so.

    I frankly find your surprising lack of knowledge of flash player issues disturbing, but attribute it to the call center corporate support model.

    Anyway, go ahead with your issue, and find the answer you need. I think you should do well with whatever it is you need for an answer, and no doubt that has already been determined.

  • Chris Simmo - Sunday, May 13, 2012 - link

    You assume a lot. There isn't much of your argument there that actually holds any truth, and you are the foolish one to assume otherwise. To me it appears you are just looking for a fight. No better than a common troll. And as a result I won't bother with a rebuttal or correcting any of your assumptions.
  • Ryan Smith - Friday, May 11, 2012 - link

    Hi Chris;

    Since image quality is almost entirely a function of architectural improvements as opposed to the individual SKUs, we don't do a major IQ writeup for every card. For Kepler, in-game image quality hasn't changed (NVIDIA hasn't changed any of the fundamental algorithms). As for HT/video, we're hoping to have something up soon once we can secure a more HTPC-suitable GK107 card.

    -Thanks
    Ryan Smith
  • Chris Simmo - Friday, May 11, 2012 - link

    Thanks Ryan. Yeah, I knew it's an architectural thing. I build some very high-powered HTPCs as gaming systems, and since this card will no doubt end up shipping with smaller coolers, it seems like a great card to put in a gaming HTPC. Looking forward to the HTPC review though.
