Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising, if not outright expected.
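For readers keeping score of the measurements, the following minimal Python sketch lays out how the figures above relate; the 10” value used for the GTX 680’s PCB is an assumption based on that card’s full-length PCB, and is what makes the stated 3.25” savings work out.

    # Quick sanity check of the dimensions quoted above (all figures in inches).
    # Assumption: the GTX 680's PCB runs the full 10" length of that card, which
    # is consistent with the 3.25" savings figure given for the GTX 670's PCB.
    gtx670_card_length = 9.5    # GTX 670 overall length, cooler shroud included
    gtx670_pcb_length = 6.75    # GTX 670 PCB alone
    gtx680_pcb_length = 10.0    # assumed full-length GTX 680 PCB

    cooler_overhang = gtx670_card_length - gtx670_pcb_length  # 2.75" of cooler past the PCB
    pcb_savings = gtx680_pcb_length - gtx670_pcb_length       # 3.25", matching the text

    print(f"Cooler overhang past the PCB: {cooler_overhang} in")
    print(f"PCB length saved vs. GTX 680: {pcb_savings} in")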

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted to use a blower, the cooler itself needed to be large. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU already sits fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners that will be stepping in with open air coolers of their own designs.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, the lower TDP of the GTX 670 has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


  • CeriseCogburn - Friday, May 11, 2012 - link

    Here we are treated to 5 paragraphs of attack on the 600 series, note the extreme phrasing given against, the "known problem" of the GTX cards, not the "inexplicable" results that means something is wrong other than with the amd card when it loses.

    This contrasts with the bland put downs the 670 compared to the 680 and 570 receive when they win by enormous comparative margins in the rest of the game pages.

    So the reviewer has a field day here:
    " Overall performance isn’t particularly strong either. Given the price tag of the GTX 670 the most useful resolution is likely going to be 2560x1600, where the GTX 670 can’t even cross 30fps at our enthusiast settings."

    Completely unmentioned of course after the jab at pricing just for the 670, same price as the 7950 that fares not playably better here and gets spanked the other 75% of time, is the 5760x1200 higher resolution where the 670 achieves even higher frame rates than 30, surpassing 30 all the way up to 35.6, just below 35.8 for the 7950, two tenths of one frame.
    Somehow, that isn't mentioned, only the lower 2560 resolution with lower frame rates (for all the cards) but the 670 singled out as the only card that has peaked at "given the price".

    Later in the review completely unplayable frame rates for all cards in a test is used to attack just the 570, too, for lack of memory. Forget the fact that none of the other cards had playable frame rates.

    Eye candy was turned down at the triple monitor resolution but that has never before made 2560 most useful for reviews here, especially with lower frame rates for all the cards tested at the lower resolution settings. Only when we can cut down nVidia is such a statement useful, and it is very definitely confined to just the nVidia card then.
    So avoided is the paltry frames of the other competing cards even at "easier" 5760 settings.
    If the 670 is no good past 2560, then neither are any of the other cards at all, except the 7970? Maybe the reviewer suddenly has decided 5760 gaming is no good.

    " Even 1920x1200 isn’t looking particularly good. This is without a doubt the legitimate lowpoint of the GTX 670. "
    Well, then the 7950 doesn't look good at 1920 either, less than 1 fps difference, not to mention the 680 that is within a couple frames.
    If we take the reviewer's words with their total meaning, what we have is the unsaid statement that - only possibly the 7970 should be used for this game at 5760, no other card though.

    Now - a total examination of the Crysis Warhead gaming page fps charts reveals this:
    Every card is unplayable at every resolution except for the latest respective releases in 1920X1200 chart.
  • BrunoLogan - Friday, May 11, 2012 - link


    ... still unreachable for me where budget is concerned. The 660Ti is what I'm looking for but as I saw somewhere it may be 5 or 6 months away and I don't know if I can wait that long. My old C2D needs replacement. I may just grab a 560Ti and later down the road get a 760Ti, skipping the 6xx generation... bittersweet :-\
  • shin0bi272 - Friday, May 11, 2012 - link

    what gpu do you have now? You said you need to upgrade your core 2 cpu but didnt say what you have for a gpu.

    Im still running a gts 250 and getting pretty good fps on everything but BF3 at pretty high specs on a 19x12 monitor. Your major issue with games today is they are made for consoles with dx9 cards in them that came out in 2006. So with some exceptions (crysis, metro 2033, and bf3 for example) you dont really need a huge card for anything other than playing all the new games at max spec. Sure everyone wants to do that but you dont necessarily NEED to. I played metro2033 and had physx on and it was easily playable in the 30-40 fps range.

    So if you upgrade your cpu (which btw you really only need to upgrade to a quad core if it's a gaming rig, to get the max fps a cpu upgrade will give you) and keep your current gpu, then when money allows grab a 670 or 685 or whatever AMD has to offer in your price range.
  • BrunoLogan - Friday, May 11, 2012 - link

    Do you really want to know? I have a 9600GT :-P Also, I can't call it an upgrade as in "adding some new parts and keeping some of the existing ones". I'm really buying a new machine PSU and tower included. That's why I say it's bittersweet to buy a new machine with previous generation graphics.
  • shin0bi272 - Monday, May 14, 2012 - link

    hmmm well see what you have for cash left over after buying the important parts. Honestly buying a new system now is a good idea. Ivy bridge being released which drops the prices of sandy bridge (which as I said before will give you the same FPS in game) and even throwing $125 at a 550ti will be a good jump till the end of summer when the 685 comes out, and the 550 wouldnt give you the best fps so youd still be wanting to upgrade.
  • shin0bi272 - Monday, May 14, 2012 - link

    oh and a gts250 is a rebadged and die shrunk 8800gtx
  • medi01 - Saturday, May 12, 2012 - link

    Hard to justify buying a 560Ti, unless you somehow decided to only buy nVidia.
    7850 consumes much less power while being ahead performance wise.
  • CeriseCogburn - Saturday, May 12, 2012 - link

    7850 costs more, and has the massive disadvantage of being plagued with the now featureless in comparison amd crash pack latest, 12.4, to be followed on by another disaster within a months time.
  • medi01 - Sunday, May 13, 2012 - link

    Why don't you kill yourself, dear nVidia zealot with a lot of time to post utter nonsense?
  • CeriseCogburn - Sunday, May 13, 2012 - link

    LOL - hey man the facts aren't issues to be sad about.

    If I get depressed I'll let you know so you can help. :)
