Meet the GeForce GTX 680

All things considered, the design of the GeForce GTX 680 is not a radical departure from the GTX 580, but at the same time it has some distinct differences owing to the fact that its TDP is some 50W lower than the GTX 580's.

Like the past GTX x80 cards, the basic design of the GTX 680 is that of a blower. A radial fan at the rear of the card draws in air and pushes it towards the front of the card. Notably, due to a combination of card length and the fan position, the “wedge” around the fan has been done away with. NVIDIA tells us that this shouldn’t significantly impact the cooling of the card, particularly since it has a lower TDP in the first place, but when used in SLI it will remove some of the breathing room that the GTX 580 enjoyed.

Looking at the fan itself, compared to the GTX 580 the fan has been moved from the center of the card to the top of the card. This is due to NVIDIA’s port configuration, which uses a stacked DVI connector that consumes what would normally have been part of the exhaust vent on the GTX 580. We’ll get into the port configuration more in a minute, but for the moment the significance is that because the GTX 680 only has half a vent, NVIDIA has moved the fan up to match it.

On that note, the repositioning of the fan also has its own ramifications. Because the fan is now so close to the top and at the same time so close to the rear, NVIDIA went with a unique arrangement for the PCIe power sockets. Rather than having them side-by-side as we’ve seen on countless NVIDIA cards in the past, the sockets are stacked on top of each other in a staggered configuration. With the fan otherwise occupying the space that one of the sockets would take up, this configuration allowed NVIDIA to fit two sockets without lengthening the card. Overall the staggered design is not too difficult to work with, though with one socket facing the opposite way it may require some cable repositioning if you have a well-maintained cable run.

Moving on, when we remove the shroud on the GTX 680 we can see the fan, baseplate, and heatsink in full detail. NVIDIA is using a stacked aluminum fin heatsink, very similar to what we saw on the GTX 580. Underneath the heatsink NVIDIA is using a set of three heatpipes to transfer heat between the GPU and the heatsink. This is as opposed to the vapor chamber on the GTX 580, and while we have no way to empirically test the difference, given the high efficiency of vapor chambers it’s likely that this setup isn’t quite as efficient, though to what degree we couldn’t say.

Finally, after removing the fan, baseplate, and heatsink, we can see the PCB in full detail. Unlike GF110 and GF114, GK104 is not capped with an IHS, allowing the heatsink to make direct contact with the GPU die. Meanwhile, arranged around the GPU, we can see the eight 2Gb GDDR5 RAM modules that give the GTX 680 its 2GB of RAM. These are Hynix R0C modules, which means they’re rated for 6GHz, the stock memory speed for the GTX 680. Overall the card measures 10” long with no overhang from the shroud, making it 0.5” shorter than the GTX 580.
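
For reference, the card’s memory figures fall out of some quick arithmetic. Below is a minimal sketch of that math, assuming the GTX 680’s published 256-bit memory bus; the helper function names are our own and purely illustrative.

```python
# Back-of-the-envelope math for the GTX 680's memory subsystem.
# Figures come from the review and NVIDIA's published specs; the
# function names are illustrative only.

def capacity_gbytes(num_modules: int, density_gbits: float) -> float:
    """Total memory in GB from module count and per-module density in gigabits."""
    return num_modules * density_gbits / 8  # 8 bits per byte

def peak_bandwidth_gbps(bus_width_bits: int, data_rate_ghz: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and effective GDDR5 data rate."""
    return (bus_width_bits / 8) * data_rate_ghz

print(capacity_gbytes(8, 2))           # 8 x 2Gb modules -> 2.0 GB
print(peak_bandwidth_gbps(256, 6.0))   # 256-bit bus at 6GHz effective -> 192.0 GB/s
```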

Looking at the top of the card, as always we see the SLI connectors. Following in the footsteps of the GTX 580, the GTX 680 features 2 SLI connectors, allowing for up to 3-way SLI.

Meanwhile at the front of the card we see the I/O bracket. As we alluded to previously, the GTX 680 uses a stacked DVI design here; NVIDIA has done everything they can to keep the DVI ports at the very bottom of the card to avoid impeding airflow, but the upper DVI port still occupies roughly 40% of what would otherwise be the vent. Altogether the GTX 680 features 2 DL-DVI ports, a full size HDMI port, and a full size DisplayPort.

While NVIDIA has used DVI and HDMI ports for quite some time, this is the first time NVIDIA has included DisplayPort on a reference design. Unfortunately we find that this ruffles our feathers a bit, although this isn’t strictly NVIDIA’s fault. As we’ve covered in the past, DisplayPort comes in both a full size and a miniDP configuration – AMD in particular has used miniDP since the Radeon HD 6800 series in 2010. And while we’re happy to see DisplayPort finally make it into an NVIDIA reference design, the fact that it’s a full size DisplayPort is less than encouraging, because at this point in time full size DisplayPort has largely been displaced by miniDP.

Ultimately the fault for this lies more with VESA than with NVIDIA, but it’s indicative of a larger problem in the DisplayPort community: both full size DP and miniDP are equally valid and equally capable ports. While full size DisplayPort has the distinction of coming first, thanks in large part to Apple it has largely been displaced by miniDP as the most common variant on source devices. The problem is that both miniDP and full size DisplayPort are now in wide use; wide, redundant use.

At this point, desktop computers and video cards coming with full size DisplayPorts is silly at best and frustrating at worst. The laptop guys aren’t going to give up miniDP due to the space savings, and there’s no especially good reason to use full size DisplayPort on desktops when miniDP offers the same functionality. We would rather see the PC industry standardize on miniDP across all source devices, and thereby eliminate any ambiguity over which cables or adapters are necessary. DisplayPort adoption has been slow enough – having two variants of the port on source devices only makes it more confusing for everyone.

Finally, while we’re on the subject of display connectivity, we quickly took a look at how the idle clockspeeds of the GTX 680 are impacted by the use of multiple displays. With 2 displays the GTX 680 can utilize its full idle clocks, but only if both displays are connected via a TMDS-type connection (DVI/HDMI) and run with identical timings. If different timings are used or if one display is connected via DisplayPort, then the GTX 680 will shift to its low power 3D clocks. However, if we expand that to 3 monitors and enable NVIDIA Surround, then the GTX 680 can operate at full idle regardless of whether DisplayPort is used.
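
To summarize that behavior, here is a minimal sketch of the clock selection logic as we observed it. This is our own model of the behavior, not NVIDIA’s driver logic, and all of the names are illustrative.

```python
# Illustrative model of the GTX 680's multi-monitor idle behavior described
# above. This is not NVIDIA's driver code; it simply encodes the observed cases.
from dataclasses import dataclass

@dataclass
class Display:
    connection: str  # "DVI", "HDMI", or "DisplayPort"
    timings: str     # e.g. "1920x1080@60Hz"

def idle_clock_state(displays: list[Display], surround: bool = False) -> str:
    if len(displays) <= 1:
        return "full idle clocks"
    # 3 monitors with NVIDIA Surround enabled: full idle regardless of DisplayPort.
    if len(displays) >= 3 and surround:
        return "full idle clocks"
    if len(displays) == 2:
        all_tmds = all(d.connection in ("DVI", "HDMI") for d in displays)
        identical_timings = len({d.timings for d in displays}) == 1
        if all_tmds and identical_timings:
            return "full idle clocks"
    # Mixed timings, or DisplayPort in a 2-display setup.
    return "low power 3D clocks"

# Two identical TMDS (DVI) displays keep the card at its full idle clocks.
print(idle_clock_state([Display("DVI", "1920x1080@60Hz"),
                        Display("DVI", "1920x1080@60Hz")]))
```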

Comments

  • Arbie - Friday, March 23, 2012 - link

    "I've always said, choose your hardware by application, not by overall results"

    Actually, that is what I said. But I wasn't as pompous about it, which may have confused you.

    ;)
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Well it's a good thing fair and impartial Ryan put the two games the 680 doesn't trounce the 7970 in up first in the bench lineup, so it would make AMD look very good to the chart and chan click-through crowd.
    Yeah, I like an alphabet that goes C for Crysis then M for Metro, so in fact A for AMD comes in first!
  • Sivar - Thursday, March 22, 2012 - link

    Many Anandtech articles not written by Anand have a certain, "written by an intelligent, geeky, slightly insecure teenager" feel to them. While still much better than other tech websites, and I've been around them all for some time, Anand is a cut above.

    This article, and a few others you've written, show that you are really getting the hang of being a truly professional writer.
    - Great technical detail without paraphrasing marketing material.
    - Not even the slightest hint of "fanboyism" for one company over another.
    - Doesn't drag on and on or repeat the same thing several times in slightly different ways.
    - Anand, who usually takes the cool articles for himself, had the trust in you to let you do this one solo.

    I would request, however, that you hyperlink some of the acronyms used. Even after being a reader since the Geocities days, it's sometimes difficult to remember every term and three letter combination on an article with so much depth and breadth.
    Also, for the sake of mobile users and image quality, there really needs to be some internal discussion on when to use which image format. PNG-8 for large areas of flat or gradient color, charts, screen captures, and slides -- but only when the source is not originally a JPG (because JPG subtly corrupts the image so as to ruin PNG's chance of compression) and JPG for pretty much all photographs. I wrote a program to analyze images and suggest a format -- Look for "ImageGuide" on Google Code.

    In any case, the fact that I can think of only the most minor of suggestions, as opposed to when I read a certain other website named after its founder of a much shorter name.
  • Sabresiberian - Thursday, March 22, 2012 - link

    I agree, another thorough review by one of the better people doing it on the internet. Thanks Ryan!

    As far as the dig on Tomshardware, I don't quite agree there. I notice Chris Angelini wrote the GTX 680 article for that website, and I'm very much looking forward to reading another thorough review.

    ;)
  • Sivar - Thursday, March 22, 2012 - link

    Tom's may have improved greatly since I last gave it another chance, but since not long after they were bought out, I've found the reporting to be flagrantly sensationalist and light on fact. The entity that bought them out, and the journalists he hired, are well known for just that. Many times I read the author's conclusion and wondered if he was looking at the same bar charts that I was.

    To be blunt, at times when people quoted their site, I felt as if I'd shifted into an alternate dimension where otherwise knowledgeable people were comically oblivious to the most egregiously flawed journalism. It was as if a group of Nobel prize winners were unthinkingly quoting Bill O'Reilly or Michael Moore on a political matter as if it was assumed they were a paragon of truth and even-headedness.
  • Sabresiberian - Thursday, March 22, 2012 - link

    Very well said. (I especially like the comment using both a staunch conservative and flaming liberal as examples of poor source material.)

    I do tend to look at specific writers, and probably give Toms too much credit based on that more narrow view. I freely admit to having a somewhat fanboy feel for the site, too, since it was one of the first and set a mark, at one time, unreached by any other site I knew about.

    I have been a bit confused by some statements made by some writers on that site, conclusions that didn't seem to be supported by the data they published. Perhaps it's time to step up and comment when that happens, instead of just interpreting my confusion as a lack of careful reading on my part (which happens to the best of us).

    ;)
  • Nfarce - Sunday, March 25, 2012 - link

    "It was as if a group of Nobel prize winners were unthinkingly quoting Bill O'Reilly or Michael Moore on a political matter"

    Well Obama, Al Gore, and Arafat were each given a Nobel Prize, so I'd hardly consider that entity a good reference point of analogy in validity. In any event, I welcome opinions from all sides. The mainstream "news" media long ago abandoned objective reporting. One is most informed by reading different takes on the same "facts" and formulating one's own opinion. Of course, you also have to research outside the spectrum for some information that the mainstream media will hide from time to time: like how bad off the US economy really is.
  • Ryan Smith - Thursday, March 22, 2012 - link

    Thanks for the kind words, though I'm not sure whether "slightly insecure teenager" is a compliment on my youthful vigor or a knock against my immaturity. ;-)

    Anyhow, we usually use PNGs where it makes sense. All of my photo processing is done with Photoshop, so I know ahead of time whether JPG or PNG will spit out a smaller image, and any blurring that may result. Generally speaking we should be using the right format in the right place, but if you have any specific examples where it's not, drop me a line (it will be hard to keep track of this thread) and I'll take a look.
  • IlllI - Thursday, March 22, 2012 - link

    OK, there seems to be some confusion here. Many times in the review you directly compare it to GF114 (which I think was never present in the 580 series), yet at the same time you say the 680 is a direct replacement for the 580.
    I don't think it is. What it DOES seem like, however, is that this 680 was indeed supposed to be the mainstream part, but the ATI competition was so weak that NVIDIA just jacked up the card number (and price).
  • CeriseCogburn - Friday, March 23, 2012 - link

    So Nvidia should have dropped the 680, their GTX580($450+) killer in at $299...
    Charlie D's $299 rumor owns internet group think brains.
