Meet the GeForce GTX 680

All things considered, the design of the GeForce GTX 680 is not a radical departure from the GTX 580, but at the same time it has some distinct differences owing to the fact that its TDP is some 50W lower than the GTX 580's.

Like the past GTX x80 cards, the basic design of the GTX 680 is that of a blower. A radial fan at the rear of the card sucks in air and pushes it towards the front of the card. Notably, due to a combination of card length and the fan position, the “wedge” around the fan has been done away with. NVIDIA tells us that this shouldn’t significantly impact the cooling of the card, particularly since it has a lower TDP in the first place, but when used in SLI it will remove some of the breathing room that the GTX 580 enjoyed.

Looking at the fan itself, compared to the GTX 580 the fan has been moved from the center of the card to the top of the card. This is due to NVIDIA’s port configuration, which uses a stacked DVI connector that consumes what would have normally been part of the exhaust vent on the GTX 580. We’ll get into the port configuration more in a minute, but for the moment the significance is that because the GTX 680 only has half a vent, NVIDIA has moved the fan up to match the vent.

On that note, the repositioning of the fan also had its own ramifications. Because the fan is now so close to the top and at the same time so close to the rear, NVIDIA went with a unique method of arranging the PCIe power sockets. Rather than having them side-by-side as we’ve seen on countless NVIDIA cards in the past, the sockets are stacked on each other in a staggered configuration. With the fan otherwise occupying the space that one of the sockets would take up, this configuration allowed NVIDIA to have two sockets without lengthening the card just to fit another socket. Overall this staggered design is not too difficult to work with, though with one socket facing the opposite way it might require some cable repositioning if you have a well maintained cable run.

Moving on, when we remove the shroud on the GTX 680 we see the fan, baseplate, and heatsink in full detail. NVIDIA is using an aluminum fin stacked heatsink, very similar to what we saw on the GTX 580. Underneath the heatsink NVIDIA is using a set of three heatpipes to transfer heat between the GPU and the heatsink. This is as opposed to the vapor chamber on the GTX 580, and while we have no way to test this empirically, given the high efficiency of vapor chambers it’s likely that the heatpipe setup isn’t quite as efficient, though to what degree we couldn’t say.

Finally, after removing the fan, baseplate, and heatsink, we can see the PCB in full detail. Unlike GF110 and GF114, GK104 is not capped with an IHS, allowing the heatsink to make direct contact with the GPU die. Meanwhile arranged around the GPU we can see the 8 2Gb GDDR5 RAM modules that give the GTX 680 its 2GB of RAM. These are Hynix R0C modules, which means they’re rated for 6GHz, the stock memory speed for the GTX 680. Overall the card measures 10” long with no overhang from the shroud, making it 0.5” shorter than the GTX 580.
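For reference, the memory configuration above works out as a quick back-of-the-envelope calculation. Note that the 256-bit bus width is the GTX 680's published spec rather than something stated above, so treat this as a sketch against that assumption:

```python
# Capacity: 8 GDDR5 packages at 2Gb (gigabits) each
modules = 8
density_gbit = 2
capacity_gb = modules * density_gbit / 8   # bits -> bytes: 2.0 GB total

# Bandwidth: 6GHz effective data rate across a 256-bit bus
# (256-bit is the GTX 680's published spec, assumed here)
effective_clock_ghz = 6.0
bus_width_bits = 256
bandwidth_gbps = effective_clock_ghz * bus_width_bits / 8  # 192.0 GB/s

print(capacity_gb, bandwidth_gbps)
```

This is why 8 relatively small 2Gb modules still deliver the card's full rated memory bandwidth: bandwidth depends on the data rate and bus width, not the per-module capacity.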

Looking at the top of the card, as always we see the SLI connectors. Following in the footsteps of the GTX 580, the GTX 680 features 2 SLI connectors, allowing for up to 3-way SLI.

Meanwhile at the front of the card we see the I/O bracket. As we alluded to previously, the GTX 680 uses a stacked DVI design here; NVIDIA has done everything they can to keep the DVI ports at the very bottom of the card to avoid impeding airflow, but the upper DVI port still occupies roughly 40% of what would otherwise be the vent. Altogether the GTX 680 features 2 DL-DVI ports, a full size HDMI port, and a full size DisplayPort.

While NVIDIA has used DVI and HDMI ports for quite some time, this is the first time NVIDIA has included DisplayPort on a reference design. Unfortunately we find that this ruffles our feathers a bit, although this isn’t strictly NVIDIA’s fault. As we’ve covered in the past, DisplayPort comes in both a full size and miniDP configuration – AMD in particular has used miniDP since the Radeon HD 6800 series in 2010. And while we’re happy to see DisplayPort finally make it into an NVIDIA reference design, the fact that it’s a full size DisplayPort is less than encouraging, because at this point in time the full size port has largely been displaced by miniDP.

Ultimately the fault for this lies more with VESA than with NVIDIA, but it’s indicative of a larger problem in the DisplayPort ecosystem: both full size DP and miniDP are equally valid and equally capable ports. While full size DisplayPort has the distinction of coming first, thanks in large part to Apple it has largely been displaced by miniDP as the most common variant on source devices. The problem is that both miniDP and full size DisplayPort are now in wide use; wide, redundant use.

At this point, desktop computers and video cards coming with full size DisplayPorts is silly at best and frustrating at worst. The laptop guys aren’t going to give up miniDP due to the space savings, and there’s no especially good reason to use full size DisplayPort on desktops when miniDP offers the same functionality. We would rather see the PC industry standardize on miniDP across all source devices, and thereby eliminate any ambiguity with regards to what cables or adaptors are necessary. DisplayPort adoption has been slow enough – having 2 variants of the port on source devices only makes it more confusing for everyone.

Finally, while we’re on the subject of display connectivity we quickly took a look at how the idle clockspeeds of GTX 680 are impacted by the use of multiple displays. With 2 displays GTX 680 can utilize its full idle clocks, but only if both displays are connected via a TMDS type connection (DVI/HDMI) and run with identical timings. But if different timings are used or if one display is connected via DisplayPort, then the GTX 680 will shift to its low power 3D clocks. However if we expand that to 3 monitors and enable NVIDIA Surround, then the GTX 680 can operate at full idle regardless of whether DisplayPort is used or not.


  • Targon - Thursday, March 22, 2012 - link

Many people have been blasting AMD for price vs. performance in the GPU arena in the current round of fighting. The thing is, until now AMD had no competition, so it was expected that the price would remain high until NVIDIA released their new generation. So, expect lower prices from AMD in the next week.

    You also fail to realize that with a 3 month lead, AMD is that much closer to the refresh parts being released that will beat NVIDIA for price vs. performance. Power draw may still be higher from the refresh parts, but that will be addressed for the next generation.

    Now, you and others have been claiming that NVIDIA is somehow blowing AMD out of the water in terms of performance, and that is NOT the case. Yes, the 680 is faster, but isn't so much faster that AMD couldn't EASILY counter with a refresh part that catches up or beats the 680 NEXT WEEK. The 7000 series has a LOT of overclocking room there.

    So, keep things in perspective. A 3 percent performance difference isn't enough to say that one is so much better than the other. It also remains to be seen how quickly the new NVIDIA parts will be made available.
  • SlyNine - Thursday, March 22, 2012 - link

I still blast them; I'm not happy with the price/performance increase of this generation at all.
  • Unspoken Thought - Thursday, March 22, 2012 - link

Finally! Logic! But it still falls on deaf ears. We finally see both sides getting their act together to get minimum feature sets in, and we can't see past our own biased squabbles.

How about we continue to push these manufacturers for what we want and need most: more features, better algorithms, and last and most important, revolutionize and find new ways to render, aside from vector based rendering.

Let's start incorporating high level mathematics for fluid dynamics and such. They have already absorbed PhysX and moved to DirectCompute. Let's see more realism in games!

    Where is the Technological Singularity when you need one.
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Well, the perspective I have is amd had a really lousy (without drivers) and short 2.5 months when the GTX580 wasn't single core king w GTX590 dual core king and the latter still is and the former has been replaced by the GTX680.
So right now Nvidia is the absolute king, and before now, save that very small time period, Nvidia was core king with the 580 for what... 1.5 years?
    That perspective is plain fact.
    FACTS- just stating those facts makes things very clear.
    We already have heard the Nvidia monster die is afoot - that came out with "all the other lies" that turned out to be true...
    I don't fail to realize anything - I just have a clear mind about what has happened.
    I certainly hope AMD has a new better core slapped down very soon, a month would be great.
    Until AMD is really winning, it's LOSING man, it's LOSING!
  • CeriseCogburn - Thursday, March 22, 2012 - link

Since amd had no competition for 2.5 months and that excuses its $569.99 price, then certainly the $500 price on the GTX580, which had no competition for a full year and a half, was not an Nvidia fault, right? Because you're a fair person and "finally logic!" is what another poster supported you with...
    So thanks for saying the GTX580 was never priced too high because it has no competition for 1.5 years.
  • Unspoken Thought - Saturday, March 24, 2012 - link

Honestly, the only thing I was supporting was the fact that he is showing that perspective changes everything, a fact exacerbated when bickering over marginal differences that are driven by the economy when dealing with price vs. performance.

    Both of you have valid arguments, but it sounds like you just want to feel better about supporting nVidia.

You should be able to see how AMD achieved its goals with nVidia following along playing leapfrog. Looking at benchmarks, no, it doesn't beat it in everything, and both are very closely matched in power consumption, heat, and noise. Features are where nVidia shines and gets my praise, but I would not fault you for having either card.
  • CeriseCogburn - Friday, April 06, 2012 - link

    Ok Targon, now we know TiN put the 1.3V core on the 680 and it OC'ed on air to 1,420 core, surpassing every 7970 1.3V core overclock out there.
Furthermore, Zotac has put out the 2,000MHz 680 edition...
    So it appears the truth comes down to the GTX680 has more left in the core than the 7970.
    Nice try but no cigar !
    Nice spin but amd does not win !
    Nice prediction, but it was wrong.
  • SlyNine - Thursday, March 22, 2012 - link

Go back and look at the benchmarks, idiot. The 7970 wins in some situations.
  • SlyNine - Thursday, March 22, 2012 - link

    In Crysis max, 7970 gets 36 FPS while the 680 only gets 30 FPS.

Yes, somehow the 7970 is losing. LOOK AT THE NUMBERS, HELLO!!???

In Metro 2033 the 7970 gets 38 and the 680 gets 37. But yet in your eyes, another loss for the 7970...

The 7970 kills it in certain GPU compute tasks, and has hardware H.264 encoding.

In a couple of games, where you already get massive FPS with both, the 680 boasts much much higher FPS. But then in games where you need the FPS, the 7970 wins. Hmmm.

    But no no, you're right, the 680 is total elite top shit.
  • eddieroolz - Friday, March 23, 2012 - link

    You pretty much admitted that 7970 loses in a lot of other cases by stating that:

    "7970 kills it in certain GPU compute..."

Adding the modifier "certain" to describe a win is like admitting defeat in every other compute situation.

    Even for the games, you can only mention 2 out of what, 6 games where 7970 wins by a <10% margin. Meanwhile, GTX 680 proceeds to maul the 7970 by >15% in at least 2 of the games.

    Yes, 7970 is full of win, indeed! /s
