Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with power, temperature, and noise. Even more so than for single-GPU cards, this is perhaps the most important set of metrics for a multi-GPU card: poor cooling that results in high temperatures or ridiculous noise levels can quickly sink a multi-GPU card’s chances. Ultimately, with a fixed power budget of 300W or 375W, the name of the game is dissipating that heat as quietly as possible without endangering the GPUs.
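As an aside, those 300W and 375W ceilings fall out of the PCIe electromechanical spec: 75W from the x16 slot, plus 75W per 6-pin and 150W per 8-pin auxiliary connector. A quick sketch of that arithmetic (the connector wattages are the spec values; the function name is our own):

```python
# PCIe board power budget: 75W from the x16 slot,
# plus 75W per 6-pin and 150W per 8-pin auxiliary plug.
def board_power_budget(six_pin=0, eight_pin=0):
    return 75 + 75 * six_pin + 150 * eight_pin

print(board_power_budget(six_pin=1, eight_pin=1))  # 8+6-pin board: 300W
print(board_power_budget(eight_pin=2))             # 8+8-pin board like the GTX 690: 375W
```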

GeForce GTX 600 Series Voltages
Ref GTX 690 Boost Load: 1.175v
Ref GTX 680 Boost Load: 1.175v
Ref GTX 690 Idle: 0.987v

It’s interesting to note that the GPU voltages on the GTX 680 and GTX 690 are identical; both idle at 0.987v, and both max out at 1.175v for the top boost bin. It would appear that NVIDIA’s binning process for the GTX 690 looks almost exclusively at leakage; they don’t need chips that operate at a lower voltage, they merely need chips that don’t waste too much power.

NVIDIA has progressively brought down their idle power consumption, and it shows. Where the GTX 590 drew 155W at the wall at idle, we’re drawing 130W with the GTX 690. For a single GPU, NVIDIA’s idle power consumption is every bit as good as AMD’s; however, NVIDIA has no way of shutting off the second GPU like AMD does, meaning that the GTX 690 still draws more power at idle than the 7970CF. Being able to shut off that second GPU really mitigates one of the few remaining disadvantages of a dual-GPU card, and it’s a shame NVIDIA doesn’t have something similar.

Long idle power consumption only amplifies this difference. Now NVIDIA is running two GPUs while AMD is running none, which means the GTX 690 has us pulling 19W more at the wall while doing absolutely nothing.

Thanks to NVIDIA’s binning, the GTX 690’s load power consumption looks very good here. Under Metro we’re drawing 63W less at the wall than with the GTX 680 SLI, even though we’ve already established that performance is within 5%. The gap with the 7970CF is even larger; the 7970CF may hold a performance advantage, but it comes at a cost of 175W more at the wall.

OCCT power is much the same story. Here we’re drawing 429W at the wall, an incredible 87W less than the GTX 680 SLI. In fact the GTX 690 draws less power than a single GTX 580, which is perhaps the single most impressive statistic you’ll see today. Meanwhile, compared to the 7970CF the difference at the wall is 209W. The true strength of multi-GPU cards is their power consumption relative to multiple cards, and because NVIDIA has been able to get the GTX 690 so very close to the GTX 680 SLI, the GTX 690 is absolutely sublime here.
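Putting the OCCT wall figures quoted above in one place, the total-system draws for the other configurations follow directly from the stated deltas (these two sums are reconstructed from the differences given in this section, not separately measured):

```python
# Total system power at the wall under OCCT (watts), per the figures in this review.
occt_wall = {
    "GTX 690": 429,
    "GTX 680 SLI": 429 + 87,   # "87W less than the GTX 680 SLI"
    "7970 CF": 429 + 209,      # "the difference at the wall is 209W"
}

for card, watts in occt_wall.items():
    delta = watts - occt_wall["GTX 690"]
    print(f"{card}: {watts}W (+{delta}W vs. GTX 690)")
```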

Moving on to temperatures, how well does the GTX 690 do? Quite well. As with all dual-GPU cards, GPU temperatures aren’t as good as with single-GPU cards, but they’re also no worse than any other dual-GPU setup. In fact, of all the dual-GPU cards in our benchmark selection this is the coolest, beating even the GTX 590. Kepler’s low power consumption really pays off here.

For load temperatures we’re going to split things up a bit. While our official testing protocol is to test with our video cards directly next to each other when doing multi-card configurations, we’ve gone ahead and tested the GTX 680 SLI both in an adjacent and spaced configuration, with the spaced configuration marked with a *.

When it comes to load temperatures the GTX 690 once again does well for itself. Under Metro it’s warmer than most single-GPU cards, but only barely so: the difference from a GTX 680 is only 3C, 1C from a spaced GTX 680 SLI, and it’s 4C cooler than an adjacent GTX 680 SLI setup. Perhaps more importantly, Metro temperatures are 6C cooler than on the GTX 590.

As for OCCT, the numbers are different but the story is the same. The GTX 690 is 3C warmer than the GTX 680, 1C warmer than a spaced GTX 680 SLI, and 4C cooler than an adjacent GTX 680 SLI. Meanwhile temperatures are now 8C cooler than the GTX 590 and even 6C cooler than the GTX 580.

So the GTX 690 does well with power consumption and temperatures, but is there a noise tradeoff? At idle the answer is no; at 40.9dB it’s effectively as quiet as the GTX 680 and, incredibly enough, over 6dB quieter than the GTX 590. NVIDIA’s progress at idle continues to impress, even if they can’t shut off the second GPU.

When NVIDIA briefed us on the GTX 690 they said the card would be notably quieter than even a GTX 680 SLI, which is quite the claim given how quiet the GTX 680 SLI already is. So of all the tests we have run, this is perhaps the result we’ve been most eager to get to. The results are simply amazing: the GTX 690 is quieter than a GTX 680 SLI whether the cards are adjacent or spaced. The difference with spaced cards is only 0.5dB under Metro, but it’s still a difference. Meanwhile, at 55.1dB the GTX 690 does well against a number of other cards here, effectively tying the 7970 and beating out every other multi-GPU configuration on the board.

OCCT is even more impressive, thanks to a combination of the cooler’s design and the fact that NVIDIA’s power target system effectively serves as a throttle for OCCT. At 55.8dB the GTX 690 is not only just a hair louder than under Metro, but still a hair quieter than a spaced GTX 680 SLI setup. It’s also quieter than a 7970, a GTX 580, and every other multi-GPU configuration we’ve tested. The only cards it’s not quieter than are the GTX 680 and the 6970.
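For context on what these deltas mean, decibels are logarithmic: under the standard 20·log10 sound-pressure relationship, a 6dB drop roughly halves sound pressure (and a ~10dB drop roughly halves perceived loudness). A quick conversion sketch (the function name is our own):

```python
# Sound-pressure ratio implied by a dB(SPL) difference: ratio = 10^(delta/20).
def spl_ratio(delta_db):
    return 10 ** (delta_db / 20)

# The GTX 690 idles over 6dB quieter than the GTX 590:
print(round(spl_ratio(6.0), 2))  # → 2.0, i.e. about half the sound pressure
```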

All things considered, the GTX 690 is not that much quieter than the GTX 590 under gaming loads, but NVIDIA has improved performance just enough that they can beat their own single-GPU cards in SLI. At the same time, the GTX 690 consumes significantly less power for what amounts to a temperature tradeoff of only a couple of degrees. The fact that the GTX 690 can’t quite reach the GTX 680 SLI’s performance may have been disappointing thus far, but after looking at our power, temperature, and noise data, it’s a massive improvement on the GTX 680 SLI for what amounts to a very small gaming performance difference.

199 Comments

  • CeriseCogburn - Thursday, May 10, 2012 - link

    The GTX680 by EVGA in a single sku outsells the combined total sales of the 7870 and 7850 at newegg.
    nVidia "vaporware" sells more units than the proclaimed "best deal" 7000 series amd cards.
    ROFL
    Thanks for not noticing.
  • Invincible10001 - Sunday, May 13, 2012 - link

    Maybe a noob question, but can we expect a mobile version of the 690 on laptops anytime soon?
  • trumpetlicks - Thursday, May 24, 2012 - link

    Compute performance in this case may have to do with 2 things:
    - Amount of memory available for the threaded computational algorithm being run, and
    - the memory IO throughput capability.

    From the rumor-mill, the next NVidia chip may contain 4 GB per chip and a 512 bit bus (which is 2x larger than the GK104).

    If you can't feed the beast as fast as it can eat it, then adding more cores won't increase your overall performance.
  • Joseph Gubbels - Tuesday, May 29, 2012 - link

    I am a new reader and equally new to the subject matter, so sorry if this is a dumb question. The second page mentioned that NVIDIA will be limiting its partners' branding of the cards, and that the first generation of GTX 690 cards are reference boards. Does NVIDIA just make a reference design that other companies use to make their own graphics cards? If not, then why would anyone but NVIDIA have any branding on the cards?
  • Dark0tricks - Saturday, June 02, 2012 - link

    anyone who sides with AMD or NVIDIA are retards - side with yourself as a consumer - buy the best card at the time that is available AND right for your NEEDs.

    fact is the the 690 is trash regardless of whether you are comparing it to a NVIDIA card to a AMD card - if im buying a card like a 690 why the FUCK would i want anything below 1200 P
    even if it is uncommon its a mfing trash of a $1000 card considering:

    $999 GeForce GTX 690
    $499 GeForce GTX 680
    $479 Radeon HD 7970

    and that SLI and CF both beat(or equal) the 690 at higher res's and cost less(by 1$ for NVIDIA but still like srsly wtf NVIDIA !? and 40$ for AMD) ... WHAT !?

    furthermore you guys fighting over bias when the WHOLE mfing GFX community (companies, software developers is built on bias) is utterly ridiculous, GFX vendoers (AMD and NVIDA) have skewed results for games for the last decade + , and software vendors two - there needs to laws against specfically building a software for a particular graphics card in addition to making the software work worse on the other (this applies to both companies)

    hell workstation graphics cards are a very good example of how the industry likes to screw over consumers ( if u ever bios modded - not just soft modded a normal consumer card to a work station card , you would know all that extra charge(up-to 70% extra for the same processor) of a workstation card is BS and if the government cleaned up their shitty policies we the consumer would be better for it)
  • nyran125 - Monday, June 04, 2012 - link

    yep........

    Ultra expensive and Ultra pointless.
  • kitty4427 - Monday, August 20, 2012 - link

    I can't seem to find anything suggesting that the beta has started...
  • trameaa - Friday, March 01, 2013 - link

    I know this is a really old review, and everyone has long since stopped the discussion - but I just couldn't resist posting something after reading through all the comments. Understand, I mean no disrespect to anyone at all by saying this, but it really does seem like a lot of people haven't actually used these cards first hand.

    I see all this discussion of nVidia surround type setups with massive resolutions and it makes me laugh a little. The 690 is obviously an amazing graphics card. I don't have one, but I do use 2x680 in SLI and have for some time now.

    As a general rule, these cards have nowhere near the processing power necessary to run those gigantic screen resolutions with all the settings cranked up to maximum detail, 8xAA, 16xAF, tessellation, etc....

    In fact, my 680 SLI setup can easily be running as low as 35 fps in a game like Metro 2033 with every setting turned up to max - and that is at 1920x1080.

    So, for all those people that think buying a $1000 graphics card means you'll be playing every game out there with every setting turned up to max across three 1920x1200 displays - I promise you, you will not - at least not at a playable frame rate.

    To do that, you'll be realistically looking at 2x$1000 graphics cards, a ridiculous power supply, and by the way you better make sure you have the processing power to push those cards. Your run of the mill i5 gaming rig isn't gonna cut it.
  • Utomo - Friday, October 25, 2013 - link

    More than 1 year since it is announced. I hope new products will be better. My suggestion: 1 Add HDMI, it is standard. 2. consider to allow us to add memory / SSD for better/ faster performance, especially for rendering 3D animation, and other
