Overclocked: Power, Temperature, & Noise

Our final task is our look at GTX 690’s overclocking capabilities. NVIDIA has told us that with GTX 690 they weren’t just looking to duplicate GTX 680 SLI’s performance, but also its overclocking capabilities. This is quite the lofty goal, since with GTX 690 NVIDIA is effectively packing 2 680s into the same amount of space, leaving far less space for VRM circuitry and trace routing.

GeForce 600 Series Overclocking

                               GTX 690    GTX 680
  Shipping Core Clock          915MHz     1006MHz
  Shipping Max Boost Clock     1058MHz    1110MHz
  Shipping Memory Clock        6GHz       6GHz
  Shipping Max Boost Voltage   1.175v     1.175v

  Overclock Core Clock         1040MHz    1106MHz
  Overclock Max Boost Clock    1183MHz    1210MHz
  Overclock Memory Clock       7GHz       6.5GHz
  Overclock Max Boost Voltage  1.175v     1.175v

In practice NVIDIA has in some ways not quite kept up with the GTX 680, and in other ways completely exceeded it. When it comes to the core clock we didn’t quite reach parity with our reference GTX 680; the GTX 680’s highest boost clock bin could hit 1210MHz, while the GTX 690’s highest boost clock bin topped out at 1183MHz, some 27MHz (2%) slower.

On the other hand, our memory overclock is so high as to be within the “this doesn’t seem physically possible” range. As we have discussed time and time again, GDDR5 memory buses are difficult to run at high clocks on a good day, never mind a bad day. With GF110 NVIDIA couldn’t get too far past 4GHz, and even with GTX 680 NVIDIA was only shipping at 6GHz.

It would appear that no one has told NVIDIA’s engineers that 7GHz is supposed to be impossible, and as a result they’ve gone and done the unthinkable. Some of this is certainly down to the luck of the draw, but it doesn’t change the fact that our GTX 690 passed every last stability test we could throw at it at 7GHz. And what makes this particularly interesting is the difference between the GTX 680 and the GTX 690 – both are equipped with 6GHz GDDR5 RAM, but while the GTX 680 is equipped with Hynix the GTX 690 is equipped with Samsung. Perhaps the key to all of this is the Samsung RAM?

In any case, our final result was a +125MHz core clock offset and a +1000MHz memory clock offset, which translates into a base clock of 1040MHz, a max boost clock of 1183MHz, and a memory clock of 7GHz. This represents a 12%-14% core overclock and a 17% memory overclock, which is going to be enough to put quite the pep in the GTX 690’s step.
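Those percentages are straightforward to verify from the shipping clocks in the table above; a quick sketch of the arithmetic:

```python
# Sanity-check the GTX 690 overclock figures quoted above.
# Shipping clocks (MHz) come from the table; offsets are our final result.
base_clock = 915    # shipping base clock
max_boost = 1058    # shipping max boost clock bin
mem_clock = 6000    # shipping memory clock (6GHz effective)

core_offset = 125   # +125MHz core clock offset
mem_offset = 1000   # +1000MHz memory clock offset

oc_base = base_clock + core_offset    # overclocked base clock
oc_boost = max_boost + core_offset    # overclocked max boost clock
oc_mem = mem_clock + mem_offset       # overclocked memory clock

core_oc_pct = (oc_base / base_clock - 1) * 100    # ~13.7%
boost_oc_pct = (oc_boost / max_boost - 1) * 100   # ~11.8%
mem_oc_pct = (oc_mem / mem_clock - 1) * 100       # ~16.7%

print(f"{oc_base}MHz base, {oc_boost}MHz boost, {oc_mem / 1000:.0f}GHz memory")
print(f"core +{boost_oc_pct:.0f}%-{core_oc_pct:.0f}%, memory +{mem_oc_pct:.0f}%")
```

Running the numbers gives 1040MHz base, 1183MHz boost, and 7GHz memory, i.e. a 12%-14% core overclock (depending on whether you measure against the base or boost clock) and a 17% memory overclock.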

As always we’re going to start our look at overclocking in reverse, beginning with power, temperature, and noise. For the purpose of our testing we’ve tested our GTX 690 at two different settings: at stock clocks with the power target set to 135% (GTX 690 PT), and with our custom overclock alongside the same 135% power target (GTX 690 OC). This allows us to look at both full overclocking and the safer option of merely maxing out the boost clocks for all they’re worth.
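For reference, the power target percentage maps onto an absolute board power limit in a simple way. A minimal sketch, assuming the 100% baseline corresponds to the GTX 690’s rated 300W TDP (exactly which reference point NVIDIA’s firmware uses for the power target is our assumption):

```python
# Hypothetical illustration: translate power target percentages into watts,
# assuming the 100% power target corresponds to the GTX 690's rated 300W TDP.
tdp_watts = 300

for target_pct in (100, 135):
    limit = tdp_watts * target_pct / 100
    print(f"{target_pct}% power target -> {limit:.0f}W board power limit")
```

Under that assumption, raising the power target to 135% lifts the throttle point from roughly 300W to roughly 405W of board power, which is why the boost clocks get so much more headroom before the card pulls them back.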

As expected, merely increasing the power target to 135% was enough to increase the GTX 690’s power consumption, though overclocking further adds to that. Even with the power target increase however, the power consumption at the wall for the GTX 690 is still lower than the GTX 680 SLI by over 20W, which is quite impressive. As we’ll see in our section on performance this is more than enough to erase the GTX 690’s performance gap, meaning at this point it’s still consuming less power than the GTX 680 SLI while offering better performance than its dual-card cousin.

It’s only after outright overclocking that we finally see power consumption equalize with the GTX 680 SLI. The overclocked GTX 690 is within 10W of the GTX 680 SLI, though as we’ll see the performance is notably higher.

What does playing with clocks and the power target do to temperatures? The impact isn’t particularly bad, though we’re definitely reaching the highest temperatures we really want to hit. For the GTX 690 PT things are actually quite good under Metro, with the temperature not budging an inch even with the higher power consumption. Under OCCT however temperatures have risen 5C to 87C. Meanwhile the GTX 690 OC reaches 84C under Metro and a toasty 89C under OCCT. These should be safe temperatures, but I would not want to cross 90C for any extended period of time.

Finally we have load noise. Unsurprisingly, because load temperatures did not go up for the GTX 690 PT under Metro, load noise has not gone up either. On the other hand load noise under OCCT has gone up 3.5dB, making the GTX 690 PT just as loud as our GTX 680 SLI in its adjacent configuration. In practice the noise impact from raising the power target is going to trend closer to Metro than OCCT, but Metro is likely an overly optimistic scenario; there’s going to be at least a small increase in noise here.

The GTX 690 OC meanwhile approaches the noise level of the GTX 680 SLI under Metro, and shoots past it under OCCT. Considering the performance payoff some users will no doubt find this worth the noise, but it should be clear that overclocking like this means sacrificing the stock GTX 690’s quietness.


  • james.jwb - Thursday, May 03, 2012 - link

    You are correct, I don't own one... I own three in triple screen. Dell U2412m's.

    I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.
  • CeriseCogburn - Friday, May 04, 2012 - link

    Yes of course you are at a loss, you don't understand a word so why reply ?
    You're all at a loss.
    ROFL
  • yelnatsch517 - Friday, May 04, 2012 - link

    Are you being sarcastic or an idiot?
    From my experience 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all in that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.

    If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.
  • CeriseCogburn - Saturday, May 05, 2012 - link

    There are 242 - count them, well over 200, nearly 250 1920X1080 monitors at the egg.
    _
    In your great experience, there are 16 that fit your 1920X1200 dreampipe FUD scenario at the egg, with most of them, well over half, over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about $10 difference in videocard prices are well under $200 each a lot of the time.
    So now suddenly, you all spend way over $300 to plus $400 for 11% more pixels... ROFL HAHAHHAHHA instead of $150 or $200...
    I guess that's why this place is so biased, the little bloggers are just as whacked when it comes to being honest.
  • InsaneScientist - Saturday, May 05, 2012 - link

    Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.

    I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common; you are quite correct there. However, I must point out that your logic to arrive at that conclusion is faulty:
    You're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units those move.
    If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080, nearly as many of the 1920x1200 will have been sold as the 1920x1080 ones.
    Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion.
    Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.

    The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate amount of people to be using them compared to their current sales.

    Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
  • CeriseCogburn - Tuesday, May 08, 2012 - link

    Blah blah blah blah and I'm still 100% correct and you are not at all.
  • Decembermouse - Tuesday, May 08, 2012 - link

    You're quite a character.
  • anirudhs - Thursday, May 03, 2012 - link

    I use 2 at work - HP ZR24W.
  • piroroadkill - Sunday, May 06, 2012 - link

    Hm, odd.
    Not only do I have a 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. Using one right now.
    Yes, they're rarer than 1080p screens, but this is a site for enthusiasts, so readers are more likely to have them.
  • Ryan Smith - Thursday, May 03, 2012 - link

    The truth is a bit more simple than that. 5760x1200 is because our choice in monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same res + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.
