Overclocked: Power, Temperature, & Noise

Our final task is our look at GTX 690’s overclocking capabilities. NVIDIA has told us that with GTX 690 they weren’t just looking to duplicate GTX 680 SLI’s performance, but also its overclocking capabilities. This is quite the lofty goal, since with GTX 690 NVIDIA is effectively packing two GTX 680s into the same amount of space, leaving far less room for VRM circuitry and trace routing.

GeForce 600 Series Overclocking

|                             | GTX 690 | GTX 680 |
|-----------------------------|---------|---------|
| Shipping Core Clock         | 915MHz  | 1006MHz |
| Shipping Max Boost Clock    | 1058MHz | 1110MHz |
| Shipping Memory Clock       | 6GHz    | 6GHz    |
| Shipping Max Boost Voltage  | 1.175v  | 1.175v  |
| Overclock Core Clock        | 1040MHz | 1106MHz |
| Overclock Max Boost Clock   | 1183MHz | 1210MHz |
| Overclock Memory Clock      | 7GHz    | 6.5GHz  |
| Overclock Max Boost Voltage | 1.175v  | 1.175v  |

In practice NVIDIA has in some ways not quite kept up with the GTX 680, and in other ways completely exceeded it. When it comes to the core clock we didn’t quite reach parity with our reference GTX 680; the GTX 680’s highest boost clock bin could hit 1210MHz, while the GTX 690’s highest boost clock bin topped out at 1183MHz, some 27MHz (2%) slower.

On the other hand, our memory overclock is so high as to be within the “this doesn’t seem physically possible” range. As we have discussed time and time again, GDDR5 memory buses are difficult to run at high clocks on a good day, never mind a bad day. With GF110 NVIDIA couldn’t get too far past 4GHz, and even with GTX 680 NVIDIA was only shipping at 6GHz.

It would appear that no one has told NVIDIA’s engineers that 7GHz is supposed to be impossible, and as a result they’ve gone and done the unthinkable. Some of this is certainly down to the luck of the draw, but it doesn’t change the fact that our GTX 690 passed every last stability test we could throw at it at 7GHz. And what makes this particularly interesting is the difference between the GTX 680 and the GTX 690 – both are equipped with 6GHz GDDR5 RAM, but while the GTX 680 is equipped with Hynix memory, the GTX 690 is equipped with Samsung. Perhaps the key to all of this is the Samsung RAM?
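As a point of reference, the “6GHz” and “7GHz” figures quoted for GDDR5 are effective per-pin data rates rather than true clock frequencies; the command clock runs at a quarter of that rate and the write clock at half. The quick Python sketch below (purely illustrative, not part of our test tooling) spells out what those data rates work out to in actual clock terms.

```python
# Illustrative sketch: how GDDR5's quoted "clock" (the effective per-pin
# data rate) maps to its underlying clocks. Data is transferred at four
# times the command clock (CK), with the write clock (WCK) at twice CK.

def gddr5_clocks(effective_gbps):
    """Return (command clock, write clock) in GHz for an effective
    GDDR5 data rate given in Gbps per pin."""
    return effective_gbps / 4, effective_gbps / 2

for rate in (6.0, 7.0):  # stock 6GHz vs. our 7GHz overclock
    ck, wck = gddr5_clocks(rate)
    print(f"{rate:.0f}GHz effective -> CK {ck:.2f}GHz, WCK {wck:.2f}GHz")
```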

In any case, our final result was a +125MHz core clock offset and a +1000MHz memory clock offset, which translates into a base clock of 1040MHz, a max boost clock of 1183MHz, and a memory clock of 7GHz. This represents a 12%-14% core overclock and a 17% memory overclock, which is going to be enough to put quite the pep in the GTX 690’s step.
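To make the arithmetic explicit, here’s a minimal Python sketch (values pulled straight from the table above) showing how the clock offsets we applied translate into the final clocks and the percentage overclocks quoted above.

```python
# Minimal sketch: deriving the GTX 690's overclocked speeds from the
# offsets we applied. All starting values come from the table above.
SHIPPING_BASE_MHZ = 915      # shipping core (base) clock
SHIPPING_BOOST_MHZ = 1058    # shipping max boost clock
SHIPPING_MEMORY_MHZ = 6000   # shipping memory clock (effective data rate)

CORE_OFFSET_MHZ = 125        # +125MHz core clock offset
MEMORY_OFFSET_MHZ = 1000     # +1000MHz memory clock offset

oc_base = SHIPPING_BASE_MHZ + CORE_OFFSET_MHZ        # 1040MHz
oc_boost = SHIPPING_BOOST_MHZ + CORE_OFFSET_MHZ      # 1183MHz
oc_memory = SHIPPING_MEMORY_MHZ + MEMORY_OFFSET_MHZ  # 7000MHz (7GHz)

print(f"Base:   {oc_base}MHz (+{100 * CORE_OFFSET_MHZ / SHIPPING_BASE_MHZ:.0f}%)")
print(f"Boost:  {oc_boost}MHz (+{100 * CORE_OFFSET_MHZ / SHIPPING_BOOST_MHZ:.0f}%)")
print(f"Memory: {oc_memory}MHz (+{100 * MEMORY_OFFSET_MHZ / SHIPPING_MEMORY_MHZ:.0f}%)")
```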

As always we’re going to start our look at overclocking in reverse, beginning with power, temperature, and noise. For the purpose of our testing we’ve tested our GTX 690 at two different settings: at stock clocks with the power target set to 135% (GTX 690 PT), and with our custom overclock alongside the same 135% power target (GTX 690 OC). This allows us to look at both full overclocking and the safer option of merely maxing out the boost clocks for all they’re worth.

As expected, merely increasing the power target to 135% was enough to increase the GTX 690’s power consumption, though overclocking further adds to that. Even with the power target increase, however, power consumption at the wall for the GTX 690 is still more than 20W lower than the GTX 680 SLI, which is quite impressive. As we’ll see in our section on performance this is more than enough to erase the GTX 690’s performance gap, meaning at this point it’s still consuming less power than the GTX 680 SLI while offering better performance than its dual-card cousin.

It’s only after outright overclocking that we finally see power consumption equalize with the GTX 680 SLI. The overclocked GTX 690 is within 10W of the GTX 680 SLI, though as we’ll see the performance is notably higher.

What does playing with clocks and the power target do to temperatures? The impact isn’t particularly bad, though we’re definitely reaching the highest temperatures we really want to hit. For the GTX 690 PT things are actually quite good under Metro, with the temperature not budging an inch even with the higher power consumption. Under OCCT however temperatures have risen 5C to 87C. Meanwhile the GTX 690 OC reaches 84C under Metro and a toasty 89C under OCCT. These should be safe temperatures, but I would not want to cross 90C for any extended period of time.

Finally we have load noise. Unsurprisingly, because load temperatures did not go up for the GTX 690 PT under Metro, load noise has not gone up either. On the other hand load noise under OCCT has gone up 3.5dB, making the GTX 690 PT just as loud as our GTX 680 SLI in its adjacent configuration. In practice the noise impact from raising the power target is going to trend closer to Metro than OCCT, but Metro is likely an overly optimistic scenario; there’s going to be at least a small increase in noise here.

The GTX 690 OC meanwhile approaches the noise level of the GTX 680 SLI under Metro, and shoots past it under OCCT. Considering the performance payoff some users will no doubt find this worth the noise, but it should be clear that overclocking like this means sacrificing the stock GTX 690’s quietness.

Comments

  • Makaveli - Thursday, May 3, 2012 - link

    Some of us don't buy 16:9 monitors or TN panels!

    I want results at 1920x1200 and other 16:10 resolutions. You can shut up with your AMD bias, which you have no proof of other than your flawed logic.

  • CeriseCogburn - Thursday, May 3, 2012 - link

    Then you don't buy much. 1920x1200 is a very rare monitor.
  • Parhel - Thursday, May 3, 2012 - link

    1920x1200 was very common for several years. Until a few years ago, they were much more common than 1920x1080. I even have an old laptop that's 1920x1200. Looking at what's available to buy new, today, doesn't tell the whole story. Because people don't replace their monitors every day.

    Anandtech has always recommended spending up and getting a quality monitor. You see it in nearly every review. So, I think the readers here are more likely than the average guy on the street to own less common screens. I've had the same 2560x1600 monitor through 3 computers now, and I spent more on it than I've ever spent on any computer.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Yes, you're all super premium monitor buyers, and moments ago you were hollering the videocards are way too expensive and you cannot possibly afford them unless you are an idiot with too much money.
    I love this place, the people are so wonderfully honest.
  • Makaveli - Thursday, May 3, 2012 - link

    1920x1200 is only rare now. I've gone through enough monitors to know what I like, and cheap 16:9 TN panels are not it. If that's good enough for you then enjoy.

    As for your other comment about v-sync and 4xAA: guess what, some of us don't care to have 8x AA and 16x AF running all the time.

    I would rather play at 1200p at high settings with AA and AF off if it means playable fps and an enjoyable experience. This isn't [H]; I'm not gonna spend $1000 on a GPU so I can meet your approved settings for playing games, dude. Get a clue!
  • CeriseCogburn - Saturday, May 5, 2012 - link

    But you'll spend well over $400 for 11% more monitor pixels because "you'd rather".. "all of a sudden".
    LOL
    Way to go, thanks for helping me.
  • anirudhs - Thursday, May 3, 2012 - link

    No...I couldn't afford one but I very much wanted to buy one. It is much prettier than 16:9 for workstation purposes. New ones are being released all the time. You just have to pay more, but it's worth it.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Oh, so someone who almost wants to be honest.
    So isn't it absolutely true a $500 videocard is much easier to buy when your monitor doesn't cost half that much, let alone twice that much or $2,000 plus?
    You don't need to answer. We all know the truth.
    Everyone in this thread would take a single videocard 680 or 7970 and a 1080P panel for under $200 before they'd buy a $450 1200P monitor and forfeit the 680 or 7970 for a $200 videocard instead.
    It's absolutely clear, no matter the protestations.
    In fact if they did otherwise, they would be so dumb, they would fit right in. Oh look at that, why maybe they are that foolish.
  • InsaneScientist - Saturday, May 5, 2012 - link

    Oh? A little over a year ago, I had some money for an upgrade and I wanted to upgrade either my monitor or my video card.
    Now, I have (and play) Crysis, which can only now, just barely, be handled by a single card, so obviously I could have used the GPU upgrade (still can, for that matter). I also had a decent (though not great) 22" 1920x1200 monitor.

    However, despite that, I chose to buy a new monitor, and bought a used 3008WFP (30" 2560x1600). I have not regretted that decision one bit, and that was a lot more money than your $200-300 upsell for 1920x1200.
    Now, admittedly, there were other factors that were a consideration, but even without those, I would have made the same decision. Putting money into a good monitor which I'll use ALL the time I'm on the computer vs. putting money into a good video card that I'll use some of the time is a no-brainer for me.
    If all of my electronics were taken and I were starting from scratch, I'd get another 2560x1600 monitor before I even bought a video card. I'd suffer through the integrated IGP as long as I needed.

    Now, that's my choice, and everyone's needs are different, so I wouldn't demand that you make the same decision I did, but, by the same token, you shouldn't be expecting everyone to be following the same needs that you have. ;)
  • CeriseCogburn - Sunday, May 6, 2012 - link

    You've jumped from 1920 to 2560 so who cares, not even close.
    In your case you got no video card. ROFL - further proving my point, and disproving everyone else who screamed if you get this card you have another two grand for monitors as well - which everyone here knows isn't true.

    I never demanded anyone follow any needs, let alone mine which are unknown to you despite your imaginary lifestyle readings, and obverse to the sudden flooding of monitor fanboys and the accompanying lies.
