Overclocking

With the GTX 590 NVIDIA found themselves with a bit of a PR problem. Hardcore overclockers had managed to send their GTX 590s to a flaming death, which made the GTX 590 look bad and required that NVIDIA lock down all voltage control so that no one else could repeat the feat. The GTX 590 was a solid card at stock, but NVIDIA never designed it for overvolting, and indeed I’m not sure you could even say it was designed for overclocking since it was already running at a 365W TDP.

Since that incident NVIDIA has taken a much harder stance on overvolting, which we first saw with the GTX 680. The reference GTX 680 could not be overvolted, with voltage options limited to whatever voltage the top GPU boost bin used (typically 1.175v). This policy continues with the GTX 690: there will not be any overvolting options.

However this is not to say that the GTX 690 isn’t built for overclocking. The GTX 680 still has some overclocking potential thanks to some purposeful use of design headroom, and the GTX 690 is going to be the same story. In fact it’s much the same story as with AMD’s Radeon HD 5970 and 6990, both of which shipped in configurations that kept power consumption at standard levels while also offering modes that unlocked overclocking potential in exchange for greater power consumption (e.g. AUSUM). As we’ve previously mentioned, the GTX 690 is designed to be able to handle up to 375W even though it ships in a 300W configuration, and that 75W is our overclocking headroom.

NVIDIA will be exposing the GTX 690’s overclocking options through a combination of power targets and clock offsets, just as with the GTX 680. This in turn means that the GTX 690 effectively has two overclocking modes:

  1. Power target overclocking. By just raising the power target (max +35%) you can increase how often the GTX 690 can boost and how frequently it hits its max boost bin. Raising the power target only improves performance in games/applications that are being held back by NVIDIA’s power limiter, but in return this is easy mode overclocking, as all of the GPU boost bins are already qualified for stability. In other words, this is the GTX 690’s higher performance, higher power 375W mode.
  2. Power target + offset overclocking. By using clock offsets it’s possible to further raise the performance of the GTX 690, and to do so across all games and applications. The lack of overvolting support means that there isn’t a ton of headroom for the offset, but as it stands NVIDIA’s clocks are conservative for power purposes, and Kepler is clearly capable of more than 915MHz/1019MHz. This will of course require testing for stability, and it should be noted that because NVIDIA’s GPU boost bins already go so far above the base clock, it won’t take much of an offset to be boosting past 1.2GHz.
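
The interaction between a power target and GPU Boost bins can be illustrated with a toy model. Only the 915MHz base and 1019MHz boost clocks come from the article; the bin step size and per-bin power cost below are illustrative assumptions, not NVIDIA’s actual boost tables:

```python
# Toy model: a raised power target lets GPU Boost sustain higher bins.
# Bin granularity and per-bin power cost are illustrative assumptions.

BASE_CLOCK = 915   # MHz, GTX 690 base clock
MAX_BOOST = 1019   # MHz, top boost bin
BIN_STEP = 13      # MHz per boost bin (assumed granularity)
WATTS_PER_BIN = 5.0  # assumed power cost of each additional bin

def boost_clock(power_draw_watts, power_target_watts):
    """Return the highest boost bin whose estimated draw stays under the target."""
    clock = BASE_CLOCK
    draw = power_draw_watts
    while (clock + BIN_STEP <= MAX_BOOST
           and draw + WATTS_PER_BIN <= power_target_watts):
        clock += BIN_STEP
        draw += WATTS_PER_BIN
    return clock

# Same workload at the stock 300W target vs. the raised 375W target:
print(boost_clock(280, 300))  # power limiter caps the boost bin
print(boost_clock(280, 375))  # +75W of headroom allows the full boost range
```

The point of the model is the first mode described above: raising the target doesn’t change any clocks directly, it just stops the power limiter from pulling the card out of its higher (already-qualified) boost bins.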

NVIDIA’s goal with the GTX 690 was not just to reach GTX 680 SLI performance, but also to match the GTX 680’s overclocking capabilities. We’ll get to our full results in our overclocking performance section, but for the time being we’ll leave it at this: we hit 1040MHz base, 1183MHz boost, and 7GHz memory on our GTX 690; even without overvolting it’s a capable overclocker.
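In percentage terms those gains are easy to work out; note that the 6008MHz stock memory clock used below is our assumption for the GTX 690’s shipping memory speed, as it isn’t stated above:

```python
# Overclock gains relative to stock clocks (all values in MHz).
# Stock base/boost clocks are from the article; the 6008MHz stock
# memory clock is an assumption, not stated in the text.
clocks = [
    ("base", 915, 1040),
    ("boost", 1019, 1183),
    ("memory", 6008, 7000),
]

for name, stock, oc in clocks:
    gain = (oc / stock - 1) * 100
    print(f"{name}: {stock} -> {oc} MHz (+{gain:.1f}%)")
```

That works out to roughly a 14% base clock gain and 16% on both the boost and memory clocks, all without touching voltage.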


  • CeriseCogburn - Saturday, May 5, 2012 - link

    I'm certain they would pay none of you since not a single one can be honest nor has a single argument to counter my points.
    You're all down to name calling trolls - and you all have to face the facts now, that your clueless ignorance left out of your minds for some time.
    Have fun buying your cheap 1080P panels and slow and cheapo amd cards - LOL
    Oh sorry, you all now buy premium flat panels...
  • CeriseCogburn - Sunday, May 6, 2012 - link

    No actually I expected a lot more from the people here.
    I expected a big thank you, or a thanks for the information we'll keep that in mind and it helps our purchasing decisions.
    Instead we got a flood of raging new monitor owners and haters and name callers.
    Next time just thanking me for providing very pertinent information would be the right thing to do, but at this point I don't expect any of you to ever do the right thing.
  • UltraTech79 - Thursday, May 3, 2012 - link

    Never seen a triple screen setup before?
  • tipoo - Thursday, May 3, 2012 - link

    I'm curious why the 680 and 690 trail AMD cards in Crysis and Metro, seeing as those seem to be the most GPU intensive games, while they win in most other tests. Would it be shading performance or something else?

    My mind is pretty blown that we have cards that can run Crysis and Metro at 5760x1200 at very comfortable framerates now, that's insane. But barring that resolution or 2560 for some games, I'm sure most of us don't see the appeal here; it will be sold in a very, very small niche. For normal monitor resolutions, I doubt games in large quantities will get much more demanding until we have new consoles out.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Oh, wow, they also are so biased toward amd they removed the actual most demanding game, Shogun 2, Total War, because I kept pointing out how the Nvidia 680's swept that game across the board - so now it's gone !
    ROFL
    (Before you attack me I note the anand reviewer stated S2TW is the most demanding, it's right in the reviews here - but not this one.)
  • Ryan Smith - Thursday, May 3, 2012 - link

    Um, it's there. Page 8.
  • Sabresiberian - Thursday, May 3, 2012 - link

    LOL.

    Cerise, epic fail!

    ;)
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Oh I see it was added because the patch broke the Nvidia cards - but in amd's favor again, the tester kept the breaking patch in, instead of providing results.
    Wow, more amd bias.
    Glad my epic fails are so productive. :-)
    U still mad ? Or madder and raging out of control?
  • silverblue - Thursday, May 3, 2012 - link

    So, if they failed to add it, it'd have been AMD bias, but considering they DID add it... it's AMD bias.

    And you're the one talking about rage, trollboi?

    Had you just merely mentioned that the patch doesn't provide favourable results for NVIDIA cards, Ryan might have been tempted to reinstall the game and retest. Well, he might have - can't speak for the guy. Doubt he will now, though.
  • tipoo - Thursday, May 3, 2012 - link

    So back on non-trolling topic...?
