Overclocking

With the GTX 590 NVIDIA found themselves with a bit of a PR problem. Hardcore overclockers had managed to send their GTX 590s to a flaming death, which made the GTX 590 look bad and required that NVIDIA lock down all voltage control so that no one else could repeat the feat. The GTX 590 was a solid card at stock, but NVIDIA never designed it for overvolting, and indeed I’m not sure you could even say it was designed for overclocking since it was already running at a 365W TDP.

Since that incident NVIDIA has taken a much harder stance on overvolting, which we first saw with the GTX 680. The reference GTX 680 could not be overvolted, with voltage options limited to whatever voltage the top GPU boost bin used (typically 1.175v). This principle will be continuing with the GTX 690; there will not be any overvolting options.

However, this is not to say that the GTX 690 isn’t built for overclocking. The GTX 680 still has some overclocking potential thanks to purposeful use of design headroom, and the GTX 690 is going to be the same story. In fact it’s much the same story as with AMD’s Radeon HD 5970 and 6990, both of which shipped in configurations that kept power consumption at standard levels while also offering modes that unlocked overclocking potential in exchange for greater power consumption (e.g. AUSUM). As we’ve previously mentioned, the GTX 690 is designed to be able to handle up to 375W even though it ships in a 300W configuration, and that 75W is our overclocking headroom.

NVIDIA will be exposing the GTX 690’s overclocking options through a combination of power targets and clock offsets, just as with the GTX 680. This in turn means that the GTX 690 effectively has two overclocking modes:

  1. Power target overclocking. By just raising the power target (max +35%) you can increase how often the GTX 690 can boost and how frequently it can hit its max boost bin. Raising the power target only improves performance in games/applications that are being held back by NVIDIA’s power limiter, but in return this is easy-mode overclocking, as all of the GPU boost bins are already qualified for stability. In other words, this is the GTX 690’s higher performance, higher power 375W mode.
  2. Power target + offset overclocking. By using clock offsets it’s possible to further raise the performance of the GTX 690, and to do so across all games and applications. The lack of overvolting support means there isn’t a ton of headroom for the offset, but as it stands NVIDIA’s clocks are conservative for power purposes, and Kepler is clearly capable of more than 915MHz/1019MHz. This will of course require testing for stability, and it should be noted that because NVIDIA’s GPU boost bins already go so far above the base clock, it won’t take much to be boosting into 1.2GHz+ (see the conceptual sketch below this list).

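To make the interaction between these two knobs more concrete, here is a minimal conceptual sketch in Python. It is not NVIDIA’s actual GPU Boost algorithm: the boost table, the per-bin power estimates, and the simple rule of picking the highest bin that fits under the power limit are all hypothetical stand-ins, and real hardware also weighs temperature and voltage when choosing a bin.

    # Conceptual illustration of the two overclocking modes described above.
    # All boost bins and power figures are hypothetical, not NVIDIA's real tables.
    from dataclasses import dataclass

    @dataclass
    class BoostBin:
        clock_mhz: int      # core clock for this bin
        est_power_w: float  # estimated board power needed to sustain the bin

    BOOST_TABLE = [
        BoostBin(915, 285.0),   # base clock, always sustainable at stock
        BoostBin(967, 295.0),
        BoostBin(1019, 300.0),  # advertised boost clock
        BoostBin(1058, 340.0),
        BoostBin(1110, 370.0),
    ]

    def effective_clock(power_limit_w: float, offset_mhz: int = 0) -> int:
        """Pick the highest bin that fits under the power limit, then apply the offset.

        In practice the limit is exposed as a percentage power target (up to +35%)
        in tools such as EVGA Precision X rather than as an absolute wattage.
        """
        usable = [b for b in BOOST_TABLE if b.est_power_w <= power_limit_w]
        best = max(usable, key=lambda b: b.clock_mhz)
        return best.clock_mhz + offset_mhz

    print(effective_clock(300))        # stock 300W configuration -> 1019MHz
    print(effective_clock(375))        # mode 1: raised power limit -> 1110MHz
    print(effective_clock(375, 100))   # mode 2: limit + offset -> 1210MHz

The point of this toy model is simply that raising the power limit only changes which existing, pre-qualified bins are reachable, while an offset shifts every bin upward, which is why only the second mode requires fresh stability testing.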
NVIDIA’s goal with the GTX 690 was not just to reach GTX 680 SLI performance, but also to match the GTX 680’s overclocking capabilities. We’ll get to our full results in our overclocking performance section, but for the time being we’ll leave it at this: we hit 1040MHz base, 1183MHz boost, and 7GHz memory on our GTX 690; even without overvolting it’s a capable overclocker.
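For reference, a quick back-of-the-envelope calculation puts that overclock in percentage terms; note that the 6008MHz stock memory data rate used below is an assumption, as the stock memory clock isn’t restated in this section.

    # Percentage gains of the quoted overclock over stock clocks.
    # The 6008MHz stock memory data rate is assumed, not taken from this page.
    stock = {"base": 915, "boost": 1019, "memory": 6008}
    oc    = {"base": 1040, "boost": 1183, "memory": 7000}

    for domain in stock:
        gain = (oc[domain] / stock[domain] - 1) * 100
        print(f"{domain}: {stock[domain]}MHz -> {oc[domain]}MHz (+{gain:.1f}%)")
    # base:   915MHz -> 1040MHz (+13.7%)
    # boost:  1019MHz -> 1183MHz (+16.1%)
    # memory: 6008MHz -> 7000MHz (+16.5%)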

Comments

  • theSeb - Thursday, May 3, 2012 - link

    I must say I found it quite odd and hilarious to see people accusing Anandtech of favouring AMD by using a monitor with a 1200 vertical resolution. 16:10 monitors are not that uncommon and we really should be showing the industry what we think by not purchasing 16:9 monitors.

    Anyway, if anything this review seems to be Nvidia biased, in my opinion. The 7970 CF does not do too badly; in fact it beats the 690 / 680 CF in many games and only loses out in the games where it's "broken". I am not sure why you cannot recommend it based on the numbers in your benchmarks since it hardly embarrasses itself.
  • silverblue - Thursday, May 3, 2012 - link

    It's not "people", it's "person"... and he's only here to troll graphics card articles.

    When AMD gets it right, CrossFire is absolutely blistering. Unfortunately, the sad state of affairs is that AMD isn't getting it right with a good proportion of the games in this review.

    NVIDIA may not get quite as high scaling as AMD when AMD does get it right, but they're just far more consistent at providing good performance. This is the main gripe about AMD; with a few more resources devoted to the project, surely they can overcome this?
  • CeriseCogburn - Friday, May 4, 2012 - link

    Yes, of course, call names forever, but never dispute the facts.
    I will agree with you though, amd drivers suck especially in CF, and they suck for a lot of games for a long long time.
  • silverblue - Friday, May 4, 2012 - link

    No, I said AMD's drivers have issues with Crossfire, not that they suck in general.

    I've also checked three random British websites and there are no issues whatsoever in finding a 1920x1200 monitor. I also looked at NewEgg and found eight immediately. It's really not difficult to find one.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    1920x1200 all of you protesteth far too much.
    The cat is out of the bag and you won't be putting it back in.
    Enjoy the bias, you obviously do, and leave me alone, stop the stalking.
  • seapeople - Saturday, May 5, 2012 - link

    I'm with ya bro. Forget these high resolution monitor nancy's who don't know what they're missing. I'm rockin' games just fine with 60+ fps on my 720p plasma tv, and that's at 600hz! Just you try to get 24xAAAA in 3D (that's 1200hz total) on that 1920x1200 monitor of yours!

    Framerate fanboys unite!
  • CeriseCogburn - Sunday, May 6, 2012 - link

    Ahh, upped the ante to plasma monitors ? ROFL - desperation of you people knows no bounds.
  • saf227 - Thursday, May 3, 2012 - link

    On page 2 of the review - where you have all the pictures of the card - we have no real basis for figuring out the card's true size. Could you include a reference in one of those photos? Say, a ruler or a pencil or something, so we have an idea what the size of the card truly is?
  • Ryan Smith - Thursday, May 3, 2012 - link

    The card is 10" long, the same length as the GTX 590 (that should be listed on page 2). But I'll take that under consideration for future articles.
  • ueharaf - Thursday, May 3, 2012 - link

    Why did they go back to 256 bits when the GTX 590 had 384 bits?!?!
    Is it because they don't want it to have too much of an advantage?
    Maybe the next GTX 790 will have 384 bits again and be better than the GTX 690... come on!!!
