Overclocking

With the GTX 590 NVIDIA found themselves with a bit of a PR problem. Hardcore overclockers had managed to send their GTX 590s to a flaming death, which made the GTX 590 look bad and required that NVIDIA lock down all voltage control so that no one else could repeat the feat. The GTX 590 was a solid card at stock, but NVIDIA never designed it for overvolting, and indeed I’m not sure you could even say it was designed for overclocking since it was already running at a 365W TDP.

Since that incident NVIDIA has taken a much harder stance on overvolting, which we first saw with the GTX 680. The reference GTX 680 could not be overvolted, with voltage options limited to whatever voltage the top GPU boost bin used (typically 1.175v). This principle carries over to the GTX 690: there will not be any overvolting options.

However this is not to say that the GTX 690 isn’t built for overclocking. The GTX 680 still has some overclocking potential thanks to some purposeful use of design headroom, and the GTX 690 is going to be the same story. In fact it’s much the same story as with AMD’s Radeon HD 5970 and 6990, both of which shipped in configurations that kept power consumption at standard levels while also offering modes that unlocked overclocking potential in exchange for greater power consumption (e.g. AUSUM). As we’ve previously mentioned, the GTX 690 is designed to be able to handle up to 375W even though it ships in a 300W configuration, and that 75W is our overclocking headroom.

NVIDIA will be exposing the GTX 690’s overclocking options through a combination of power targets and clock offsets, just as with the GTX 680. This in turn means that the GTX 690 effectively has two overclocking modes:

  1. Power target overclocking. By just raising the power target (max +35%) you can increase how often the GTX 690 can boost and how frequently it can hit its max boost bin. Raising the power target only increases performance in games/applications that are being held back by NVIDIA’s power limiter, but in return this is easy mode overclocking, as all of the GPU boost bins are already qualified for stability. In other words, this is the GTX 690’s higher performance, higher power 375W mode.
  2. Power target + offset overclocking. By using clock offsets it’s possible to further raise the performance of the GTX 690, and to do so across all games and applications. The lack of overvolting support means that there isn’t a ton of headroom for the offset, but as it stands NVIDIA’s clocks are conservative for power purposes, and Kepler is clearly capable of more than 915MHz/1019MHz. This of course will require testing for stability, and it should be noted that because NVIDIA’s GPU boost bins already go so far above the base clock, it won’t take much to be boosting into 1.2GHz+.
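The arithmetic behind the first mode can be sketched in a few lines. This is a hypothetical helper, not NVIDIA’s API: it just translates a power-target offset (as exposed by tools like EVGA Precision) into an effective board power limit, clamped to what the board’s power delivery is rated for.

```python
def effective_power_limit(base_w: float, target_pct: float, board_max_w: float) -> float:
    """Convert a power-target offset (%) into an effective power limit in watts.

    Hypothetical illustration of the numbers in the text; the actual limiter
    lives in NVIDIA's driver/firmware. The result is clamped to the board's
    rated maximum, since no tool will let the card exceed its power delivery.
    """
    if not 0 <= target_pct <= 35:
        raise ValueError("GTX 690 tools expose at most a +35% power target")
    return min(base_w * (1 + target_pct / 100), board_max_w)

# The GTX 690 ships in a 300W configuration on a board built for 375W,
# so the maximum +35% target lands at the board's 375W ceiling:
print(effective_power_limit(300, 35, 375))  # 375.0
```

Note that 300W × 1.35 is 405W on paper; the clamp is what makes +35% the card’s “375W mode” rather than something beyond the board’s rating.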

NVIDIA’s goal with the GTX 690 was not just to reach GTX 680 SLI performance, but also to match the GTX 680’s overclocking capabilities. We’ll get to our full results in our overclocking performance section, but for the time being we’ll leave it at this: we hit 1040MHz base, 1183MHz boost, and 7GHz memory on our GTX 690; even without overvolting it’s a capable overclocker.


  • Makaveli - Thursday, May 3, 2012 - link

    Some of us don't buy 16:9 monitors or TN panels!

    I want results at 1920x1200 and other 16:10 resolutions. You can shut up about your AMD bias, which you have no proof of other than your flawed logic.

  • CeriseCogburn - Thursday, May 3, 2012 - link

    Then you don't buy much. 1920x1200 is a very rare monitor.
  • Parhel - Thursday, May 3, 2012 - link

    1920x1200 was very common for several years. Until a few years ago, they were much more common than 1920x1080. I even have an old laptop that's 1920x1200. Looking at what's available to buy new, today, doesn't tell the whole story. Because people don't replace their monitors every day.

    Anandtech has always recommended spending up and getting a quality monitor. You see it in nearly every review. So, I think the readers here are more likely than the average guy on the street to own less common screens. I've had the same 2560x1600 monitor through 3 computers now, and I spent more on it than I've ever spent on any computer.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Yes, you're all super premium monitor buyers, and moments ago you were hollering the videocards are way too expensive and you cannot possibly afford them unless you are an idiot with too much money.
    I love this place, the people are so wonderfully honest.
  • Makaveli - Thursday, May 3, 2012 - link

    1920x1200 is only rare now. I've gone through enough monitors to know what I like, and cheap 16:9 TN panels are not it. If that's good enough for you, then enjoy.

    As for your other comment about v-sync and 4xAA: guess what, some of us don't care to have 8xAA and 16xAF running all the time.

    I would rather play at 1200p at high settings with AA and AF off if it means playable fps and an enjoyable experience. This isn't [H]; I'm not gonna spend $1000 on a GPU so I can meet your approved settings for playing games, dude. Get a clue!
  • CeriseCogburn - Saturday, May 5, 2012 - link

    But you'll spend well over $400 for 11% more monitor pixels because "you'd rather".. "all of a sudden".
    LOL
    Way to go, thanks for helping me.
  • anirudhs - Thursday, May 3, 2012 - link

    No...I couldn't afford one but I very much wanted to buy one. It is much prettier than 16:9 for workstation purposes. New ones are being released all the time. You just have to pay more, but it's worth it.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Oh, so someone who almost wants to be honest.
    So isn't it absolutely true that a $500 videocard is much easier to buy when your monitor doesn't cost half that much, let alone twice that much or $2,000 plus?
    You don't need to answer. We all know the truth.
    Everyone in this thread would take a single videocard 680 or 7970 and a 1080P panel for under $200 before they'd buy a $450 1200P monitor and forfeit the 680 or 7970 for a $200 videocard instead.
    It's absolutely clear, no matter the protestations.
    In fact if they did otherwise, they would be so dumb, they would fit right in. Oh look at that, why maybe they are that foolish.
  • InsaneScientist - Saturday, May 5, 2012 - link

    Oh? A little over a year ago, I had some money for an upgrade and I wanted to upgrade either my monitor or my video card.
    Now, I have (and play) Crysis, which can only now, just barely, be handled by a single card, so obviously I could have used the GPU upgrade (still can, for that matter). I also had a decent (though not great) 22" 1920x1200 monitor.

    However, despite that, I chose to buy a new monitor, and bought a used 3008WFP (30" 2560x1600). I have not regretted that decision one bit, and that was a lot more money than your $200-300 upsell for 1920x1200.
    Now, admittedly, there were other factors that were a consideration, but even without those, I would have made the same decision. Putting money into a good monitor which I'll use ALL the time I'm on the computer vs. putting money into a good video card that I'll use some of the time is a no-brainer for me.
    If all of my electronics were taken and I were starting from scratch, I'd get another 2560x1600 monitor before I even bought a video card. I'd suffer through the integrated IGP as long as I needed.

    Now, that's my choice, and everyone's needs are different, so I wouldn't demand that you make the same decision I did, but, by the same token, you shouldn't be expecting everyone to be following the same needs that you have. ;)
  • CeriseCogburn - Sunday, May 6, 2012 - link

    You've jumped from 1920 to 2560 so who cares, not even close.
    In your case you got no video card. ROFL - further proving my point, and disproving everyone else's, who screamed that if you get this card you have another two grand for monitors as well - which everyone here knows isn't true.

    I never demanded anyone follow any needs, let alone mine which are unknown to you despite your imaginary lifestyle readings, and obverse to the sudden flooding of monitor fanboys and the accompanying lies.
