Overclocking

With the GTX 590, NVIDIA found themselves with a bit of a PR problem. Hardcore overclockers had managed to send their GTX 590s to a flaming death, which made the GTX 590 look bad and required that NVIDIA lock down all voltage control so that no one else could repeat the feat. The GTX 590 was a solid card at stock, but NVIDIA never designed it for overvolting, and indeed I’m not sure you could even say it was designed for overclocking, since it was already running at a 365W TDP.

Since that incident NVIDIA has taken a much harder stance on overvolting, which we first saw with the GTX 680. The reference GTX 680 could not be overvolted, with voltage options limited to whatever voltage the top GPU boost bin used (typically 1.175v). That principle continues with the GTX 690: there will not be any overvolting options.

However this is not to say that the GTX 690 isn’t built for overclocking. The GTX 680 still has some overclocking potential thanks to some purposeful use of design headroom, and the GTX 690 follows suit. In fact it’s much the same story as with AMD’s Radeon HD 5970 and 6990, both of which shipped in configurations that kept power consumption at standard levels while also offering modes that unlocked overclocking potential in exchange for greater power consumption (e.g. the 6990’s AUSUM switch). As we’ve previously mentioned, the GTX 690 is designed to handle up to 375W even though it ships in a 300W configuration, and that 75W is our overclocking headroom.

NVIDIA will be exposing the GTX 690’s overclocking options through a combination of power targets and clock offsets, just as with the GTX 680. This in turn means that the GTX 690 effectively has two overclocking modes:

  1. Power target overclocking. By just raising the power target (max +35%) you can increase how often the GTX 690 can boost and how frequently it can hit its max boost bin. Raising the power target only improves performance in games/applications that are being held back by NVIDIA’s power limiter, but in return this is easy mode overclocking, as all of the GPU boost bins are already qualified for stability. In other words, this is the GTX 690’s higher performance, higher power 375W mode.
  2. Power target + offset overclocking. By using clock offsets it’s possible to further raise the performance of the GTX 690, and to do so across all games and applications. The lack of overvolting support means that there isn’t a ton of headroom for the offset, but as it stands NVIDIA’s clocks are conservative for power purposes, and Kepler is clearly capable of more than 915MHz/1019MHz. This will of course require testing for stability, but because NVIDIA’s GPU boost bins already go so far over the base clock, it won’t take much to be boosting at 1.2GHz+. (A rough sketch of how these two knobs interact follows this list.)
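
To make the interplay between the two knobs a bit more concrete, here is a rough sketch in Python. The 13MHz boost-bin step and the use of the 300W shipping figure as the power-target baseline are illustrative assumptions on our part rather than published specifications.

```python
# Back-of-the-envelope model of the GTX 690's two overclocking knobs.
# Assumptions (not published specs): the power target scales the 300W
# shipping figure, and GPU Boost steps in ~13MHz bins above the base clock.

BASE_CLOCK_MHZ = 915    # stock base clock
TOP_BOOST_MHZ = 1019    # typical top boost bin at stock
STOCK_POWER_W = 300     # shipping configuration
BOARD_LIMIT_W = 375     # what the board is built to handle
BIN_STEP_MHZ = 13       # assumed boost-bin granularity

def estimate(power_target_pct=0, offset_mhz=0):
    """Estimate the power budget and clock range for a given power target/offset."""
    # Mode 1: a higher power target lets the card spend more time in its upper
    # boost bins, but never past the board's 375W design limit.
    budget = min(STOCK_POWER_W * (1 + power_target_pct / 100), BOARD_LIMIT_W)

    # Mode 2: a clock offset shifts the whole bin ladder upward, so the base
    # clock and every boost bin move together.
    bins_above_base = (TOP_BOOST_MHZ - BASE_CLOCK_MHZ) // BIN_STEP_MHZ  # 8 at stock
    base = BASE_CLOCK_MHZ + offset_mhz
    boost = base + bins_above_base * BIN_STEP_MHZ
    return budget, base, boost

print(estimate(power_target_pct=35))                  # (375, 915, 1019): easy mode, clocks unchanged
print(estimate(power_target_pct=35, offset_mhz=100))  # (375, 1015, 1119): offset lifts every bin
```

In practice GPU Boost will also step above the typical top bin when power and thermal headroom allow, which is why real cards can end up noticeably higher than this simple estimate.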

NVIDIA’s goal with the GTX 690 was not just to reach GTX 680 SLI performance, but also to match the GTX 680’s overclocking capabilities. We’ll get to our full results in our overclocking performance section, but for the time being we’ll leave it at this: we hit 1040MHz base, 1183MHz boost, and 7GHz memory on our GTX 690; even without overvolting it’s a capable overclocker.
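
As a quick sanity check on those numbers, the gains work out to roughly 14% on the base clock and about 16% on the boost and memory clocks (taking the GTX 690’s 6.008Gbps reference memory data rate as the baseline):

```python
# Percentage gains of our overclock versus the GTX 690's stock clocks.
stock    = {"base": 915, "boost": 1019, "memory": 6008}   # MHz (memory as data rate, MT/s)
achieved = {"base": 1040, "boost": 1183, "memory": 7000}

for name, value in stock.items():
    print(f"{name}: +{(achieved[name] / value - 1) * 100:.1f}%")
# base: +13.7%, boost: +16.1%, memory: +16.5%
```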

Comments

  • james.jwb - Thursday, May 3, 2012 - link

    You are correct, I don't own one... I own three in triple screen. Dell U2412m's.

    I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.
  • CeriseCogburn - Friday, May 4, 2012 - link

    Yes of course you are at a loss, you don't understand a word so why reply?
    You're all at a loss.
    ROFL
  • yelnatsch517 - Friday, May 4, 2012 - link

    Are you being sarcastic or an idiot?
    From my experience 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all in that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.

    If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    There are 242 - count them, well over 200, nearly 250 1920X1080 monitors at the egg.
    In your great experience, there are 16 that fit your 1920X1200 dreampipe FUD scenario at the egg, with most of them, well over half, over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about $10 difference in videocard prices are well under $200 each a lot of the time.
    So now suddenly, you all spend way over $300 to plus $400 for 11% more pixels... ROFL HAHAHHAHHA instead of $150 or $200...
    I guess that's why this place is so biased, the little bloggers are just as whacked when it comes to being honest.
  • InsaneScientist - Saturday, May 5, 2012 - link

    Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.

    I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common; you are quite correct there. However, I must point out that the logic you used to arrive at that conclusion is faulty:
    You're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units each of those models moves.
    If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080, nearly as many of the 1920x1200 will have been sold as the 1920x1080 ones.
    Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion.
    Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.

    The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate number of people to be using them compared to their current sales.

    Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
  • CeriseCogburn - Tuesday, May 8, 2012 - link

    Blah blah blah blah and I'm still 100% correct and you are not at all.
  • Decembermouse - Tuesday, May 8, 2012 - link

    You're quite a character.
  • anirudhs - Thursday, May 3, 2012 - link

    I use 2 at work - HP ZR24W.
  • piroroadkill - Sunday, May 6, 2012 - link

    Hm, odd.
    Not only do I have 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. Using one right now.
    Yes, they're rarer than 1080p screens, but this is a site for enthusiasts, so it's more likely that readers here have them.
  • Ryan Smith - Thursday, May 3, 2012 - link

    The truth is a bit simpler than that. We test at 5760x1200 because our choice of monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same resolution + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.
