Overclocking

With the GTX 590 NVIDIA found themselves with a bit of a PR problem. Hardcore overclockers had managed to send their GTX 590s to a flaming death, which made the GTX 590 look bad and required that NVIDIA lock down all voltage control so that no one else could repeat the feat. The GTX 590 was a solid card at stock, but NVIDIA never designed it for overvolting, and indeed I’m not sure you could even say it was designed for overclocking since it was already running at a 365W TDP.

Since that incident NVIDIA has taken a much harder stance on overvolting, which we first saw with the GTX 680. The reference GTX 680 could not be overvolted, with voltage options limited to whatever voltage the top GPU boost bin used (typically 1.175v). That principle continues with the GTX 690: there will not be any overvolting options.

However, this is not to say that the GTX 690 isn't built for overclocking. The GTX 680 still has some overclocking potential thanks to some purposeful use of design headroom, and the GTX 690 follows suit. In fact it's much the same story as with AMD's Radeon HD 5970 and 6990, both of which shipped in configurations that kept power consumption at standard levels while also offering modes that unlocked overclocking potential in exchange for greater power consumption (e.g. AUSUM). As we've previously mentioned, the GTX 690 is designed to be able to handle up to 375W even though it ships in a 300W configuration, and that 75W is our overclocking headroom.

NVIDIA will be exposing the GTX 690’s overclocking options through a combination of power targets and clock offsets, just as with the GTX 680. This in turn means that the GTX 690 effectively has two overclocking modes:

  1. Power target overclocking. By just raising the power target (max +35%) you can increase how often the GTX 690 can boost and how frequently it can hit its max boost bin. Raising the power target only improves performance in games/applications that are being held back by NVIDIA's power limiter, but in return this is easy mode overclocking, as all of the GPU boost bins are already qualified for stability. In other words, this is the GTX 690's higher performance, higher power 375W mode. (A minimal code sketch of this knob follows the list.)
  2. Power target + offset overclocking. By using clock offsets it's possible to further raise the performance of the GTX 690, and to do so across all games and applications. The lack of overvolting support means there isn't a ton of headroom for the offset, but as it stands NVIDIA's clocks are conservative for power purposes, and Kepler is clearly capable of more than 915MHz/1019MHz. This of course requires testing for stability, and it should be noted that because NVIDIA's GPU boost bins already go so far above the base clock, it won't take much of an offset to be boosting into 1.2GHz+.
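
For those who would rather script the power-target knob than drive it through a vendor utility, NVIDIA's NVML library exposes board power limits programmatically. The sketch below is a minimal illustration under stated assumptions, not NVIDIA's own tooling: it assumes a driver and board that permit power-limit changes through NVML (consumer cards of this era were generally tuned through utilities like EVGA Precision instead), and it simply raises the limit to whatever maximum the board reports. On a dual-GPU card like the GTX 690, each GPU appears as its own NVML device, so index 0 is just the first of the pair.

    /* Minimal sketch of the power-target knob via NVML. Assumes the
     * driver and board expose power-limit control; consumer cards of
     * the GTX 690 era were generally tuned through vendor utilities
     * (EVGA Precision, etc.) rather than this interface. */
    #include <stdio.h>
    #include <nvml.h>

    int main(void)
    {
        if (nvmlInit() != NVML_SUCCESS)
            return 1;

        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) {
            nvmlShutdown();
            return 1;
        }

        unsigned int cur = 0, min = 0, max = 0;   /* all in milliwatts */
        nvmlDeviceGetPowerManagementLimit(dev, &cur);
        nvmlDeviceGetPowerManagementLimitConstraints(dev, &min, &max);
        printf("power limit: %u mW (allowed %u-%u mW)\n", cur, min, max);

        /* Raise the limit to the board maximum - the "375W mode".
         * Needs admin rights; fails cleanly if the board disallows it. */
        nvmlReturn_t rc = nvmlDeviceSetPowerManagementLimit(dev, max);
        if (rc != NVML_SUCCESS)
            fprintf(stderr, "set failed: %s\n", nvmlErrorString(rc));

        nvmlShutdown();
        return 0;
    }

Compile with something like gcc oc.c -o oc -lnvidia-ml; clock offsets, by contrast, have no comparably standard public interface and are left to the vendor tools.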

NVIDIA’s goal with the GTX 690 was not just to reach GTX 680 SLI performance, but also to match the GTX 680’s overclocking capabilities. We’ll get to our full results in our overclocking performance section, but for the time being we’ll leave it at this: we hit 1040MHz base (a +125MHz offset), 1183MHz boost, and 7GHz memory on our GTX 690; even without overvolting it’s a capable overclocker.
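
As a sanity check on results like these, the same NVML interface can report what the card is actually doing under load. This is again a hedged sketch rather than our test methodology: it assumes NVML exposes clock and power queries for the board, and note that NVML reports the actual memory clock, not the effective GDDR5 data rate.

    /* Hedged sketch: poll clocks and power draw while a game or
     * stress test runs, to see how often the card holds its top
     * boost bins. Assumes NVML exposes these queries for the board. */
    #include <stdio.h>
    #include <unistd.h>
    #include <nvml.h>

    int main(void)
    {
        if (nvmlInit() != NVML_SUCCESS)
            return 1;

        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) {
            nvmlShutdown();
            return 1;
        }

        for (int i = 0; i < 30; i++) {            /* ~30s of samples */
            unsigned int core = 0, mem = 0, mw = 0;
            nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &core);
            nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &mem);
            nvmlDeviceGetPowerUsage(dev, &mw);    /* milliwatts */
            printf("core %4u MHz  mem %4u MHz  %6.1f W\n",
                   core, mem, mw / 1000.0);
            sleep(1);
        }

        nvmlShutdown();
        return 0;
    }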

Comments

  • von Krupp - Saturday, May 5, 2012

    Not precisely. That $350 performance point? It used to be a $200 performance point. Similarly, that $350 point will turn into a $400 performance point. So, assuming I maintain the price tier, graphics returns for my dollar are gradually tapering off. I look at the performance I was getting out of my 7800 GT at 1280x1024, and it wasn't worth upgrading to a newer card, period, because of Windows XP, my single core CPU, and the fact that I was already maxing out every game I had and still getting decent frame rates. I think the key factor is that I do not care if I dip below 60 frames, as long as I'm above 30 and getting reasonable frame times.

    I also know that consoles extend the life of PC hardware. The 7800 GT is a 20-pipe version of the GTX, which is in turn the GPU found in the PS3. Devs have gotten much better at optimization in titles that matter to me.
  • CeriseCogburn - Saturday, May 5, 2012

    You spend well over $1,600 on a decent system.
    It makes no sense to spend all that money, then buy monitors the cards in question cannot successfully drive in the 3-year-old Crysis, let alone in well over half the benchmarks in this article, without turning DOWN the settings.
    You cannot turn up DX11 tessellation; keep it on medium.
    You cannot turn up MSAA past 4X, and had better keep it at 2X.
    You had better turn down your view distance in game.
    And that, in fact, is with all the moaning about "console ports" "holding us back".
    I get it; the obvious problem is none of you seem to, because you want to moan and pretend spending $1,000.00 or more on a monitor alone is "how it's done", while you whine that you cannot even afford $500 for a single video card.
    These cards successfully drive 1920x1080 monitors in the benchmarks, but just barely - and if you turn the eye candy up, they cannot do it.
  • CeriseCogburn - Saturday, May 5, 2012

    Thanks for telling everyone how correct I am by doing a pure 100% troll attack after you and yours could not avoid the facts.
    Your mommy, if you knew who she was, must be very disappointed.
  • geok1ng - Sunday, May 6, 2012

    This card was not built for 2560x1600 gaming. A single 680 is more than enough for that.
    The 690 was built for 5760x1200 gaming.

    I would like to see triple 30" tests. Nothing like gaming at 7680x1600 to feel that you are spending your VGA money well.
  • CeriseCogburn - Sunday, May 6, 2012

    You can use cards 2 generations back for that, but like these cards, you will be turning down most if not all of the eye candy, and be stuck tweaking and clocking, and jittering and wishing you had more power.
    These cards cannot handle 1920x1080 in current "console port" games unless you turn them down, and that goes ESPECIALLY for the AMD cards, which suck at extreme tessellation and have more issues with anything above 4XAA, and often with 4XAA itself.
    The 5770 is an Eyefinity card and runs 5760x1200 too.
    I guess none of you will ever know until you try it, and it appears none of you have spent the money and become disappointed turning down the eye candy settings - so blabbering about resolutions is all you have left.
  • _vor_ - Tuesday, May 8, 2012

    "... blabbering..."

    Pot, meet kettle.
  • CeriseCogburn - Sunday, May 6, 2012

    They cost $400 to $2,000 plus, not $150 like the 24" 1080p monitors.
    Thanks for playing.
  • hechacker1 - Monday, May 7, 2012

    Nope, you can already get 27", 2560x1440 IPS panels (the same panels Apple uses) for $400.

    They're rare, but currently they are building them in batches of 1000 to see how strong demand is for them.

    Sure, the 120Hz will sort of go to waste due to the slow IPS switching speed, but it will accept that signal with 0 input lag.

    The only problem is that only the 680 seems to have a RAMDAC fast enough to do 120Hz; Radeons tend to cap out at 85Hz.
  • marine73 - Monday, May 7, 2012

    After checking Newegg it would seem that, unfortunately for Nvidia, this will be another piece of vaporware. Perhaps they should shrink Kepler to 22nm and contract Intel to fab it, since TSMC has major issues with 28nm. Just a thought.
  • marine73 - Monday, May 7, 2012

    I guess I should retract my comments about TSMC as other customers are not experiencing supply issues with 28nm parts. Apparently the issues are with Nvidia's design, which may require another redo. I'm guessing AMD will be out with their 8000 series before Nvidia gets their act together. Sad because I have used several generations of Nvidia cards and was always happy with them.
