GPU Boost 2.0: Overclocking & Overclocking Your Monitor

The first half of the GPU Boost 2.0 story is of course the fact that with 2.0 NVIDIA is switching from power based controls to temperature based controls. However, there is also a second story here, and that is the impact on overclocking.

With the GTX 680, overclocking capabilities were limited, particularly in comparison to the GeForce 500 series. The GTX 680 could have its power target raised (guaranteed “overclocking”), and further overclocking could be achieved by using clock offsets. But perhaps most importantly, voltage control was forbidden, with NVIDIA going so far as to nix EVGA and MSI’s voltage-adjustable products after a short time on the market.

There are a number of reasons for this, and hopefully one day soon we’ll be able to get into NVIDIA’s Project Greenlight video card approval process in significant detail so that we can better explain this, but the primary concern was that without strict voltage limits, some of the more excessive users might blow out their cards by setting voltages too high. And while the responsibility for this ultimately falls to the user, and in some cases the manufacturer of their card (depending on the warranty), it makes NVIDIA look bad regardless. The end result was that voltage control on the GTX 680 (and lower cards) was disabled for everyone, regardless of what a card was capable of.

With Titan this has finally changed, at least to some degree. In short, NVIDIA is bringing back overvoltage control, albeit in a more limited fashion.

For Titan cards, partners will have the final say in whether they wish to allow overvolting or not. If they choose to allow it, they get to set a maximum voltage (Vmax) figure in their VBIOS. The user in turn is allowed to increase their voltage beyond NVIDIA’s default reliability voltage limit (Vrel) up to Vmax. As part of the process however users have to acknowledge that increasing their voltage beyond Vrel puts their card at risk and may reduce the lifetime of the card. Only once that’s acknowledged will users be able to increase their voltages beyond Vrel.
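To make that workflow a bit more concrete, here is a minimal sketch of the clamping behavior described above. The voltage figures, the VREL_MV/VMAX_MV constants, and the clamp_requested_voltage function are all hypothetical, purely for illustration; they do not reflect NVIDIA's actual limits or any real overclocking tool's API.

    # Illustrative sketch of the Vrel/Vmax overvolting rules described above.
    # All figures and names are hypothetical, not NVIDIA's actual limits.

    VREL_MV = 1162   # hypothetical default reliability voltage limit (Vrel)
    VMAX_MV = 1200   # hypothetical partner-set maximum from the VBIOS (Vmax)

    def clamp_requested_voltage(requested_mv: int, user_acknowledged_risk: bool) -> int:
        """Clamp a requested GPU voltage according to the Vrel/Vmax rules.

        Without the user's acknowledgement the card never exceeds Vrel; with it,
        the request may go as high as the partner's Vmax, but no further.
        """
        ceiling = VMAX_MV if user_acknowledged_risk else VREL_MV
        return min(requested_mv, ceiling)

    # Example: a 1.3V request is held to Vrel until the risk warning is accepted.
    print(clamp_requested_voltage(1300, user_acknowledged_risk=False))  # -> 1162
    print(clamp_requested_voltage(1300, user_acknowledged_risk=True))   # -> 1200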

With that in mind, beyond overvolting, overclocking has also changed in some subtler ways. Memory and core offsets are still in place, but with the switch from power based monitoring to temperature based monitoring, the power target slider has been augmented with a separate temperature target slider.

The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.

First and foremost, of course, the temperature slider controls what the target temperature is for Titan. By default this is 80C, and it may be turned all the way up to 95C. The higher the temperature setting, the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking, just as the power target adjustment was on the GTX 680.
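The interplay between the temperature target and boosting can be illustrated with a rough sketch. The boost bins, temperatures, wattages, and the pick_boost_bin stepping logic below are all made up for illustration; the real GPU Boost 2.0 algorithm is considerably more sophisticated than a one-bin-at-a-time loop.

    # Rough sketch of a temperature-prioritized boost decision. All values are
    # illustrative and do not correspond to Titan's actual boost table.

    BOOST_BINS_MHZ = [837, 850, 863, 876, 889, 902, 915]  # hypothetical 13MHz steps

    def pick_boost_bin(gpu_temp_c: float, temp_target_c: float,
                       board_power_w: float, power_target_w: float,
                       current_bin: int) -> int:
        """Step the boost bin down if either limit is exceeded, otherwise up."""
        if gpu_temp_c > temp_target_c or board_power_w > power_target_w:
            return max(current_bin - 1, 0)                     # back off one bin
        return min(current_bin + 1, len(BOOST_BINS_MHZ) - 1)   # room to climb a bin

    # Raising the target from 80C to 95C lets the same 84C reading keep boosting:
    print(BOOST_BINS_MHZ[pick_boost_bin(84, 80, 230, 250, current_bin=4)])  # 876, backed off
    print(BOOST_BINS_MHZ[pick_boost_bin(84, 95, 230, 250, current_bin=4)])  # 902, stepped up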

The second function controlled by the temperature slider is the fan curve, which for all practical purposes follows the temperature slider. With modern video cards ramping up their fan speeds rather quickly once they get into the 80C range, merely increasing the temperature target alone wouldn’t be particularly desirable in most cases due to the extra noise it would generate, so NVIDIA tied the fan curve to the temperature slider. By doing so it ensures that fan speeds stay relatively low until temperatures start exceeding the temperature target. This seems a bit counterintuitive at first, but when put in perspective of the goal – higher temperatures without an increase in fan speed – this starts to make sense.
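Here is a minimal sketch of that fan curve behavior, assuming a simple linear ramp that only begins once the GPU nears the target; the fan_speed_pct function, the duty cycle percentages, and the ramp window are illustrative assumptions, not Titan's actual curve.

    # Minimal sketch: the fan curve slides along with the temperature target, so
    # the fan stays near its floor until the GPU approaches the target, then ramps.

    def fan_speed_pct(gpu_temp_c: float, temp_target_c: float,
                      idle_pct: float = 30.0, max_pct: float = 85.0,
                      ramp_window_c: float = 10.0) -> float:
        """Return a fan duty cycle that only ramps once temperature nears the target."""
        ramp_start = temp_target_c - ramp_window_c
        if gpu_temp_c <= ramp_start:
            return idle_pct
        # Linear ramp from idle_pct at ramp_start up to max_pct at the target.
        fraction = min((gpu_temp_c - ramp_start) / ramp_window_c, 1.0)
        return idle_pct + fraction * (max_pct - idle_pct)

    # With an 80C target the fan is already ramping at 78C; raise the target to
    # 95C and the same 78C reading keeps the fan at its quiet floor.
    print(fan_speed_pct(78, temp_target_c=80))  # 74.0
    print(fan_speed_pct(78, temp_target_c=95))  # 30.0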

Finally, in what can only be described as a love letter to the boys over at 120hz.net, NVIDIA is also introducing a simplified monitor overclocking option, which can be used to increase the refresh rate sent to a monitor in order to coerce it into operating at that higher refresh rate. Notably, this isn’t anything that couldn’t be done before with some careful manipulation of the GeForce control panel’s custom resolution option, but with the monitor overclocking option exposed in PrecisionX and other utilities, monitor overclocking has been reduced to a simple slider rather than a complex mix of timings and pixel counts.
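As a rough illustration of why manual monitor overclocking involved juggling timings and pixel counts, the following back-of-the-envelope sketch computes the pixel clock a given mode requires. The pixel_clock_mhz helper and its default blanking figures are approximate, CVT-style reduced-blanking assumptions, not the exact timings of any particular monitor.

    # The pixel clock scales with total (active + blanking) pixels and refresh rate.
    # Blanking defaults are rough reduced-blanking estimates for illustration only.

    def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                        h_blank: int = 160, v_blank: int = 44) -> float:
        """Required pixel clock in MHz for the given mode."""
        h_total = h_active + h_blank
        v_total = v_active + v_blank
        return h_total * v_total * refresh_hz / 1e6

    # 2560x1440 at the stock 60Hz vs. an overclocked 120Hz:
    print(round(pixel_clock_mhz(2560, 1440, 60), 1))   # ~242.2 MHz
    print(round(pixel_clock_mhz(2560, 1440, 120), 1))  # ~484.4 MHz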

Though this feature can technically work with any monitor, it’s primarily geared towards monitors such as the various Korean LG-based 2560x1440 monitors that have hit the market in the past year, a number of which have come with electronics capable of operating far in excess of the 60Hz that is standard for those monitors. On the models that can handle it, modders have been able to push some of these 2560x1440 monitors to 120Hz and beyond, essentially doubling their native 60Hz refresh rate and greatly improving smoothness to levels similar to a native 120Hz TN panel, but without the resolution and quality drawbacks inherent to those TN products.

Of course it goes without saying that just like any other form of overclocking, monitor overclocking can be dangerous and risks breaking the monitor. On that note, out of our monitor collection we were able to get our Samsung 305T up to 75Hz, but the overclock didn’t seem to have any impact on performance, smoothness, or responsiveness; whether that’s down to the panel or the driving electronics we can’t say. This is truly a “your mileage may vary” situation.

Comments

  • Wolfpup - Tuesday, February 19, 2013 - link

    I think so too. And IMO this makes sense...no one NEEDS this card, the GTX 680 is still awesome, and still competitive where it is. They can be selling these elsewhere for more, etc.

    Now, who wants to buy me 3 of them to run Folding @ Home on :-D
  • IanCutress - Tuesday, February 19, 2013 - link

    Doing some heavy compute, this card could pay for itself in a couple of weeks over a 680 or two. On the business side, it all comes down to 'does it make a difference to throughput', and if you can quantify that and cost it up, then it'll make sense. Gaming, well that's up to you. Folding... I wonder if the code needs tweaking a little.
  • wreckeysroll - Tuesday, February 19, 2013 - link

    Price is going to kill this card. See the powerpoints in the previews for performance. Titan is not too much faster than what they have on the market now, so it's not just the same price as a 690, it's 30% slower as well.

    Game customers are not pro customers.

    This card could have been nice before someone slipped a gear at nvidia and thought gamers would eat up this $1000 rip-off. A few will like anything, not many though. A big error was made here in pricing this at $1000. A sane price would have sold many more than this lunacy.

    Nvidia dropped the ball.
  • johnthacker - Tuesday, February 19, 2013 - link

    People doing compute will eat this up, though. I went to NVIDIA's GPU Tech Conference last year; people were clamoring for a GK110-based consumer card for compute, after hearing that Dynamic Parallelism and HyperQ were limited to the GK110 and not on the GK104.

    They will sell as many as they want to people doing compute, and won't care at all if they aren't selling them to gamers, since they'll be making more profit anyway.

    Nvidia didn't drop the ball, it's that they're playing a different game than you think.
  • TheJian - Wednesday, February 20, 2013 - link

    Want to place a bet on them being out of stock on the day they're on sale? I'd be shocked if you can get one within a day, if not a week.

    I thought the $500 Nexus 10 would slow some down, but I had to fight for hours to get one bought, and it sold out in most places in under an hour. I believe most overpriced Apple products have the same problem.

    They are not trying to sell this to the middle class ya know.

    Asus prices the Ares 2 at $1600. They only made 1000 last I checked. These are not going to sell 10 million, and selling for anything less would just mean less money and problems meeting production. You price your product at what you think the market will bear, not what Wreckeysroll thinks the price should be. Performance like a dual-chip card is quite a feat of engineering. Note the Ares 2 uses something like ~475 watts. This will come in around 250W. Again, quite a feat. That's around ~100W less than a 690 also.

    Don't forget this is a card that is $2500 of compute power. Even Amazon had to buy 10,000 K20s just to get a $1500 price on them, and had to also buy $500 insurance for each one to get that deal. You think Amazon is a bunch of idiots? This is a card that fixes the 600 series' weakness and adds substantial performance to boot. It would be lunacy to sell it for under $1000. If we could all afford it they'd make nothing and be out of stock in .5 seconds...LOL
  • chizow - Friday, February 22, 2013 - link

    Except they have been selling this *SAME* class of card for much less than $1000 for the better part of a decade. *SAME* size, same relative performance, same cost to produce. Where have you been, and why do you think it's now OK to sell it for 2x as much when nothing about it justifies the price increase?
  • CeriseCogburn - Sunday, February 24, 2013 - link

    LOL same cost to produce....

    You're insane.
  • Gastec - Wednesday, February 27, 2013 - link

    You forget about the "bragging rights" factor. Perhaps Nvidia won't make many GTX Titans, but all those they do make will definitely sell like warm bread. There are enough "enthusiasts" and other kinds of trolls out there (most of them in the United States) willing to give anything to show the Internet their high scores in various benchmarks and/or post a flashy picture of their shiny "rig".
  • herzwoig - Tuesday, February 19, 2013 - link

    Unacceptable price.
    Less than promised performance.

    Pro customers will get a Tesla; that is what those cards are for, with the attendant support and drivers. Nvidia is selling this as a consumer gaming card and trying to reshape the high-end gaming SKU as an even more premium product (doubly so!!)

    Terrible value, and whatever performance it has going for it is eroded by the nonsensical pricing strategy. Surprising level of miscalculation on the greed front from nvidia...
  • TheJian - Wednesday, February 20, 2013 - link

    They'll pay $2500 to get that unless they buy 10,000 like Amazon (which still paid $2000/card). Unacceptable for you, but I guess you're not their target market. You can get TWO of these for the price of ONE Tesla at $2000, and that's ONLY if you buy 10,000 like Amazon. Heck, for the price of one Tesla I could get two of these, a new i7-3770K, plus a board...LOL. They're selling this as a consumer card with Tesla performance (sans support/insurance). Sounds like they priced it right in line with nearly every other top-of-the-line card released for years in this range: 7990, 690, etc...on down the line.

    Less than promised performance? So you've benchmarked it then?

    "Terrible value and whatever performance it has going for it"
    So you haven't any idea yet, right? Considering a 7990 basically costs $1000 too, and uses 475W vs. 250W while being twice the size, this isn't so nonsensical. This card shouldn't heat up your room either. There are many benefits, you just can't see them beyond those AMD goggles you've got on.
