GPU Boost 2.0: Overclocking & Overclocking Your Monitor

The first half of the GPU Boost 2.0 story is of course the fact that NVIDIA is switching from power based controls to temperature based controls. However, there is also a second story here, and that is the impact on overclocking.

With the GTX 680, overclocking capabilities were limited, particularly in comparison to the GeForce 500 series. The GTX 680 could have its power target raised (guaranteed “overclocking”), and further overclocking could be achieved by using clock offsets. But perhaps most importantly, voltage control was forbidden, with NVIDIA going so far as to nix EVGA and MSI’s voltage adjustable products after a short time on the market.

There are a number of reasons for this, and hopefully one day soon we’ll be able to get into NVIDIA’s Project Greenlight video card approval process in significant detail so that we can better explain it. But the primary concern was that without strict voltage limits, some of the more excessive users might blow out their cards by setting voltages too high. And while the responsibility for that ultimately falls to the user, and in some cases to the manufacturer of the card (depending on the warranty), it makes NVIDIA look bad regardless. The end result was that voltage control on the GTX 680 (and lower cards) was disabled for everyone, regardless of what a given card was capable of.

With Titan this has finally changed, at least to some degree. In short, NVIDIA is bringing back overvoltage control, albeit in a more limited fashion.

For Titan cards, partners will have the final say in whether they wish to allow overvolting or not. If they choose to allow it, they get to set a maximum voltage (Vmax) figure in their VBIOS. The user in turn is allowed to increase their voltage beyond NVIDIA’s default reliability voltage limit (Vrel) up to Vmax. As part of the process however users have to acknowledge that increasing their voltage beyond Vrel puts their card at risk and may reduce the lifetime of the card. Only once that’s acknowledged will users be able to increase their voltages beyond Vrel.
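As a rough illustration of the policy described above, the voltage clamping rules can be sketched as follows. This is a hypothetical model, not NVIDIA's actual firmware interface; the function name and the Vrel figure are assumptions for illustration.

```python
# Hypothetical sketch of the GPU Boost 2.0 overvolting rules.
# VREL is an example value, not a confirmed spec.

VREL = 1.1625  # NVIDIA's default reliability voltage limit (illustrative)

def requested_voltage(user_request: float, vmax: float,
                      user_acknowledged: bool) -> float:
    """Clamp a requested core voltage per the rules described above.

    vmax is the partner-set ceiling from the VBIOS; requests beyond
    Vrel are honored only after the user accepts the risk warning.
    """
    if user_request <= VREL:
        # Anything at or below Vrel is always allowed.
        return user_request
    if not user_acknowledged:
        # Without the acknowledgment, voltage stays capped at Vrel.
        return VREL
    # With acknowledgment, allow up to the partner's Vmax.
    return min(user_request, vmax)
```

The key design point is that Vmax lives in the VBIOS, so a partner who doesn't want to allow overvolting at all can simply set Vmax equal to Vrel.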

With that in mind, beyond overvolting, overclocking has also changed in some subtler ways. Memory and core clock offsets are still in place, but with the switch from power based monitoring to temperature based monitoring, the power target slider has been augmented with a separate temperature target slider.

The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.

First and foremost of course is that the temperature slider controls what the target temperature is for Titan. By default for Titan this is 80C, and it may be turned all the way up to 95C. The higher the temperature setting the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking just like the power target adjustment was on GTX 680.
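The relationship between the temperature target and boost behavior can be sketched loosely as below. Only the 837MHz base clock is Titan's actual specification; the one-bin-per-degree policy, the 13MHz bin step, and the bin count are illustrative assumptions, since NVIDIA does not document the real boost algorithm.

```python
# Illustrative sketch of temperature-target boosting: the further the GPU
# sits below its target temperature, the higher the boost bin it may use.
# The bin step, bin count, and headroom policy are assumptions.

BASE_CLOCK = 837   # MHz, Titan's base clock
BIN_STEP = 13      # MHz per boost bin (assumed granularity)
MAX_BINS = 12      # assumed number of boost bins

def boost_clock(current_temp: float, target_temp: float) -> int:
    """Pick a boost clock from the headroom below the temperature target."""
    headroom = max(0.0, target_temp - current_temp)
    # One bin per degree of headroom, capped at the top bin (illustrative).
    bins = min(MAX_BINS, int(headroom))
    return BASE_CLOCK + bins * BIN_STEP
```

Under this model, raising the target from 80C to 95C hands the card up to 15 more degrees of headroom at any given temperature, which is exactly why the slider acts as a weak form of overclocking.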

The second function controlled by the temperature slider is the fan curve, which for all practical purposes follows the temperature slider. With modern video cards ramping up their fan speeds rather quickly once they get into the 80C range, merely raising the temperature target alone wouldn’t be particularly desirable in most cases due to the extra noise it would generate, so NVIDIA tied the fan curve to the temperature slider. By doing so it ensures that fan speeds stay relatively low until temperatures start exceeding the temperature target. This seems a bit counterintuitive at first, but when put in perspective of the goal – higher temperatures without an increase in fan speed – it starts to make sense.
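A target-relative fan curve of the kind described above might look like the following sketch. All of the duty-cycle numbers and ramp offsets are made-up illustrative values; the point is only that moving the temperature target shifts the whole ramp with it.

```python
# Rough sketch of a fan curve that follows the temperature-target slider:
# raising the target shifts the ramp right, so the fan stays quiet until
# the GPU approaches and then exceeds the target. All numbers are assumed.

def fan_duty(current_temp: float, target_temp: float) -> int:
    """Return a fan duty cycle (percent) for a target-relative ramp."""
    idle_duty, max_duty = 30, 85
    ramp_start = target_temp - 5   # begin ramping shortly before the target
    ramp_end = target_temp + 10    # full speed well past the target
    if current_temp <= ramp_start:
        return idle_duty
    if current_temp >= ramp_end:
        return max_duty
    # Linear interpolation between idle and maximum duty.
    frac = (current_temp - ramp_start) / (ramp_end - ramp_start)
    return round(idle_duty + frac * (max_duty - idle_duty))
```

With a fixed fan curve, raising the temperature target to 95C would buy nothing acoustically, since the fan would still spin up around 80C; tying the curve to the slider is what delivers higher temperatures without the extra noise.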

Finally, in what can only be described as a love letter to the boys over at 120hz.net, NVIDIA is also introducing a simplified monitor overclocking option, which can be used to increase the refresh rate sent to a monitor in order to coerce it into operating at that higher refresh rate. Notably, this isn’t anything that couldn’t be done before with some careful manipulation of the GeForce control panel’s custom resolution option, but with the monitor overclocking option exposed in PrecisionX and other utilities, monitor overclocking has been reduced to a simple slider rather than a complex mix of timings and pixel counts.
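The timing math the slider hides is straightforward: a mode's pixel clock is the total pixels per frame (active plus blanking) multiplied by the refresh rate, so driving the same mode at a higher refresh rate means a proportionally higher pixel clock. The blanking totals below are assumed reduced-blanking-style figures for a 2560x1440 mode, used purely for illustration.

```python
# Pixel clock required to drive a display mode:
# total horizontal pixels * total vertical lines * refresh rate.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Return the pixel clock in MHz for the given mode timings."""
    return h_total * v_total * refresh_hz / 1e6

# Assumed reduced-blanking totals for a 2560x1440 mode (illustrative):
H_TOTAL, V_TOTAL = 2720, 1481

clock_60 = pixel_clock_mhz(H_TOTAL, V_TOTAL, 60)    # ~241.7 MHz
clock_120 = pixel_clock_mhz(H_TOTAL, V_TOTAL, 120)  # ~483.4 MHz
```

Doubling the refresh rate doubles the pixel clock, which is precisely the load the monitor's driving electronics have to tolerate for a 120Hz overclock to stick.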

Though this feature can technically work with any monitor, it’s primarily geared towards monitors such as the various Korean LG-panel-based 2560x1440 monitors that have hit the market in the past year, a number of which ship with electronics capable of operating far in excess of their standard 60Hz. On the models that can handle it, modders have been able to push some of these 2560x1440 monitors to 120Hz and beyond, doubling the native 60Hz refresh rate and greatly improving smoothness to levels similar to a native 120Hz TN panel, but without the resolution and quality drawbacks inherent to those TN products.

Of course it goes without saying that just like any other form of overclocking, monitor overclocking can be dangerous and risks breaking the monitor. On that note, out of our monitor collection we were able to get our Samsung 305T up to 75Hz; whether that limit is down to the panel or the driving electronics we can’t say, and the bump didn’t seem to have any noticeable impact on performance, smoothness, or responsiveness. This is truly a “your mileage may vary” situation.


  • TheJian - Wednesday, February 20, 2013 - link

    http://www.guru3d.com/articles-pages/geforce_gtx_t...
1176MHz from 876MHz (boost). Not bad; basically a $2500 K20 for $1000. I've never done homework on it, but I don't think K20s overclock, though I could be wrong.

    Can't wait to see the review tomorrow. Clearly he'll bench it there and he has 3 :) You should get your answers then :)

    I'm wondering if some hacker will enable the K20 drivers, or if that's possible. It seems a lot of reviewers got 3, so you should have lots of data by weekend.
  • Bill Brasky - Tuesday, February 19, 2013 - link

There were rumors this card would launch at $799-$899, which made more sense. But for a grand this thing had better be pretty darn close to a 690.
  • wand3r3r - Tuesday, February 19, 2013 - link

    The price tag just makes this card a failure. It's a 580 replacement no matter how they label it, so they can shove it. They lost a potential customer...
  • karasaj - Tuesday, February 19, 2013 - link

    So what is the 680?
  • Sandcat - Tuesday, February 19, 2013 - link

A GK104, which replaced the GF104.

    The GK110 is the replacement for the GF110, which was the GTX 580.
  • Ananke - Tuesday, February 19, 2013 - link

The 680 was meant as a 560 Ti replacement...however, NVIDIA decided it turned out too good to be sold too cheap, and changed the model numbering...I have several close friends in marketing at NV :)
However, NV has used this GK110 core for HPC from the very beginning in the Quadro cards, since there they really cannot skimp on the double precision.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    BS.
    The 680 core is entirely different, the rollout time is over half a year off, and that just doesn't happen on a whim in Jan2012 with the post mental breakdown purported 7970 epic failure by amd...

    So after the scat lickers spew the 7970 amd failure, they claim it's the best card ever, even now.
    R O F L

    Have it both ways rumor mongering retreads. No one will notice... ( certainly none of you do).
  • rolodomo - Tuesday, February 19, 2013 - link

Their business model has become separating money from the wallets of the well-to-do who have no sense of value or technology (NVIDIA's PR admits this in the article). It is a business model, but a boutique one. It doesn't do much for their brand name in the view of the technorati either (NVIDIA: We Market to Suckers).
  • Wreckage - Tuesday, February 19, 2013 - link

    It's almost as fast as a pair of 7970's that cost $1100 at launch.

    AMD set the bar on high prices. Now that they are out of the GPU race, don't expect much to change.

    At least NVIDIA was able to bring a major performance increase this year. While AMD has become the new Matrox.
  • Stuka87 - Tuesday, February 19, 2013 - link

AMD is out of the GPU race? What are you smoking? A $1000 card does not put AMD out of the GPU race. The 7970GE competes well with the 680 for less money (they go back and forth depending on the game).

    Now if this card was priced at $500 then that would hurt AMD as the prices on the 660/670/680 would all drop. But its not the case, so your point is moot. Not to mention this card was due out a year ago, and it got delayed. Which is why the GK104 was bumped up to the 680 slot.
