GPU Boost 2.0: Overclocking & Overclocking Your Monitor

The first half of the GPU Boost 2.0 story is of course the fact that with 2.0 NVIDIA is switching from power-based controls to temperature-based controls. However, there is also a second story here, and that is the impact on overclocking.

With the GTX 680, overclocking capabilities were limited, particularly in comparison to the GeForce 500 series. The GTX 680 could have its power target raised (guaranteed “overclocking”), and further overclocking could be achieved by using clock offsets. But perhaps most importantly, voltage control was forbidden, with NVIDIA going so far as to nix EVGA’s and MSI’s voltage-adjustable products after a short time on the market.

There are a number of reasons for this, and hopefully one day soon we’ll be able to get into NVIDIA’s Project Greenlight video card approval process in significant detail so that we can better explain it. But the primary concern was that without strict voltage limits, some of the more excessive users might blow out their cards by setting voltages too high. And while the responsibility for this ultimately falls to the user, and in some cases to the manufacturer of the card (depending on the warranty), it makes NVIDIA look bad regardless. The end result is that voltage control on the GTX 680 (and lower cards) was disabled for everyone, regardless of what a given card was capable of.

With Titan this has finally changed, at least to some degree. In short, NVIDIA is bringing back overvoltage control, albeit in a more limited fashion.

For Titan cards, partners will have the final say in whether they wish to allow overvolting or not. If they choose to allow it, they get to set a maximum voltage (Vmax) figure in their VBIOS. The user in turn is allowed to increase their voltage beyond NVIDIA’s default reliability voltage limit (Vrel) up to Vmax. As part of the process however users have to acknowledge that increasing their voltage beyond Vrel puts their card at risk and may reduce the lifetime of the card. Only once that’s acknowledged will users be able to increase their voltages beyond Vrel.
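The Vrel/Vmax rules above amount to a simple clamping policy. The sketch below is purely illustrative – the function name, the millivolt figures, and the idea of a single driver-side check are our own stand-ins, not NVIDIA's actual firmware or driver interface:

```python
# Illustrative sketch of the GPU Boost 2.0 overvolting rules described above.
# VREL_MV and the parameter names are invented for illustration.

VREL_MV = 1162   # NVIDIA's default reliability voltage limit (illustrative value)

def clamp_voltage(requested_mv: int, vmax_mv: int, user_acknowledged_risk: bool) -> int:
    """Return the voltage that would actually be allowed.

    - Up to Vrel: always permitted.
    - Between Vrel and the partner's VBIOS Vmax: only after the user
      acknowledges the reduced-lifetime risk.
    - Above Vmax: never permitted.
    """
    if requested_mv <= VREL_MV:
        return requested_mv
    if not user_acknowledged_risk:
        return VREL_MV                  # silently cap at the reliability limit
    return min(requested_mv, vmax_mv)   # overvolted, but never past Vmax
```

Note that the partner's Vmax acts as a hard ceiling even for users who accept the risk; the acknowledgment only unlocks the Vrel-to-Vmax range.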

With that in mind, beyond overvolting, overclocking has also changed in some subtler ways. Memory and core offsets are still in place, but with the switch from power-based monitoring to temperature-based monitoring, the power target slider has been augmented with a separate temperature target slider.

The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.

First and foremost of course is that the temperature slider controls what the target temperature is for Titan. By default for Titan this is 80C, and it may be turned all the way up to 95C. The higher the temperature setting the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking just like the power target adjustment was on GTX 680.
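As a rough mental model of the behavior described above – not NVIDIA's actual algorithm, whose bin sizes and hysteresis are unpublished – the GPU climbs boost bins while it runs under its temperature target and steps back down once it exceeds the target. The clock and bin figures here are our own approximations:

```python
# Hypothetical sketch of temperature-target boosting. Bin size, bin count,
# and the one-bin-per-step policy are assumptions for illustration only.

BASE_CLOCK_MHZ = 837   # Titan's base clock
BIN_STEP_MHZ = 13      # assumed size of one boost bin

def next_boost_bin(current_bin: int, gpu_temp_c: float, target_c: float,
                   max_bin: int = 10) -> int:
    """Step up one bin while under the temperature target, down one when over."""
    if gpu_temp_c < target_c:
        return min(current_bin + 1, max_bin)
    return max(current_bin - 1, 0)

def clock_for_bin(bin_index: int) -> int:
    """Translate a boost bin index into a core clock."""
    return BASE_CLOCK_MHZ + bin_index * BIN_STEP_MHZ
```

Raising the target from 80C to 95C simply means the "step up" branch is taken at temperatures that would previously have forced a step down, which is why it acts as a weak form of overclocking.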

The second function controlled by the temperature slider is the fan curve, which for all practical purposes follows the temperature slider. With modern video cards ramping up their fan speeds rather quickly once they get into the 80C range, merely increasing the temperature target alone wouldn’t be particularly desirable in most cases due to the extra noise it would generate, so NVIDIA tied the fan curve to the temperature slider. By doing so it ensures that fan speeds stay relatively low until temperatures start exceeding the temperature target. This seems a bit counterintuitive at first, but when put in the perspective of the goal – higher temperatures without an increase in fan speed – it starts to make sense.

Finally, in what can only be described as a love letter to the boys over at 120hz.net, NVIDIA is also introducing a simplified monitor overclocking option, which can be used to increase the refresh rate sent to a monitor in order to coerce it into operating at that higher refresh rate. Notably, this isn’t anything that couldn’t be done before with some careful manipulation of the GeForce control panel’s custom resolution option, but with the monitor overclocking option exposed in PrecisionX and other utilities, monitor overclocking has been reduced to a simple slider rather than a complex mix of timings and pixel counts.

Though this feature can technically work with any monitor, it’s primarily geared towards monitors such as the various Korean LG-based 2560x1440 monitors that have hit the market in the past year, a number of which ship with electronics capable of operating far in excess of their standard 60Hz refresh rate. On models that can handle it, modders have been able to push some of these 2560x1440 monitors to 120Hz and beyond, doubling their native refresh rate and greatly improving smoothness to levels similar to a native 120Hz TN panel, but without the resolution and quality drawbacks inherent to those TN products.
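The "complex mix of timings and pixel counts" the slider hides ultimately reduces to pixel-clock arithmetic: total pixels per frame (active plus blanking) times refresh rate. The blanking figures below loosely follow reduced-blanking conventions but are assumptions; exact totals vary per monitor:

```python
# Rough pixel-clock estimate for a given mode. The default blanking intervals
# are illustrative approximations, not any specific monitor's EDID timings.

def required_pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                             h_blank: int = 160, v_blank: int = 30) -> float:
    """Pixel clock in MHz = horizontal total * vertical total * refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440 at the standard 60Hz versus an overclocked 120Hz:
clk_60 = required_pixel_clock_mhz(2560, 1440, 60)    # ~240 MHz
clk_120 = required_pixel_clock_mhz(2560, 1440, 120)  # ~480 MHz
```

Doubling the refresh rate doubles the required pixel clock, and that roughly doubled clock is what the monitor's driving electronics (and the display link feeding it) must actually sustain – which is exactly where many panels give out.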

Of course it goes without saying that, just like any other form of overclocking, monitor overclocking can be dangerous and risks breaking the monitor. On that note, out of our monitor collection we were able to get our Samsung 305T up to 75Hz, though whether due to the panel or the driving electronics, the higher refresh rate didn’t seem to have any noticeable impact on performance, smoothness, or responsiveness. This is truly a “your mileage may vary” situation.
