GPU Boost 2.0: Overclocking & Overclocking Your Monitor

The first half of the GPU Boost 2.0 story is of course the fact that with 2.0 NVIDIA is switching from power based controls to temperature based controls. However, there is also a second story here, and that is the impact on overclocking.

With the GTX 680, overclocking capabilities were limited, particularly in comparison to the GeForce 500 series. The GTX 680 could have its power target raised (guaranteed “overclocking”), and further overclocking could be achieved by using clock offsets. But perhaps most importantly, voltage control was forbidden, with NVIDIA going so far as to nix EVGA and MSI’s voltage adjustable products after a short time on the market.

There are a number of reasons for this, and hopefully one day soon we’ll be able to get into NVIDIA’s Project Greenlight video card approval process in significant detail so that we can better explain this, but the primary concern was that without strict voltage limits some of the more excessive users might blow out their cards with voltages set too high. And while the responsibility for this ultimately falls to the user, and in some cases the manufacturer of their card (depending on the warranty), it makes NVIDIA look bad regardless. The end result was that voltage control on the GTX 680 (and lower cards) was disabled for everyone, regardless of what a given card was capable of.

With Titan this has finally changed, at least to some degree. In short, NVIDIA is bringing back overvoltage control, albeit in a more limited fashion.

For Titan cards, partners will have the final say in whether they wish to allow overvolting or not. If they choose to allow it, they get to set a maximum voltage (Vmax) figure in their VBIOS. The user in turn is allowed to increase their voltage beyond NVIDIA’s default reliability voltage limit (Vrel) up to Vmax. As part of the process however users have to acknowledge that increasing their voltage beyond Vrel puts their card at risk and may reduce the lifetime of the card. Only once that’s acknowledged will users be able to increase their voltages beyond Vrel.
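The gating logic described above can be sketched in a few lines. This is purely an illustrative model of the Vrel/Vmax rules, not NVIDIA's implementation; the voltage figures and function name are our own assumptions.

```python
VREL = 1.2000  # NVIDIA's reliability voltage limit (illustrative value, in volts)

def max_allowed_voltage(vbios_vmax: float, overvolting_enabled: bool,
                        user_acknowledged_risk: bool) -> float:
    """Return the highest voltage a user may request under the GPU Boost 2.0 rules."""
    if not overvolting_enabled:
        return VREL              # partner disabled overvolting in the VBIOS
    if not user_acknowledged_risk:
        return VREL              # going past Vrel requires explicit user consent
    return vbios_vmax            # partner-set ceiling (Vmax) from the VBIOS
```

The key property is that Vmax is a hard ceiling set by the board partner; the user acknowledgment only unlocks the range between Vrel and Vmax.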

With that in mind, beyond overvolting, overclocking has also changed in some subtler ways. Memory and core offsets are still in place, but with the switch from power based monitoring to temperature based monitoring, the power target slider has been augmented with a separate temperature target slider.

The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.

First and foremost, of course, the temperature slider controls Titan's target temperature. By default this is 80C, and it may be turned all the way up to 95C. The higher the temperature setting, the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking, just like the power target adjustment was on the GTX 680.
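The relationship between the temperature target and boost bins can be sketched as follows. The bin values and the mapping are illustrative assumptions, not NVIDIA's actual boost algorithm; the point is only that more thermal headroom means higher bins, which is why raising the target from 80C to 95C acts as a weak overclock.

```python
BOOST_BINS_MHZ = [836, 849, 862, 875, 888, 901]  # illustrative ~13MHz bins

def select_boost_bin(gpu_temp_c: float, temp_target_c: float) -> int:
    """Pick a boost bin: full boost with thermal headroom, throttle near the target."""
    headroom = temp_target_c - gpu_temp_c
    if headroom >= 10:
        return BOOST_BINS_MHZ[-1]    # plenty of headroom: top bin
    if headroom <= 0:
        return BOOST_BINS_MHZ[0]     # at or over target: base boost bin
    # linearly map 0..10C of headroom onto the available bins
    idx = int(headroom / 10 * (len(BOOST_BINS_MHZ) - 1))
    return BOOST_BINS_MHZ[idx]
```

With this model, a card sitting at 78C reaches higher bins against a 95C target than against the default 80C target, mirroring the behavior described above.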

The second function controlled by the temperature slider is the fan curve, which for all practical purposes follows the temperature slider. With modern video cards ramping up their fan speeds rather quickly once they get into the 80C range, merely increasing the temperature target alone wouldn’t be particularly desirable in most cases due to the extra noise generated, so NVIDIA tied the fan curve to the temperature slider. Doing so ensures that fan speeds stay relatively low until temperatures start exceeding the temperature target. This seems a bit counterintuitive at first, but put in perspective of the goal – higher temperatures without an increase in fan speed – it starts to make sense.
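A fan curve keyed to the user's temperature target, rather than to a fixed temperature, might look something like the following sketch. The idle speed and ramp slope are assumptions for illustration only.

```python
def fan_speed_pct(gpu_temp_c: float, temp_target_c: float,
                  idle_pct: float = 30.0, max_pct: float = 85.0) -> float:
    """Stay quiet below the temperature target, ramp quickly once it is exceeded."""
    if gpu_temp_c <= temp_target_c:
        return idle_pct
    # assumed ramp of ~5.5% fan speed per degree C above the target, capped
    return min(max_pct, idle_pct + (gpu_temp_c - temp_target_c) * 5.5)
```

Moving the temperature slider from 80C to 95C shifts the entire ramp upward, which is exactly why raising the target yields higher sustained boost clocks without a corresponding increase in noise.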

Finally, in what can only be described as a love letter to the monitor overclocking crowd, NVIDIA is also introducing a simplified monitor overclocking option, which can be used to increase the refresh rate sent to a monitor in order to coerce it into operating at that higher refresh rate. Notably, this isn’t anything that couldn’t be done before with some careful manipulation of the GeForce control panel’s custom resolution option, but with the monitor overclocking option exposed in PrecisionX and other utilities, monitor overclocking has been reduced to a simple slider rather than a complex mix of timings and pixel counts.

Though this feature can technically work with any monitor, it’s primarily geared towards monitors such as the various Korean LG-based 2560x1440 monitors that have hit the market in the past year, a number of which have come with electronics capable of operating far in excess of the 60Hz that is standard for those monitors. On the models that can handle it, modders have been able to push some of these 2560x1440 monitors to 120Hz and beyond, essentially doubling their native 60Hz refresh rate and greatly improving smoothness to levels similar to a native 120Hz TN panel, but without the resolution and quality drawbacks inherent to those TN products.
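Under the hood, monitor overclocking amounts to raising the pixel clock: total horizontal pixels times total lines times the refresh rate. The sketch below uses illustrative blanking figures to show the arithmetic; real overclockable monitors typically rely on reduced-blanking timings whose exact values differ per panel.

```python
def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                    h_blank: int = 160, v_blank: int = 30) -> float:
    """Pixel clock = total horizontal pixels * total lines * refresh rate."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6
```

With these assumed blanking values, 2560x1440 at 60Hz needs roughly a 240MHz pixel clock, and doubling the refresh rate to 120Hz doubles the required clock to roughly 480MHz – which is what the monitor's driving electronics must tolerate for the overclock to hold.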

Of course it goes without saying that, just like any other form of overclocking, monitor overclocking can be dangerous and risks breaking the monitor. On that note, out of our monitor collection we were able to get our Samsung 305T up to 75Hz, though whether due to the panel or the driving electronics, the higher refresh rate didn’t seem to have any impact on performance, smoothness, or responsiveness. This is truly a “your mileage may vary” situation.



Comments

  • mrdude - Tuesday, February 19, 2013 - link

    I doubt it, given the transistor count and die size. This thing isn't exactly svelte, with 7.1 billion transistors. The viable-chips-per-wafer must be quite low, hence the price tag.

    What I don't understand is why people would buy a $1000 GPU for compute. I can understand why somebody buys a ~$300 GPU to add a little extra horsepower to their small selection of applications, but if you're paying $1000 for a GPU then you're also expecting a decent set of drivers as well. But both AMD and nVidia have purposely neutered their consumer cards' performance for most professional tasks and applications. As a result, you can buy a cheaper FirePro or Quadro with professional drivers based on a smaller die/GPU (like a 7850 or 660Ti) that will outperform this $1000 single GPU card in a variety of software.

    If I'm paying upwards of $1000 for a GPU, it sure as hell has to work. Buying a consumer grade GPU and relying on consumer (gaming) drivers just means that you'll almost never hit anywhere near the max theoretical throughput of the card. In essence, you're paying for performance which you'll never get anywhere close to.

    This is a perfect card for the fools who overspend on their gaming GPUs. For everyone else it's just a high-priced bore.
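The chips-per-wafer point above can be roughly sanity-checked with the standard dies-per-wafer approximation. GK110's commonly cited die size is ~551mm²; the 30% yield figure below is purely an assumption for illustration.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard approximation: wafer area / die area, minus edge losses."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

candidates = dies_per_wafer(551)   # gross die candidates per 300mm wafer
good = int(candidates * 0.3)       # assumed 30% yield for such a large die
```

Even before yield, a ~551mm² die fits fewer than 100 candidates on a 300mm wafer, which goes some way toward explaining the pricing.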
  • CeriseCogburn - Sunday, February 24, 2013 - link

    All those fools, we have been told over and over, and in fact very recently by the site's own, are here!

    That's what this is for, dimwit. Not for crybaby losers who can barely scrape up an HD 5750.

    Let's face it, every one of you whining jerks is drooling uncontrollably for this flagship, and if you're just a loser with a 450W power supply, no worries, they're being sold in high priced systems with that.

    You'd take one in a minute, happily, and max out your games and your 1920x1080 monitor in MOST games.

    I mean I have no idea what kind of poor all you crybabies are. I guess you're all living in some 3rd world mudhole.
  • madmilk - Thursday, February 21, 2013 - link

    They're clearly not in any kind of hurry, given how well Tesla is selling at 3 times the price. These are probably just the rejects, set to a higher voltage and TDP and sold to the consumer market.
  • mrdude - Thursday, February 21, 2013 - link

    Oh yea, nVidia is never going to jeopardize the cash cow that is the Tesla for the HPC crowd, or Quadro for the professional market. The margins there aren't worth giving up in order to bring GPU compute (and its drivers) to the mass market.

    This notion that this is a GPGPU card is silly, frankly. We can throw around the max theoretical GFLOPs/TFLOPs figures all we please, the reality is that you'll never see anywhere close to those in professional applications. There are two reasons for that: Tesla and Quadro.
  • chizow - Tuesday, February 19, 2013 - link

    Yeah, totally agree with the post title, Nvidia has lost their fking minds.

    And PS: The X-Men *STILL* want their logo back.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This isn't G80 Kansas anymore, Dorothy.

    Do any of you people live in the USA?

    I mean really, how frikkin poor are all you crybabies, and how do you even afford any gaming system or any games?

    Are you all running low end C2Ds still, no SSDs, and 1280x1024? Do you live in a box?

    How can you be in the USA and whine about this price on the very top end product for your Lifetime Hobby?

    What is wrong with you, is the question.
  • Pariah - Tuesday, February 19, 2013 - link

    In most cases, this card won't make sense. There are at least a couple of scenarios where it might, though. One, in an ultra high-end gaming system. That means multiple Titan cards. Because these are single GPU cards, an SLI Titan setup should scale much better than an SLI 690 setup with 4 GPUs would, and triple SLI Titans further that point.

    Secondly, this card is smaller and uses less power than a 690, which means you can use it in much smaller cases, even some mini-ITX cases. That would be one helluva nice portable LAN box.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This card makes sense for anyone running a mid sandy bridge and 1920x1080 monitor.
    After I complained about the 1920x1200 reviews here, pointing out nVidia is 12% BETTER compared to amd at the former resolution, 50 raging amd fanboys screeched that they have a 1920x1200 monitor they run all the time and that they were more than willing to pop the extra $150 for it over the 1920x1080...

    So we can safely assume MOST of the people here have a 1920x1080, for pete's sake.
    A low end sandy is $50 to $80, same for a board, and DDR3 is the cheapest RAM.
    So for less than $200 at most to prepare (use your old case+PS), near everyone here is ready to run this card, and would find benefit from doing so.

    Now lying about that just because they don't plan on buying one is what most here seem to want to do.

  • Deo Domuique - Friday, March 8, 2013 - link

    This card should cost ~$600-650. Not a single cent more. The rest is an Apple-style markup for the mindless consumer. Unfortunately, there are a lot of them.
  • trajan2448 - Tuesday, February 19, 2013 - link

    Obviously a great piece of technology. Interested to see what the overclockers can achieve.
    If it were $700 it would make a lot more sense. Nonetheless, it'll be fun to see some fanatics do a tri-SLI overclock and blow up their monitors.
