GPU Boost 2.0: Overclocking & Overclocking Your Monitor

The first half of the GPU Boost 2.0 story is of course the fact that NVIDIA is switching from power-based controls to temperature-based controls. However there is also a second story here, and that is the impact on overclocking.

With the GTX 680, overclocking capabilities were limited, particularly in comparison to the GeForce 500 series. The GTX 680 could have its power target raised (guaranteed “overclocking”), and further overclocking could be achieved through clock offsets. But perhaps most importantly, voltage control was forbidden, with NVIDIA going so far as to nix EVGA’s and MSI’s voltage-adjustable products after a short time on the market.

There are a number of reasons for this, and hopefully one day soon we’ll be able to get into NVIDIA’s Project Greenlight video card approval process in enough detail to explain it properly, but the primary concern was that without strict voltage limits some of the more excessive users might blow out their cards with voltages set too high. And while the responsibility for this ultimately falls to the user, and in some cases to the manufacturer of the card (depending on the warranty), it makes NVIDIA look bad regardless. The end result was that voltage control on the GTX 680 (and lower cards) was disabled for everyone, regardless of what a given card was capable of.

With Titan this has finally changed, at least to some degree. In short, NVIDIA is bringing back overvoltage control, albeit in a more limited fashion.

For Titan cards, partners will have the final say in whether they wish to allow overvolting or not. If they choose to allow it, they get to set a maximum voltage (Vmax) figure in their VBIOS. The user in turn is allowed to increase their voltage beyond NVIDIA’s default reliability voltage limit (Vrel) up to Vmax. As part of the process however users have to acknowledge that increasing their voltage beyond Vrel puts their card at risk and may reduce the lifetime of the card. Only once that’s acknowledged will users be able to increase their voltages beyond Vrel.
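To make the interaction between the two limits concrete, here is a minimal sketch (in Python, purely illustrative) of how a Vrel/Vmax clamp of this sort could behave. The voltage figures and function names are our own placeholders, not NVIDIA's actual VBIOS interface.

```python
# Hypothetical sketch of the Titan overvoltage clamp described above.
# Voltage figures are illustrative placeholders, not real VBIOS values.

VREL_MV = 1162   # NVIDIA's default reliability voltage limit (assumed value)
VMAX_MV = 1200   # partner-defined maximum set in the VBIOS (assumed value)

def clamp_requested_voltage(requested_mv: int, user_acknowledged_risk: bool) -> int:
    """Return the voltage the card would actually be allowed to run at.

    Requests above Vrel are only honored once the user has acknowledged
    the reduced-lifetime warning, and never beyond the partner's Vmax.
    """
    if requested_mv <= VREL_MV:
        return requested_mv
    if not user_acknowledged_risk:
        # Without the acknowledgement, the card stays at or below Vrel.
        return VREL_MV
    return min(requested_mv, VMAX_MV)

if __name__ == "__main__":
    print(clamp_requested_voltage(1250, user_acknowledged_risk=False))  # 1162
    print(clamp_requested_voltage(1250, user_acknowledged_risk=True))   # 1200
```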

With that in mind, beyond overvolting, overclocking has also changed in some subtler ways. Memory and core offsets are still in place, but with the switch from power-based monitoring to temperature-based monitoring, the power target slider has been augmented with a separate temperature target slider.

The power target slider is still responsible for controlling the TDP as before, but with the ability to prioritize temperatures over power consumption it appears to be somewhat redundant (or at least unnecessary) for more significant overclocking. That leaves us with the temperature slider, which is really a control for two functions.

First and foremost, of course, the temperature slider controls the target temperature for Titan. By default this is 80C, and it may be turned all the way up to 95C. The higher the temperature target, the more frequently Titan can reach its highest boost bins, in essence making this a weaker form of overclocking, just as the power target adjustment was on the GTX 680.
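As a rough illustration of this first function, the sketch below shows how a temperature target could gate access to the highest boost bins. The clock ladder and thresholds are invented for the example and do not reflect NVIDIA's actual boost algorithm.

```python
# Minimal sketch: a higher temperature target lets the card sit in higher
# boost bins for longer. Clock bins and thresholds are illustrative only.

BOOST_BINS_MHZ = [837, 876, 915, 954, 993]  # hypothetical base-to-boost ladder

def select_boost_bin(current_temp_c: float, temp_target_c: float) -> int:
    """Pick a boost bin: back off roughly one bin per 2C over the target."""
    if current_temp_c <= temp_target_c:
        return BOOST_BINS_MHZ[-1]                 # headroom left: top bin
    overshoot_bins = int((current_temp_c - temp_target_c) // 2) + 1
    index = max(0, len(BOOST_BINS_MHZ) - 1 - overshoot_bins)
    return BOOST_BINS_MHZ[index]

# Raising the target from 80C to 95C keeps the card in its top bin
# at temperatures that would otherwise force it down the ladder.
print(select_boost_bin(86, temp_target_c=80))  # lower bin (837)
print(select_boost_bin(86, temp_target_c=95))  # top bin (993)
```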

The second function controlled by the temperature slider is the fan curve, which for all practical purposes follows the temperature slider. With modern video cards ramping up their fan speeds rather quickly once they get into the 80C range, merely raising the temperature target alone wouldn’t be particularly desirable in most cases due to the extra noise it would generate, so NVIDIA tied the fan curve to the temperature slider. Doing so ensures that fan speeds stay relatively low until temperatures start exceeding the temperature target. This seems a bit counterintuitive at first, but put in perspective of the goal – higher temperatures without an increase in fan speed – it starts to make sense.
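The second function can be sketched the same way: anchor the fan curve to the temperature target, so the aggressive ramp moves with the slider instead of sitting at a fixed 80C. Again, the specific curve points below are assumptions for illustration only, not NVIDIA's actual fan curve.

```python
# Illustrative only: the fan curve is anchored to the temperature target,
# so raising the target also pushes the aggressive ramp to the right.

def fan_speed_pct(current_temp_c: float, temp_target_c: float) -> float:
    """Piecewise-linear fan curve anchored to the temperature target."""
    idle_floor = 30.0                        # assumed minimum fan speed (%)
    ramp_start = temp_target_c - 10          # begin ramping 10C below target
    if current_temp_c <= ramp_start:
        return idle_floor
    if current_temp_c >= temp_target_c + 5:  # well past target: full speed
        return 100.0
    # Linear ramp from the floor to 100% across the 15C window.
    span = (temp_target_c + 5) - ramp_start
    return idle_floor + (current_temp_c - ramp_start) / span * (100.0 - idle_floor)

# With an 80C target the fan is already ramping hard at 82C;
# with a 95C target it is still sitting at its floor at the same temperature.
print(round(fan_speed_pct(82, temp_target_c=80), 1))  # ~86.0
print(round(fan_speed_pct(82, temp_target_c=95), 1))  # 30.0
```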

Finally, in what can only be described as a love letter to the boys over at 120hz.net, NVIDIA is also introducing a simplified monitor overclocking option, which can be used to increase the refresh rate sent to a monitor in order to coerce it into operating at that higher refresh rate. Notably, this isn’t anything that couldn’t be done before with some careful manipulation of the GeForce control panel’s custom resolution option, but with the monitor overclocking option exposed in PrecisionX and other utilities, monitor overclocking has been reduced to a simple slider rather than a complex mix of timings and pixel counts.

Though this feature can technically work with any monitor, it’s primarily geared towards monitors such as the various Korean LG-based 2560x1440 monitors that have hit the market in the past year, a number of which have come with electronics capable of operating far in excess of the 60Hz that is standard for those monitors. On models that can handle it, modders have pushed some of these 2560x1440 monitors to 120Hz and beyond, doubling their native 60Hz refresh rate and greatly improving smoothness to levels similar to a native 120Hz TN panel, but without the resolution and quality drawbacks inherent to those TN products.
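To put some numbers on what a refresh-rate overclock demands of a monitor's electronics, the sketch below estimates the pixel clock required to drive 2560x1440 at various refresh rates, using rough reduced-blanking-style figures. The blanking totals are assumptions for the example; real monitor timings vary.

```python
# Rough estimate of the pixel clock a 2560x1440 refresh-rate overclock demands.
# Blanking totals are approximate reduced-blanking assumptions, not the exact
# timings any particular monitor uses.

H_ACTIVE, V_ACTIVE = 2560, 1440
H_BLANK, V_BLANK = 160, 41          # assumed horizontal/vertical blanking

def required_pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock = total pixels per frame x frames per second."""
    h_total = H_ACTIVE + H_BLANK
    v_total = V_ACTIVE + V_BLANK
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75, 120):
    print(f"{hz:>3} Hz -> ~{required_pixel_clock_mhz(hz):.0f} MHz pixel clock")
```

Doubling the refresh rate roughly doubles the pixel clock the monitor's electronics and link have to sustain, which goes some way to explaining why only some panels and driving boards can make the jump.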

Of course it goes without saying that just like any other form of overclocking, monitor overclocking can be dangerous and risks breaking the monitor. On that note, out of our monitor collection we were able to get our Samsung 305T up to 75Hz, though whether that limit comes from the panel or the driving electronics we can’t say; either way, the modest bump didn’t seem to have any impact on performance, smoothness, or responsiveness. This is truly a “your mileage may vary” situation.

Comments

  • WhoppingWallaby - Thursday, February 21, 2013 - link

    Dude, you have some gall calling another person a fanboy. We could all do without your ranting and raving, so go troll elsewhere or calm down a little.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Oh shut up yourself you radeon rager.

    You idiots think you have exclusive rights to spew your crap all over the place, and when ANYONE disagrees you have a ***** fit and demand they stop.

    How about all you whining critical diaper pooping fanatics stop instead ?
  • IanCutress - Tuesday, February 19, 2013 - link

    It's all about single card performance. Everything just works easier with a single card. Start putting SLI into the mix and you need to account for drivers, or when doing compute it requires a complete reworking of code. Not to mention the potentially lower power output and OC capabilities of Titan over a dual GPU card.

    At any given price point, getting two cards up to that cost will always be quicker than a single card in any scenario that can take advantage of them, if you're willing to put up with it. So yes, two GTX 680s, a 690, or a Titan is a valid question, and it's up to the user's preference which one to get.

    I need to double check my wallet, see if it hasn't imploded after hearing the price.
  • wreckeysroll - Tuesday, February 19, 2013 - link

    lost their minds?
    how about fell and cracked their head after losing it. Smoking too much of that good stuff down there in California.

    How stupid do they take us for. Way to thumb your customers in the eye nvidia. $1000 on a single gpu kit.

    Good laugh for the morning.
  • B3an - Tuesday, February 19, 2013 - link

    Use some ****ing common sense. You get what you pay for.

    6GB with a 384-bit memory bus, and a 551mm2 GPU. Obviously this won't be cheap and there's no way this could be sold for anywhere near the price of a 680 without losing tons of money.

    Nvidia already had this thing in supercomputers anyway so why not turn it into a consumer product? Some people WILL buy this. If you have the money, why not. At least NV are not sitting on their arses like AMD are with no new high-end GPUs this year. Even though I have AMD cards I'm very disappointed with AMD's crap lately as an enthusiast and someone who's just interested in GPU tech. First they literally give up on competitive performance CPUs and now it's looking like they're doing it with GPUs.
  • siliconfiber - Tuesday, February 19, 2013 - link

    Common sense is what you are missing.

    GTX 580, 480, 285 were all sold for much less than this card and were all used in HPC applications, had the same or much bigger die sizes, and the same or bigger bus. DDR memory is dirt cheap as well

    I have seen it all now. Largest rip-off in the history of video cards right here.
  • Genx87 - Tuesday, February 19, 2013 - link

    Oh look, I have never seen this argument before. Biggest rip-off in the history of video cards. Preceded only by every high end video card release since the introduction of high end discrete GPUs. And it will remain a ripoff until the next high end GPU is released, surpassing this card's ripoff factor.
  • Blibbax - Tuesday, February 19, 2013 - link

    It's not a rip off because you don't have to buy it. The 680 hasn't gotten any slower.

    Just like with cars and anything else, when you add 50% more performance to a high-end product, it's gunna be a lot more than 50% more expensive.
  • johnthacker - Tuesday, February 19, 2013 - link

    The largest rip-off in the history of video cards is some of the Quadro cards. This is extremely cheap for a card with such good FP64 performance.
  • TheJian - Wednesday, February 20, 2013 - link

    GTX 580 (40nm) was not in the same league as this and only had 3B transistors. Titan has 7.1B on 28nm. 512 CUDA cores compared to 2688? It came with 1.5GB memory too, this has 6GB. etc etc. The 580 did not run like a $2500 pro card at a $1500 discount either. Also a chip this complicated doesn't YIELD well. It's very expensive to toss out the bad ones.

    Do you know the difference between system memory and graphics memory (you said DDR)? They do not cost the same. You meant GDDR? Well this stuff is 4x as much, running at 6GHz not 4GHz.

    Ref clock is 876MHz but these guys got theirs to 1176MHz:
    http://www.guru3d.com/articles-pages/geforce_gtx_t...

    The card is a monster value vs. the $2500 K20. Engineering is not FREE. Ask AMD. They lost $1.18B last year selling crap at prices that would make you happy I guess. That's how you go out of business. Get it? They haven't made money in 10 years (lost $3-4B over that time as a whole). Think they should've charged more for their cards/chips the last ten years? I DO. If Titan is priced wrong, they will remain on the shelf. Correct? So if you're right they won't sell. These will be gone in a day, because there are probably enough people who would pay $1500 for them that they'll sell out quickly. You have to pay $2500 to get this on the pro side.
