GPU Boost 2.0: Temperature Based Boosting

With the Kepler family NVIDIA introduced their GPU Boost functionality. Present on the desktop GTX 660 and above, boost allows NVIDIA’s GPUs to turbo up to frequencies above their base clock so long as there is sufficient power headroom to operate at those higher clockspeeds and the voltages they require. Boost, like turbo and other implementations, is essentially a form of performance min-maxing, allowing GPUs to offer higher clockspeeds for lighter workloads while still staying within their absolute TDP limits.

The first iteration of GPU Boost was based almost entirely on power considerations. With the exception of an automatic 1 bin (13MHz) step down at high temperatures to compensate for increased power consumption, whether GPU Boost could boost and by how much depended on how much power headroom was available. So long as there was headroom, GPU Boost could boost up to its maximum boost bin and voltage.

For Titan, GPU Boost has undergone a small but important change with significant ramifications for how it works and how much it boosts by. With GPU Boost 2, NVIDIA has essentially moved from a power-based boost system to a temperature-based boost system. Or, more precisely, to a system that is predominantly temperature based but is still capable of taking power into account.

The greatest weakness of GPU Boost 1, as explained by NVIDIA, was that it made conservative assumptions about temperatures and the interplay between high temperatures and high voltages in order to keep from seriously impacting silicon longevity. The end result was that NVIDIA picked boost bin voltages based on worst case temperatures, which meant those conservative temperature assumptions translated into conservative voltages.

So how does a temperature based system fix this? By better mapping the relationship between voltage, temperature, and reliability, NVIDIA can allow for higher voltages – and hence higher clockspeeds – because it can finely control which boost bin is hit based on temperature. As temperatures ramp up, NVIDIA can ramp down the boost bins until an equilibrium is reached.
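As a rough sketch of the idea – not NVIDIA's actual control algorithm – the behavior can be pictured as stepping one boost bin at a time toward that equilibrium. The 13MHz bin size comes from the step mentioned earlier, while the temperature target and clock limits below are placeholder assumptions for illustration only.

```python
# Minimal sketch of a temperature-based boost controller. This is not NVIDIA's
# implementation; the bin size is the 13MHz step quoted above, while the
# temperature target and clock limits are illustrative assumptions.
BIN_MHZ = 13        # one boost bin
BASE_CLOCK = 837    # Titan base clock, in MHz
MAX_BOOST = 992     # highest boost bin observed on our sample, in MHz
TEMP_TARGET = 80    # hypothetical temperature target, in degrees C

def step_boost(temp_c: float, clock_mhz: int) -> int:
    """Move one boost bin up or down to hold the temperature target."""
    if temp_c > TEMP_TARGET:
        return max(clock_mhz - BIN_MHZ, BASE_CLOCK)  # too hot: drop a bin
    if temp_c < TEMP_TARGET:
        return min(clock_mhz + BIN_MHZ, MAX_BOOST)   # headroom: climb a bin
    return clock_mhz                                 # at equilibrium

# A cool GPU walks up the bins; as temperature reaches the target it settles.
clock = BASE_CLOCK
for temp_c in (60, 65, 70, 75, 79, 81, 80):
    clock = step_boost(temp_c, clock)
print(clock)  # settles a few bins above the base clock in this toy trace
```

The real algorithm also has to weigh voltage and power, which is where the rest of GPU Boost 2's behavior comes in.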

Of course total power consumption is still a concern here, though much less of one. NVIDIA is watching both temperature and power consumption, clamping down when either limit is hit. But because GPU Boost 2 does away with the concept of separate power targets – sticking solely with the TDP instead – there is quite a bit more room for boosting in the design of Titan, as it can keep on boosting right up until the point it hits the 250W TDP limit. Our Titan sample can boost its clockspeed by up to 19% (837MHz to 992MHz), whereas our GTX 680 sample could only boost by 10% (1006MHz to 1110MHz).
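For reference, those boost figures work out as simple percentages of each card's base clock:

```python
# Boost headroom quoted above, as a percentage of each card's base clock.
def boost_headroom_pct(base_mhz: int, max_boost_mhz: int) -> float:
    return (max_boost_mhz - base_mhz) / base_mhz * 100

print(f"Titan:   {boost_headroom_pct(837, 992):.1f}%")    # ~18.5%, i.e. ~19%
print(f"GTX 680: {boost_headroom_pct(1006, 1110):.1f}%")  # ~10.3%, i.e. ~10%
```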

Ultimately, however, whether GPU Boost 2 is power sensitive is a control panel setting, meaning that power sensitivity can be disabled. By default GPU Boost 2 will monitor both temperature and power, but 3rd party overclocking utilities such as EVGA Precision X can prioritize temperature over power, at which point GPU Boost 2 can ignore the TDP to a certain extent and focus on temperature instead. So if nothing else there's quite a bit more flexibility with GPU Boost 2 than there was with GPU Boost 1.
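Sketched out, the two limiting modes described above amount to a change in which limits gate further boosting. This is an illustration only; the function and parameter names are assumptions rather than anything exposed by NVIDIA's drivers or EVGA Precision X.

```python
# Illustration of the default (temperature + power) mode versus the
# temperature-prioritized mode. Names and thresholds are assumptions, not an
# actual NVIDIA or EVGA Precision X interface.
TDP_WATTS = 250       # Titan's TDP limit
TEMP_TARGET_C = 80    # hypothetical temperature target

def may_boost_further(power_w: float, temp_c: float,
                      prioritize_temperature: bool = False) -> bool:
    """Return True if another boost bin is allowed under the active limits."""
    under_temp = temp_c < TEMP_TARGET_C
    under_power = power_w < TDP_WATTS
    if prioritize_temperature:
        # Temperature-prioritized: the TDP can be exceeded to a certain
        # extent, so only the temperature target gates further boosting here.
        return under_temp
    # Default: both the temperature target and the TDP limit must hold.
    return under_temp and under_power

print(may_boost_further(245, 78))                               # True
print(may_boost_further(255, 78))                               # False
print(may_boost_further(255, 78, prioritize_temperature=True))  # True
```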

Unfortunately, because GPU Boost 2 is only implemented on Titan, it's hard to evaluate just how much "better" this is in any quantitative sense. We will be able to present specific Titan numbers on Thursday, but other than saying that our Titan maxed out at 992MHz at its highest boost bin of 1.162v, we can't directly compare it to how the GTX 680 handled things.

157 Comments

  • Olaf van der Spek - Tuesday, February 19, 2013 - link

    Who cares about your pair of cards? Nobody but you!
  • Iketh - Tuesday, February 19, 2013 - link

    lol hater!
  • CeriseCogburn - Sunday, February 24, 2013 - link

    As compared to the "crawl into the street and stone yourself" missive you threw in another post? LOL

    You won the hate war bub !

    I thought the gentleman owning the two 7950's made a very decent comment.
    Yes, it's shocking coming from someone with 2 AMD cards, but for once, it occurred.
  • chizow - Tuesday, February 19, 2013 - link

    This is much worse than the Ultra imo; at least in the case of the 8800GTX/Ultra, the performance somewhat justified the price relative to the rest of the market. We are somewhat spoiled by the bevy of card releases in recent years, but that's also the curse of the 680 and now Titan: the performance difference is nowhere close to the increase in price tag.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    You're NUTS to pretend you deserve 1 to 1 price to perf pricing on up the line, or that it is even a standard or usual expected measured outcome.

    What you do have is years now of idiot web articles and postings from insanely focused, minuscule-minded, Scrooge-like weirdos futzing around dicing up beans to fill web space. So now your brain is fried. FPS is all the drool cup can visibly contain.

    Congratulations on the complete brainwashing. When you screamed 100% in the prior threads, it was found to be 20%, 30%, etc. outlying 40%.

    Facts don't matter, all the bang for the buck historical fantasy BS in your gourd, does.
  • joqqy - Wednesday, February 20, 2013 - link

    I'll wait until price drops, quite content with what I have now.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Since you spent $550, you could spend a grand.
    I accept your notional decision, but it is not out of your price range; you are, after all, running 2x #2 flagships.

    In fact yours is the first and ONLY reasonable pricing comment (complaint) in the entire thread.

    Congratulations for that. Appreciate it. Happy gaming to you.
  • Deo Domuique - Friday, March 8, 2013 - link

    I strongly believe the best setup one could currently have is what you have...

    2x 7950 is the most bang for your buck! Two great cards at a great price... Although I'm no fan of Crossfire or SLI. Still, even a single 7950 holds the best spot in my mind.
  • sensiballfeel - Tuesday, February 19, 2013 - link

    $1000 for a card slower than a GTX 690?

    Running two 580s for years now; I skipped the 680 for being too slow, expecting something else in the pipeline. This is it, and Nvidia wants to double the price to $1000?

    Nvidia has lost their mind. Good card, but the price is beyond ridiculous. Nice try Nvidia, but no thanks.
  • Menty - Tuesday, February 19, 2013 - link

    Meh, it's a single card rather than an SLI-on-a-stick card. That makes it better, in my book.
