G-SYNC Comes to the Notebook

G-SYNC is not a new technology, but the implementation on a notebook is somewhat different from the desktop version that Anand reviewed back in 2013. On the desktop, due to the sheer number of possible combinations of hardware and displays, NVIDIA required a G-SYNC module in the display itself which would be profiled for the specific panel inside the monitor. At the time there was no official method for adaptive refresh rates in the desktop DisplayPort specification, so NVIDIA essentially had to engineer their own vertically integrated solution.

However, in a roundabout manner, that changed in 2014 when VESA added Adaptive-Sync to the DisplayPort 1.2a standard. Competitor AMD demonstrated that variable refresh rates could be implemented by manipulating the variable VBLANK capabilities first created for embedded DisplayPort (eDP) years earlier, and those capabilities were subsequently adopted into the desktop DisplayPort standard under the name Adaptive-Sync.
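To illustrate the basic idea, here is a simplified Python sketch of the decision a variable refresh display controller has to make; the logic and the refresh bounds below are hypothetical and do not come from NVIDIA, AMD, or the DisplayPort specification.

    # Simplified sketch of variable refresh via an extended VBLANK interval.
    # The refresh bounds are hypothetical panel limits for illustration only.

    MIN_REFRESH_HZ = 30   # hypothetical panel minimum refresh rate
    MAX_REFRESH_HZ = 75   # hypothetical panel maximum refresh rate

    MIN_FRAME_TIME = 1.0 / MAX_REFRESH_HZ  # cannot scan out faster than the panel allows
    MAX_FRAME_TIME = 1.0 / MIN_REFRESH_HZ  # cannot hold VBLANK longer than this

    def should_scan_out(time_since_last_scanout: float, frame_ready: bool) -> bool:
        """Return True when the controller should end VBLANK and refresh the panel."""
        if time_since_last_scanout >= MAX_FRAME_TIME:
            return True   # panel must refresh anyway; repeat the previous frame
        if frame_ready and time_since_last_scanout >= MIN_FRAME_TIME:
            return True   # a new frame is ready and the panel can accept it
        return False      # keep extending the blanking interval

Rather than refreshing on a fixed schedule, the display simply sits in VBLANK until the GPU delivers a completed frame, bounded by the panel's minimum and maximum refresh intervals.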

Because Adaptive-Sync is based on technology already found in eDP, laptops could be equipped with variable refresh technology as well, given the right GPU and scaler. The end result is that for Mobile G-SYNC, NVIDIA has been able to do away with the G-SYNC module entirely, leaning on the eDP standard to provide the basic variable refresh functionality G-SYNC needs.

With that said, not all of NVIDIA's G-SYNC quality requirements are covered by embedded DisplayPort, and as a result the company is still doing additional work to validate individual Mobile G-SYNC laptop models. As part of the Mobile G-SYNC program, NVIDIA will be doing the calibration and qualification of G-SYNC laptops (for a fee, of course) before they can be sold as official G-SYNC devices with the accompanying branding.

Our initial news coverage of G-SYNC in a mobile (laptop) form factor went over the other changes to G-SYNC, but they are worth revisiting here. One of the under-the-hood features of G-SYNC that has been around since the beginning, but not really discussed by NVIDIA, is variable overdrive. In a fixed refresh display panel, LCD vendors use overdrive in order to reach a desired color output more quickly. In simple terms, if a pixel is at color A and you want it to reach color B, you tell the panel to instead drive toward B plus some extra value; the pixel overshoots toward a point past B, but by the time the next display refresh comes around it has just reached B, where it is stopped.
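As a rough numerical illustration of fixed-refresh overdrive (a sketch only; real panels use factory-tuned per-panel lookup tables, and the single gain value here is purely hypothetical):

    # Hypothetical fixed-refresh overdrive: command the panel past the target so
    # the slow liquid crystal response lands on the target at the next refresh.

    OVERDRIVE_GAIN = 0.35  # hypothetical tuning constant for a specific panel

    def overdriven_target(current_level: int, target_level: int) -> int:
        """Return the level to command so the pixel settles on target_level in time."""
        overshoot = (target_level - current_level) * OVERDRIVE_GAIN
        # Clamp to the panel's valid drive range (8-bit levels assumed here).
        return max(0, min(255, round(target_level + overshoot)))

With these assumed numbers, a transition from level 64 to level 192 would be commanded as roughly level 237, and the pixel's sluggish response lands it on 192 just as the next fixed-interval refresh arrives.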

This is all well and good, but with G-SYNC the formula for determining what value to overdrive to gets a lot more complicated, since the refresh rate is changing. G-SYNC attempts to determine when the next frame will arrive and chooses the appropriate overdrive value to hit the correct color. There is no way to do this with 100% accuracy, but variable overdrive is a best-effort attempt that gets much closer to the correct value than ignoring overdrive altogether. And since a laptop is a fixed set of components, NVIDIA can qualify each device and create the correct settings for its panel.
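Conceptually (and again this is only a sketch; the prediction scheme and gain scaling here are assumptions for illustration, not NVIDIA's actual algorithm), variable overdrive extends the example above by scaling the overshoot based on how long the driver expects the current frame to stay on screen:

    # Hypothetical variable overdrive: the overshoot now depends on a predicted
    # frame time rather than a fixed refresh interval.

    BASE_FRAME_TIME = 1.0 / 75.0  # frame time the hypothetical panel was tuned for
    BASE_GAIN = 0.35              # hypothetical overdrive gain at that frame time

    def predict_frame_time(recent_frame_times: list[float]) -> float:
        """Guess how long the next frame will be displayed from recent history."""
        return sum(recent_frame_times) / len(recent_frame_times)

    def variable_overdriven_target(current_level: int, target_level: int,
                                   recent_frame_times: list[float]) -> int:
        """Scale the overshoot down when frames are expected to linger longer,
        since the pixel has more time to settle before the next refresh."""
        predicted = predict_frame_time(recent_frame_times)
        gain = BASE_GAIN * (BASE_FRAME_TIME / predicted)
        overshoot = (target_level - current_level) * gain
        return max(0, min(255, round(target_level + overshoot)))

If the prediction is wrong, the pixel will slightly overshoot or undershoot the target color, which is why this remains a best-effort approach rather than an exact one.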

I think that G-SYNC is a big win for the notebook space. The thermal limitations of notebooks have always meant that they will have less performance than an equivalent desktop part. NVIDIA has been closing the gap here in recent generations, but even on theoretical performance alone the GTX 980M delivers only about 75% of the performance of the desktop GTX 980. With less performance, frame rates are going to be lower than on a desktop and more likely to fall below the refresh rate of the display, and that is exactly when G-SYNC steps in, eliminating the stuttering that V-SYNC can cause in games and removing the tearing that comes with not running V-SYNC at all.

There is a pretty major downside to G-SYNC in the notebook space though, and that is the potential impact on battery life. NVIDIA's FAQ for G-SYNC states that "G-SYNC has no impact on notebook battery life," and while that is technically true, it is not the entire story. In order to implement G-SYNC, the NVIDIA GPU must be directly connected to the display panel over eDP, since variable refresh doesn't currently pass through integrated GPUs. That instantly precludes NVIDIA's Optimus technology, which disables the NVIDIA GPU when you are not doing 3D gaming in order to boost battery life. NVIDIA has been making efforts to reduce power consumption at idle on their latest GPUs, but they are still a long way from the power consumption of the integrated GPU in a Haswell, let alone Broadwell, processor.

So for now, G-SYNC laptops are going to be those targeted primarily at gaming. The odds of a 14-inch general use notebook with a GTX 965M having G-SYNC are low, since battery life is so important on those kinds of systems. Perhaps in the near future NVIDIA will work with Intel to implement G-SYNC alongside Optimus, but for now, and maybe forever, that combination is not available.

That is really too bad, because as we move down the range, devices with less powerful NVIDIA GPUs, like the ASUS Zenbook Pro UX501, would benefit even more than most gaming laptops, which generally have enough GPU power to keep frame rates at acceptable levels. But that does not take away from the technology in devices that can support it: G-SYNC really is one of those technologies that is eye-opening when gaming and makes the entire experience better.

Comments

  • Notmyusualid - Saturday, August 1, 2015 - link

    ^ Also this is true.

    Guys at work bemoaned my M3800 purchase back in December due to only having Haswell, and supposedly Broadwell was due out any day... But a machine was required immediately.

    Broadwell shipped 8 months later, and in hardly any numbers at that! I'm not ever buying a 'U' Intel CPU, so those don't count.

    Turns out my instinct (likely due to places like this), was right.
  • BMNify - Wednesday, July 29, 2015 - link

    Around the corner?? Skylake-H mobile will take at least 6 months before launching in a new product like this; people buy when they need to and don't wait half a year for updates.
  • shadowjk - Wednesday, July 29, 2015 - link

    Exactly. I'm still on 4700MQ + 780M.

    With "Skylake around the corner", nobody seems to want to sell even the few Broadwell-H laptops that allegedly exist, atleast in Europe. I'm guessing we'll be stuck on Haswell another 9-16 months..
  • boeush - Wednesday, July 29, 2015 - link

    My guess is that nobody wanted to invest in Broadwell-H inventory when Skylake is about to render it obsolete. OEMs are simply waiting for Skylake to refresh their models; they are skipping Broadwell pretty much just as Intel did.
  • Refuge - Thursday, July 30, 2015 - link

    Yes, this. No OEMs are paying Broadwell much mind. It was a flubbed launch when you look at its placement in the timeline; the timing was all wrong.

    Intel shouldn't have even bothered with it if they weren't going to push back Skylake, but oh well. At least they limited it to a very small launch.
  • douglord - Wednesday, July 29, 2015 - link

    Totally pointless system for anyone who's not an engineer or architect that has to take 3D drawings to customers.

    Why can't we get a 4-core 45W chip with Iris Pro in a 5 lb 15-inch laptop with all-day battery life (10 hours)?

    No we have to choose between a pointless 10lb gayming system with 1 hour battery life and 2 core ULP garbage in a 2lb folder.
  • BMNify - Wednesday, July 29, 2015 - link

    I doubt you even read the article's first paragraph before commenting here. You are blabbering about engineers and architects and ignoring the name of the product itself: "Republic of Gamers". Hope you got a clue, and it's gaming, not gayming. Hope I didn't hurt your gay feelings; homophobic people generally are just hiding their own insecurities and can be cured with proper help in coming out of the closet.
  • Notmyusualid - Saturday, August 1, 2015 - link

    doug sounds clueless to me.
  • benzosaurus - Wednesday, July 29, 2015 - link

    I believe the low-end Retina MacBook Pro is exactly the machine you're describing.
  • boeush - Wednesday, July 29, 2015 - link

    As an engineer (SW) who periodically deals with 3D graphics and modeling, I wouldn't bother with any 16:9 screen no matter the resolution. 16:10 is the minimum aspect ratio that is remotely acceptable for a workstation. 1080p is only suitable for movie watching, and is counter-productive for anything else.
