G-SYNC Comes to the Notebook

G-SYNC is not a new technology, but the implementation on a notebook is somewhat different from the desktop version that Anand reviewed back in 2013. On the desktop, due to the sheer number of possible combinations of hardware and displays, NVIDIA required a G-SYNC module in the display itself, profiled for the specific panel inside the monitor. At the time, the desktop DisplayPort specification had no official method for adaptive refresh rates, so NVIDIA essentially had to engineer their own vertical solution.

That changed, in a somewhat roundabout manner, in 2014 when VESA added Adaptive-Sync to the DisplayPort 1.2a standard. Competitor AMD demonstrated that variable refresh technology could be implemented by manipulating the variable VBLANK capabilities first created for embedded DisplayPort (eDP) years earlier, and those capabilities were subsequently adopted into the desktop DisplayPort standard under the name Adaptive-Sync.

Because Adaptive-Sync is based on technology already found in eDP, laptops could be equipped with variable refresh technology as well, given the right GPU and scaler. For Mobile G-SYNC, NVIDIA has therefore been able to do away with the G-SYNC module entirely: all of the basic variable refresh functionality G-SYNC needs is already baked into the eDP standard.

With that said, not all of NVIDIA's G-SYNC quality requirements are covered by embedded DisplayPort, so the company still does additional work to validate individual Mobile G-SYNC laptop models. As part of the Mobile G-SYNC program, NVIDIA handles the calibration and qualification of G-SYNC laptops (for a fee, of course) in order for them to carry official G-SYNC branding.

Our initial news coverage of G-SYNC in a mobile (laptop) form factor covered the other changes to G-SYNC, but they are worth going over here as well. One of the under-the-hood features of G-SYNC that has been around since the beginning, but not really discussed by NVIDIA, is variable overdrive. On a fixed refresh rate panel, LCD vendors use overdrive to reach a desired color output more quickly. In a simple sense, if the panel is at color A and needs to reach color B, it is instead told to move toward B plus some extra overshoot; the pixel would eventually swing past B, but by the time the next display refresh comes around it has just reached B, where it is stopped.
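The fixed-refresh overdrive idea can be sketched in a few lines. This is an illustrative model only; the gain value and the 8-bit clamping range are assumptions for the sketch, not any panel vendor's actual tuning:

```python
def overdrive_target(current, desired, gain=0.4):
    """Fixed-refresh overdrive: command a value past the desired level
    so the slow LCD transition lands on target by the next refresh.
    'gain' is a hypothetical per-panel tuning factor."""
    overshoot = (desired - current) * gain
    # Clamp to the panel's valid drive range (8-bit values assumed here).
    return max(0, min(255, desired + overshoot))
```

Moving from gray level 100 up to 200, the panel would be driven toward 240; moving back down, toward 60. In a real panel the gain is measured per transition and stored in a lookup table rather than being a single constant.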

This is all well and good, but with G-SYNC the formula for determining the overdrive value gets a lot more complicated, since the refresh rate is constantly changing. G-SYNC attempts to predict when the next frame will arrive and chooses the appropriate overdrive value to hit the correct color. There is no way to do this with 100% accuracy, but variable overdrive is a best-effort attempt that lands much closer to the correct value than ignoring overdrive altogether. And since a laptop is a fixed set of components, NVIDIA can qualify each device and create the correct settings for each panel.
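A best-effort version of that prediction can be sketched as follows. The class, its eight-frame averaging window, and all constants here are hypothetical illustrations of the approach, not NVIDIA's actual algorithm:

```python
from collections import deque

class VariableOverdrive:
    """Estimate when the next frame will arrive from recent frame
    intervals, then scale the overdrive overshoot to suit that window."""

    def __init__(self, base_interval_ms=16.7, gain=0.4):
        self.base_interval_ms = base_interval_ms  # interval the gain was tuned for
        self.gain = gain
        self.history = deque(maxlen=8)  # recent frame-to-frame intervals

    def record_frame(self, interval_ms):
        self.history.append(interval_ms)

    def predicted_interval(self):
        # Simple average of recent intervals; fall back to the base rate.
        if not self.history:
            return self.base_interval_ms
        return sum(self.history) / len(self.history)

    def target(self, current, desired):
        # A longer predicted interval leaves more time for the pixel to
        # settle, so less overshoot is needed (and vice versa).
        scale = self.base_interval_ms / self.predicted_interval()
        overshoot = (desired - current) * self.gain * scale
        return max(0, min(255, round(desired + overshoot)))
```

With no history it behaves like the fixed 60 Hz case; after a run of ~30 fps frames the predicted interval doubles and the overshoot is halved, since the pixel has twice as long to settle.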

I think that G-SYNC is a big win for the notebook space. The thermal limitations of notebooks have always ensured that they offer less performance than an equivalent desktop part. NVIDIA has been closing the gap in recent generations, but even on theoretical performance the GTX 980M delivers only about 75% of the performance of the desktop GTX 980. With less performance, frame rates are going to be lower than on a desktop, and thus more likely to fall below the refresh rate of the display. That is exactly when G-SYNC steps in, stopping the stuttering that V-SYNC can cause in games and removing the tearing that comes with not running V-SYNC at all.
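The stutter-versus-smoothness difference can be made concrete with a toy model of display timing. This is a simplified sketch assuming a 60 Hz panel; real behavior also depends on the panel's minimum supported refresh rate:

```python
import math

REFRESH_MS = 16.7  # one refresh period on a 60 Hz panel

def vsync_interval(render_ms):
    """V-SYNC: a frame that misses a refresh waits for the next one, so
    display time quantizes to whole refresh periods. A 20 ms frame is
    held for two refreshes (33.4 ms) -- the source of visible stutter."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def gsync_interval(render_ms):
    """G-SYNC: the panel refreshes when the frame is ready (within its
    supported range), so display time simply tracks render time."""
    return max(render_ms, REFRESH_MS)
```

A game fluctuating between 45 and 60 fps alternates between one- and two-refresh frame times under V-SYNC, while with G-SYNC each frame is displayed as soon as it completes.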

There is a pretty major downside to G-SYNC in the notebook space though, and that is the potential impact on battery life. NVIDIA's FAQ for G-SYNC states that “G-SYNC has no impact on notebook battery life,” and while that is technically true, it is not the entire story. To implement G-SYNC, the NVIDIA GPU must be directly connected to the display panel over eDP - variable refresh doesn't currently translate through iGPUs - which instantly precludes NVIDIA's Optimus technology, which disables the NVIDIA GPU when not doing 3D gaming in order to boost battery life. NVIDIA has been making efforts to reduce power consumption at idle on their latest GPUs, but they are still a long way from the power consumption of the integrated GPU in a Haswell, let alone Broadwell, processor.

So for now, G-SYNC laptops are going to be machines aimed primarily at gaming. The odds of a 14-inch general use notebook with a GTX 965M having G-SYNC are low, since battery life is so important on those kinds of systems. Perhaps in the near future NVIDIA will work with Intel to implement G-SYNC alongside Optimus, but for now, and maybe forever, that combination is not available.

That is really too bad, because as we move down the range, devices with less powerful NVIDIA GPUs, like the ASUS Zenbook Pro UX501, would benefit even more than most gaming laptops, which generally have enough GPU power to keep frame rates at acceptable levels. But that does not take away from the technology in devices that can support it: G-SYNC really is one of those technologies that is eye-opening when gaming, and it makes the entire experience better.

52 Comments

  • meacupla - Wednesday, July 29, 2015 - link

    Correct me if I'm wrong, but didn't Optimus cause a ton of problems, to the point that people just wanted to disable it permanently?
  • Dribble - Wednesday, July 29, 2015 - link

    You're wrong, never had any problems with it and hardly read any complaints about it.
  • Gigaplex - Wednesday, July 29, 2015 - link

    I've had plenty of problems with it. Just because you haven't seen them doesn't mean they don't exist.
  • Refuge - Thursday, July 30, 2015 - link

    Never heard about it? Were you hiding under a rock during that fiasco? It was so bad that some review sites would mark a product down just for having Optimus enabled by default in the bios from the factory.
  • nerd1 - Friday, July 31, 2015 - link

    Optimus is terrible for everything except AAA gaming (big trouble with most online games, nightmare with linux, and so on), and does not make any sense for large caliber gaming rigs anyway. Basically you have to plug in otherwise battery won't last more than an hour.
  • DanNeely - Wednesday, July 29, 2015 - link

    There's a 1-2% framerate hit; and while that's a meaningless real world difference hysteria driven configuration has meant it's often not been installed in top of the line gaming laptops; and caused people to disable it in mid-range ones.
  • Samus - Wednesday, July 29, 2015 - link

    Optimus crashes pretty much every 3D modeling program I've ever tried on it, especially Solidworks.
  • Jorsher - Friday, July 31, 2015 - link

    I've never had a problem with it on my 2012 (perhaps older) Dell XPS with Intel and NVidia graphics. I'm glad to have it.
  • WorldWithoutMadness - Wednesday, July 29, 2015 - link

    I assume it has something to do with the G-sync.
    Maybe it is not compatible with optimus, switching intel hd to gtx, vice versa
  • Brett Howse - Wednesday, July 29, 2015 - link

    I think it has something to do with G-Sync, which is why I laid that out exactly on page 3 :)

    "In order to implement G-SYNC, the NVIDIA GPU must be directly connected to the display panel over eDP - since variable refresh doesn't currently translate through iGPUs - which means that it instantly precludes implementation of NVIDIA’s Optimus technology"
