NVIDIA Launches Mobile G-Sync, Enables Windowed G-Sync, & More
by Brett Howse & Ryan Smith on May 31, 2015 6:01 PM EST
Posted in: Computex 2015
With Computex kicking off today, NVIDIA has a number of announcements hitting the wire at the same time. The biggest news of course is the launch of the GeForce GTX 980 Ti; however, the company is also making a number of G-Sync announcements today. These include the launch of Mobile G-Sync for laptops, Windowed G-Sync support for laptops and desktops, new G-Sync framerate control functionality, and a number of new G-Sync desktop monitors.
We'll kick things off with the biggest of the G-Sync announcements: Mobile G-Sync. After much speculation (and an early prototype leak), NVIDIA's G-Sync technology is now coming to notebooks, a very exciting development for notebook gamers.
Anand took a look at the original G-Sync back in 2013, and for those who need a refresher on the technology, that article is a great place to start. In short, G-Sync allows the display to refresh at a variable rate, staying in sync with the GPU's ability to push out frames rather than forcing everything to work at a single fixed rate dictated by the display.
From a technical/implementation perspective, because desktop systems can be hooked to any monitor, desktop G-Sync originally required that NVIDIA implement a separate module - the G-Sync module - to be put into the display and to serve as an enhanced scaler. For a desktop monitor this is not a big deal, particularly since it was outright needed in 2013 when G-Sync was first introduced. However with laptops come new challenges and new technologies, and that means a lot of the implementation underpinnings are changing with the announcement of Mobile G-Sync today.
With embedded DisplayPort (eDP) now a common fixture in high-end notebooks, NVIDIA is able to do away with the G-Sync module entirely and rely on the variable timing and panel self-refresh functionality built into current versions of eDP. eDP's variable timing functionality was of course the basis of desktop DisplayPort Adaptive-Sync (along with AMD's FreeSync implementation), and while the technology is a bit different in laptops, the end result is quite similar. Which is to say that NVIDIA will be able to drive variable refresh laptops entirely with standardized eDP features, and will not be relying on proprietary features or hardware as they do with desktop G-Sync.
Removing the G-Sync module offers a couple of implementation advantages. The first of these is power: even though the G-Sync module replaced a scaler, it was a large and relatively power-hungry device, which would make it a poor fit for laptops. The second advantage is that it allows G-Sync to be implemented against traditional, lower-cost laptop eDP scalers, which brings the price of the entire solution down. In fact, for these reasons I would not be surprised to eventually see NVIDIA release a G-Sync 2.0 for desktops using just DisplayPort Adaptive-Sync (for qualified monitors only, of course), though NVIDIA obviously isn't talking about such a thing at this time. Laptops, as compared to desktops, do have the advantage of being a known, fixed platform, so there would be a few more issues to work out to bring something like this to desktops.
Moving on, while the technical underpinnings have changed, what hasn't changed is how NVIDIA is approaching mobile G-Sync development. For laptops to be enabled for mobile G-Sync they must still undergo qualification from NVIDIA, and while NVIDIA doesn't release specific financial details, there is a fee for this process (and presumably per-unit royalties as well). Unfortunately NVIDIA also isn't commenting on what kind of price premium G-Sync enabled laptops will go for, though they tell us that they don't expect the premium to be dramatically different, if only because they think that all gaming laptops will want to have this feature.
As far as qualification goes, the process is designed to ensure a minimum level of overall quality in products that receive G-Sync branding, along with helping ODMs tune their notebooks for G-Sync. This process is something NVIDIA considers a trump card of sorts for the technology, and something they believe delivers a better overall experience. From what we're hearing on quality, it sounds like NVIDIA is going to put their foot down on low-quality panels, for example, so that the G-Sync brand and experience doesn't get attached to subpar laptops. Meanwhile the tuning process is similar to the one on the desktop, with laptops and their respective components going through a profiling and optimization process to determine their refresh properties and pixel response times in order to set G-Sync timings and variable overdrive.
On that note (and on a slight tangent), after initially staying mum on the issue in the early days of G-Sync (presumably as a trade secret), NVIDIA is now confirming that all G-Sync implementations (desktop and mobile) include support for variable overdrive. As implied by the name, variable overdrive involves adjusting the amount of overdrive applied to a pixel in order to make overdrive more compatible with variable refresh timings.
As a quick refresher, the purpose of overdrive in an LCD is to decrease the pixel response time and resulting ghosting by overdriving pixels to get them to reach the desired color sooner. This is done by setting a pixel to a color intensity (voltage) above or below where you really want it to go, knowing that due to the response times of liquid crystals it will take more than 1 refresh interval for the pixel to reach that overdriven value. By driving a pixel harder and then stopping it on the next refresh, it's possible to reach a desired color sooner (or at least, something close to the desired color) than without overdrive.
Overdrive has been a part of LCD displays for many years now; however, the nature of overdrive has always implied a fixed refresh rate, as it's not possible to touch a pixel outside of a refresh window. This in turn leads to issues with variable refresh, as you don't know when the next refresh may happen. Ultimately there's no mathematically perfect solution here - you can't predict the future with 100% accuracy - so G-Sync variable overdrive is a best-effort attempt to predict when the next frame will arrive and to adjust the overdrive values accordingly. The net result is that in motion there will be a slight decrease in color accuracy versus a fixed refresh rate, due to errors in prediction, but it allows for an overall reduction in ghosting versus not running overdrive at all.
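To make the idea concrete, the prediction-plus-overdrive scheme can be sketched roughly as follows. This is purely illustrative - NVIDIA has not published its actual algorithm - and every function name and constant here is invented for the sake of the example:

```python
# Illustrative sketch of variable overdrive (NOT NVIDIA's actual algorithm):
# predict when the next frame will arrive, then scale the overdrive
# strength to match, easing off as the predicted interval grows.

def predict_next_interval(recent_intervals):
    """Guess the next frame interval (ms) from a short history of frame times."""
    # A simple moving average; a real implementation is presumably smarter.
    return sum(recent_intervals) / len(recent_intervals)

def overdrive_value(current, target, interval_ms, full_od_interval_ms=6.9):
    """Pick a drive level (0-255) that overshoots the target so the pixel
    lands on it by the predicted refresh. full_od_interval_ms is a made-up
    panel constant: the interval at which full overdrive is applied."""
    strength = min(1.0, full_od_interval_ms / interval_ms)
    drive = target + (target - current) * strength
    return max(0, min(255, drive))

history = [8.4, 9.1, 7.8]   # recent frame times in ms (~120 fps)
interval = predict_next_interval(history)
drive = overdrive_value(current=100, target=180, interval_ms=interval)
```

If the prediction is wrong - the next frame arrives earlier or later than expected - the pixel overshoots or undershoots the target color, which is exactly the slight color-accuracy error in motion described above.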
But getting back to the subject at hand of mobile G-Sync, this is a big win for notebooks for a couple of reasons. First, more notebooks are sold now than desktops, so this makes G-Sync available to a bigger audience. Of course not all of those devices have discrete GPUs, but NVIDIA has seen steady growth in the mobile GeForce segment in recent years, so the market is strong. The other reason this is important is that mobile products are much more thermally and space constrained, so mobile parts are always going to be slower than their desktop counterparts. That gap has narrowed with the latest Maxwell parts, but it is still there. G-Sync should help even more on mobile than it does on the desktop due to the lower overall framerates of laptop parts.
But there is a catch, and it’s a big one.
In order for G-Sync to be available on a laptop, a few things need to be true. First, the laptop must obviously have a GeForce GPU. Second, the laptop manufacturer needs to work with NVIDIA to enable this, since NVIDIA has to establish the parameters for the particular laptop panel in order to correctly know the maximum and minimum refresh rates, as well as the amount of over/under-drive necessary. But the third is the big one: the laptop display must be directly connected to the GeForce GPU.
What this means is that in order for G-Sync to be available, Optimus (NVIDIA's ability to switch between the integrated CPU graphics and the discrete NVIDIA graphics) will not be available. They are, at least for now, mutually exclusive. As a refresher, with Optimus the integrated GPU is the one actually connected to the display; when Optimus is enabled, the iGPU acts as an intermediary and serves as the display controller, with the discrete GPU feeding its frames through the iGPU and then to the display. Since G-Sync requires the GPU to be directly connected to the display, Optimus-enabled notebooks will not have G-Sync available.
Obviously this is a big concern, because Optimus is found on almost all notebooks with GeForce GPUs and has been one of the big drivers of reasonable battery life on gaming notebooks. Going forward, however, it is likely that true gaming notebooks will drop this support in order to offer G-Sync, while more versatile devices, which may use the GPU only once in a while or for compute purposes, will likely keep it. There is going to be a trade-off that the ODM needs to consider. I asked specifically about this, and NVIDIA feels that it is less of an issue than it was in the past because they have worked very hard on the idle power levels of Maxwell, but despite this there is likely going to be a hit to battery life. This is something we'd like to test, so hopefully we'll be able to properly quantify the tradeoff in the future.
As for release details, mobile G-Sync is going to be available starting in June with laptops from Gigabyte's Aorus line, MSI, ASUS, and Clevo. Expect more soon, though, since this should be a killer feature on less powerful laptops in particular.
Wrapping things up, as I mentioned before, mobile G-Sync seems like a good solution to the often lower capabilities of gaming laptops, and it should bring G-Sync to many more people since a dedicated G-Sync capable monitor is not required. It really is a shame that it does not work with Optimus, though, since that has become the standard on NVIDIA-based laptops. ODMs could use a hardware multiplexer to get around this - the solution used prior to Optimus - but given the added cost and complexity, my guess is that this will not show up on many, if any, laptops that want to leverage G-Sync.
Windowed Mode G-Sync
The second major G-Sync announcement coming from NVIDIA today is that G-Sync is receiving windowed mode support, with that functionality being rolled into NVIDIA's latest drivers. Until now, running a game in windowed mode could cause stutters and tearing, because in windowed mode the image being output is composited by the Desktop Window Manager (DWM) in Windows. Even though a game might be outputting 200 frames per second, DWM will only refresh the image on its own timings; an application's off-screen buffer can be updated many times before DWM updates the actual image on the display.
NVIDIA will now change this using their display driver, and when Windowed G-Sync is enabled, whichever window is the current active window will be the one that determines the refresh rate. That means if you have a game open, G-Sync can be leveraged to reduce screen tearing and stuttering, but if you then click on your email application, the refresh rate will switch back to whatever rate that application is using. Since this is not always going to be a perfect solution - without a fixed refresh rate, it's impossible to make every application perfectly line up with every other application - Windowed G-Sync can be enabled or disabled on a per-application basis, or just globally turned on or off.
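A toy simulation makes the fixed-rate compositing problem clear: the framerates and the sampling model below are simplifications for illustration, not a description of DWM's internals:

```python
# Toy illustration of fixed-rate compositing: a game producing frames at
# 200 fps is sampled by a compositor refreshing at a fixed 60 Hz, so most
# rendered frames are never shown on screen.

GAME_FPS = 200
COMPOSITOR_HZ = 60
DURATION_S = 1.0

# Completion times (in seconds) of each game frame and each compositor refresh.
game_frame_times = [i / GAME_FPS for i in range(int(GAME_FPS * DURATION_S))]
refresh_times = [i / COMPOSITOR_HZ for i in range(int(COMPOSITOR_HZ * DURATION_S))]

displayed = set()
for t in refresh_times:
    # At each refresh the compositor shows the most recently completed frame.
    latest = max(f for f in game_frame_times if f <= t)
    displayed.add(latest)

dropped = len(game_frame_times) - len(displayed)
print(f"{len(displayed)} frames shown, {dropped} frames never displayed")
```

In this simplified model, at most one game frame per refresh interval ever reaches the screen; the other 140 updates per second are overwritten in the off-screen buffer, which is why letting the active window drive the refresh rate instead is such a meaningful change.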
Meanwhile SLI users will be happy to know that Windowed G-Sync works there as well. However there will be a slight catch: for the moment it works for 2-way SLI, but not 3-way or 4-way SLI.
Finally, NVIDIA is also noting at this time that Windowed G-Sync is primarily for gaming applications, so movie viewers looking to get perfect timing in their windowed media players will be out of luck for the moment. The issue here isn't actually with Windowed G-Sync, but rather that current media players do not know about variable refresh technology and will always attempt to run at the desktop refresh rate. Once media players become Windowed G-Sync aware, it should be possible to have G-Sync work with media playback as well.
G-Sync Max Refresh Rate Framerate Control (AKA G-Sync V-Sync)
Third up on NVIDIA’s list of G-Sync announcements is support for controlling the behavior of G-Sync when framerates reach or exceed the refresh rate limit of a monitor. Previously, NVIDIA would cap the framerate at the refresh rate, essentially turning on v-sync at very high framerates. With their latest update, however, NVIDIA is delegating that option to the user, allowing users to enable or disable the framerate cap as they please.
The tradeoff here is that capping the framerate ensures that no tearing occurs, since there are only as many frames as there are refresh intervals, but it also introduces some input lag if frames are held back to be displayed rather than displayed immediately. NVIDIA previously opted for a tear-free experience, but will now let the user pick between tear-free operation and reducing input lag to the bare minimum. This is one area where NVIDIA’s G-Sync and AMD’s FreeSync implementations have significantly differed – AMD was the first to allow the user to control this – so NVIDIA is going for feature parity with AMD in this case.
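The latency cost of the cap can be sketched with a little arithmetic. This is a simplified model for illustration only, not NVIDIA's driver logic:

```python
# Sketch of the capped-vs-uncapped tradeoff at the top of the refresh range.
# Capped (v-sync-like): a finished frame waits for the next refresh boundary,
# which is tear-free but adds latency. Uncapped: it is scanned out
# immediately, minimizing input lag at the risk of tearing.

import math

REFRESH_HZ = 144
REFRESH_INTERVAL = 1.0 / REFRESH_HZ      # ~6.94 ms per refresh at 144 Hz

def present_time(frame_done, capped):
    """Return when a frame finished at time frame_done reaches the screen."""
    if not capped:
        return frame_done                # scan out now; tearing possible
    # Hold the frame until the next refresh boundary.
    return math.ceil(frame_done / REFRESH_INTERVAL) * REFRESH_INTERVAL

frame_done = 0.010                       # a frame finishes 10 ms in
lag_capped = present_time(frame_done, capped=True) - frame_done
lag_uncapped = present_time(frame_done, capped=False) - frame_done
```

In this model the uncapped path adds zero latency while the capped path adds up to one full refresh interval per frame, which is exactly the input-lag-versus-tearing choice NVIDIA is now exposing to the user.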
New G-Sync Monitors
Last but certainly not least from today’s G-Sync announcements, NVIDIA is announcing that their partners Acer and Asus are preparing several new G-Sync monitors for release this year. Most notably, both will be releasing 34” 3440x1440 ultra-wide monitors. Both displays are IPS based, with the Asus model topping out at 60Hz while the Acer model tops out at 75Hz. Meanwhile Acer will be releasing a second, 35” ultra-wide based on a VA panel and operating at a resolution of 2560x1080.
Asus and Acer will also be releasing some additional traditional format monitors at 4K and 1440p. This includes some new 27”/28” 4K IPS monitors and a 27” 1440p IPS monitor that runs at 144Hz. All of these monitors are scheduled for release this year, however as they’re third party products NVIDIA is unable to give us a precise ETA. They’re hoping for a faster turnaround time than the first generation of G-Sync monitors, though how much faster remains to be seen.
chizow - Sunday, May 31, 2015
Very interesting developments regarding Nvidia's G-Sync 2.0 that does away with the module in laptops. However, I wonder if that means the solution will be subject to the same limitations AMD sees regarding low-FPS ranges, where there is no lookaside/local buffer to handle low FPS by repeating frames.
These are all things Tom Petersen said before though, regarding eDP and Variable refresh rate, he didn't think an amendment to the DP standard was needed as everything was already there. It just looks like Nvidia has figured out the nuances of variable OverDrive before AMD.
It also opens up the possibility Nvidia offers a G-Sync Lite on the desktop for monitors that don't need the G-Sync module, but again, we'll need to see if that also means you forego the benefits of G-Sync that we see over FreeSync at the low refresh range of the spectrum.
Flunk - Sunday, May 31, 2015
This is basically the same as FreeSync: Adaptive Sync support on the display side and driver support for Adaptive Sync. I predict that Nvidia and AMD will keep using different brand names, but the technology is obviously on a collision course. All we need is for Intel to support it too, and 99% of the GPU market would support Adaptive Sync.
chizow - Monday, June 1, 2015
Yes and no. eDP was always specified by both Nvidia and AMD as the precursor to Adaptive Sync, but it didn't require new scalers specifically because the GPU is in direct communication with the display using embedded DP protocols. That isn't the case with desktop GPUs, which is why AMD had to create the DP Adaptive Sync protocol that piggy-backs on Vblank signaling. The ten-ton elephant in the room, however, is whether there are provisions in the desktop Adaptive Sync spec to control overdrive, as it is handled this way on the laptop display side, and how Nvidia handles it using their custom G-Sync scaler.
Beyond that, I imagine there will be logo program limitations even if Nvidia does adopt their own desktop G-Sync module-less Adaptive Sync monitor solution. While AMD has repeatedly claimed there is no licensing fee for FreeSync and that it is royalty free, it is trademarked and displays must pass AMD's logo program qualifications. I am not sure how open they would be to allowing a co-branded display that supported both FreeSync and G-Sync (Lite).
testbug00 - Monday, June 1, 2015
Just make a display that supports A-sync. Given what AMD says about FreeSync being only a brand, and that Nvidia's version would be the same, it should work fine with both.
Of course, no one seems to want to launch a monitor with just A-sync branding. And NVidia is even better at controlling its brands than AMD. And they do love their proprietary stuff, so I suspect that sadly we'll end up having a divided market until Intel wakes up and starts pushing A-sync on everything. At which point, neither AMD nor NVidia will have a choice but to make sure their drivers work fine without their special branding.
As for controlling the brand more... I imagine AMD would be fine with a monitor also having some NVidia branding. I cannot imagine Nvidia doing the same, due to much better brand control. Something AMD really needs to figure out. Of course, I'm convinced that AMD's "marketing" team is an unpaid intern who works an hour or two a week. At least in the NA/Europe markets.
testbug00 - Monday, June 1, 2015
The benefits, I would argue, aren't the big thing. The bigger thing is that monitors using A-sync scalers appear to only be going down to ~40fps instead of ~30fps.
I think that will matter more to people buying such monitors than how it handles below that threshold. Mostly because if you're buying a monitor to ensure smooth framerates, wouldn't you make sure your game doesn't dip below the refresh rate unless it is impossible to do? And I would say a lot of games dip into the low-mid 30s, but not often much lower.
testbug00 - Monday, June 1, 2015
I mean the benefits below the minimum refresh rate when I said "the benefits" at the start.
chizow - Monday, June 1, 2015
Well, there are 2 main benefits below the minimum window. 1) You still get VRR. 2) You don't get the noticeable ghosting that comes from OD being broken when the monitor refreshes the last frame to prevent pixel decay. Nvidia has made it clear there still isn't a min refresh rate, and from what they've said about it at PCPer, it seems that's because the GPU is directly controlling all aspects of the display over eDP, i.e. the display has all the timing/OD/refresh characteristics exposed to the GPU directly.
testbug00 - Monday, June 1, 2015
PCPer was a guess? Read the TechReport article where they, you know, ask an NVidia engineer. Basically, below the monitor's minimum refresh rate, G-sync "guesses" when the next frames will come and tries to line them up to avoid having to redraw mid-way through a frame.
And, if G-sync works like PCPer said, it would be a terrible latency-inducing mess at low framerates.
chizow - Sunday, May 31, 2015
Oh man, saw that last slide too, PG279Q Swift2 IPS 1440p just as I expected... need more info on that lol.
Laststop311 - Monday, June 1, 2015
Yeah, that monitor is going to be the number 1 gaming monitor. It's also going to be close to 1000 dollars, I guarantee it.