FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Besides the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that Adaptive Sync didn't exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls the core features of the display like the OSD, but it's not as full-featured as a "normal" scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires DisplayPort, as Adaptive Sync doesn't work over DVI, HDMI, or VGA (D-SUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there's mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard.
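As a quick illustration of what a 1D color LUT actually does, here's a minimal sketch – the gamma curve, table size, and function names are illustrative assumptions on our part, not NVIDIA's actual calibration data. Each channel value is simply remapped through a table; "Warm/Cool/Movie" style presets would just be alternative tables selected from the OSD.

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Build a 256-entry lookup table from a simple power curve. A real scaler or
// G-SYNC module would store factory calibration data here instead.
std::array<std::uint8_t, 256> build_lut(double gamma) {
    std::array<std::uint8_t, 256> lut{};
    for (int i = 0; i < 256; ++i) {
        lut[i] = static_cast<std::uint8_t>(
            std::lround(255.0 * std::pow(i / 255.0, gamma)));
    }
    return lut;
}

int main() {
    // One fixed table is enough to target sRGB; multiple OSD presets would
    // simply select between several prebuilt tables.
    const auto lut = build_lut(1.0 / 2.2);
    std::printf("input 128 -> output %u\n", static_cast<unsigned>(lut[128]));
    return 0;
}
```

We'll look at the "Performance Penalty" aspect as well on the next page.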

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
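To summarize the two out-of-range behaviors, here's a hedged sketch of the decision as described above – the type and function names (RefreshRange, choose_present, etc.) are invented for illustration and this is obviously not actual driver code:

```cpp
enum class OutOfRangeBehavior { VSyncOn, VSyncOff };  // FreeSync exposes both

struct RefreshRange { double min_hz, max_hz; };       // e.g. {30.0, 144.0}

enum class PresentMode {
    AdaptiveImmediate,  // refresh the panel as soon as the frame is ready
    WaitForVBlank,      // VSYNC-on behavior: capped rate, possible stutter
    TearImmediately     // VSYNC-off behavior: lower latency, possible tearing
};

// Inside the dynamic range both technologies behave the same. Outside it,
// G-SYNC always acts like VSyncOn; FreeSync lets the user choose.
PresentMode choose_present(double frame_hz, RefreshRange range,
                           OutOfRangeBehavior freesync_choice) {
    if (frame_hz >= range.min_hz && frame_hz <= range.max_hz)
        return PresentMode::AdaptiveImmediate;
    return (freesync_choice == OutOfRangeBehavior::VSyncOn)
               ? PresentMode::WaitForVBlank
               : PresentMode::TearImmediately;
}
```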

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • Cerb - Saturday, March 21, 2015 - link

    If it's not working, this is just as wrong. Since it's fairly close, at 24, 25, or almost 30, you will see the tear line creeping up or down the image, if vsync isn't on. It's exceptionally obvious. Usually, you will just see skipped frames on Windows, since the compositor forces vsync for the desktop, and this is generally well-supported by any video player's playback mechanisms. The skipped frames become more noticeable as you watch, but aren't nearly as bad as tearing.
  • looncraz - Saturday, March 21, 2015 - link

    Tearing can happen anytime.

    I'm writing a compositing engine for HaikuOS and I would LOVE to be able to control the refresh timing! When a small update occurs and the frame buffer is ready, I'd swap it, trigger a monitor refresh, and then be on my way right away.

    As it stands, I have to either always be a frame behind, or try to guess how long compositing the frame buffer from the update stream will take before I know anything about what the update stream will be like, so I know when to wake up the composite engine control loop.

    That means even on normal day-to-day stuff – opening a menu, dragging icons, playing solitaire, browsing the web, etc. – FreeSync would be quite useful. As it stands, the best I can do is hope the frame is ready for the next interval, or wait until the next refresh is complete to swap frame buffers - which means the data on screen is always a frame out of date (or more).

    At 60Hz that is a fixed delay multiplier of 16.7ms, with a minimum multiplicand of 1. Going with higher refresh rates on the desktop is just wasteful (we don't really need 60, except for things to feel smooth due to the delay multiplier effect of the refresh rate).

    If I could use the whole range from 45Hz to 75Hz, our (virtual) multiplicand could be 0.75-1.33, instead of exactly 1 or 2. That makes a significant difference in jitter. (See the worked numbers after the comments.)

    Everything would be smoother - and we could drop down to a 45Hz refresh interval by default, saving energy in the process, instead of being stuck at a fixed cadence.
  • Cerb - Saturday, March 21, 2015 - link

    Wrong. It is generating the visuals, and doing so the exact same way as far as any of this technology is concerned, and screen tearing does happen, because refresh rates vary from our common ones.
  • soccerballtux - Friday, March 20, 2015 - link

    Considering the power saving impact it's had on the mobile sector (no sense rendering to pixels that haven't changed, just render to the ones that have), it most definitely would have a significant impact on the laptop market and would be a great 'green' tech in general.
  • erple2 - Friday, March 20, 2015 - link

    No value, except to the consumer that doesn't have to pay the (current) $160+ premium for G-SYNC. Now, if AMD had a gfx card competitor to the GTX 980, it'd be marvelous, and a no-brainer. Given that the cost is apparently minimal to implement, I don't see that as a problem. Even if you think it's not value added, panel manufacturers shoved the pointless 3D down everyone's throat, so clearly they're not averse to that behavior.
  • mdriftmeyer - Sunday, March 22, 2015 - link

    It has value for any animated sequence.
  • JonnyDough - Monday, March 23, 2015 - link

    Inside of gaming it has plenty of value - who even cares about the rest? Gaming was a $25.1 billion market in 2010 (ESA annual report). I'd take a billionth of that pie and go out for a nice meal wouldn't you?
  • dragonsqrrl - Thursday, March 19, 2015 - link

    ... No current or upcoming DP spec ...requires... adaptive sync. It's optional, not sure how else you could interpret that, especially when you take the comment I responded to into consideration.
  • eddman - Friday, March 20, 2015 - link

    Wait a minute; that only applies to monitors, right? It'd suck to buy a DP 1.2a/1.3 video card and find out that it cannot do adaptive-sync.
  • tobi1449 - Friday, March 20, 2015 - link

    Plus FreeSync != Adaptive Sync
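Editor's note: to put concrete numbers on looncraz's timing argument above, here's a small standalone calculation of the presentation intervals involved. The 45-75Hz window is the commenter's example; the interval ratio works out to roughly 0.80x-1.33x of a 60Hz slot.

```cpp
#include <cstdio>

int main() {
    const double fixed_hz = 60.0;
    const double fixed_ms = 1000.0 / fixed_hz;   // ~16.7ms per refresh slot
    const double fast_ms  = 1000.0 / 75.0;       // shortest adaptive interval
    const double slow_ms  = 1000.0 / 45.0;       // longest adaptive interval

    // At a fixed 60Hz, a frame that misses a slot waits a whole extra 16.7ms
    // (multiplicand 1 or 2). With a 45-75Hz window the compositor can present
    // anywhere in a continuous band instead.
    std::printf("fixed 60Hz slot: %.1f ms\n", fixed_ms);
    std::printf("adaptive window: %.1f-%.1f ms (%.2fx-%.2fx of a 60Hz slot)\n",
                fast_ms, slow_ms, fast_ms / fixed_ms, slow_ms / fixed_ms);
    return 0;
}
```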
