FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.
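
To put the core behavior in concrete terms, here is a minimal sketch (in Python, purely illustrative) of how an adaptive refresh display paces itself; the 40-144Hz window is an assumed panel range for the example, not a figure from AMD or NVIDIA.

```python
# Illustrative sketch of adaptive refresh pacing; the 40-144Hz window
# is an assumed panel range, not a spec value.
PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def refresh_interval_ms(frame_time_ms: float) -> float:
    """Refresh as soon as the new frame arrives, clamped to the panel's limits."""
    min_interval = 1000.0 / PANEL_MAX_HZ  # can't refresh faster than the panel allows
    max_interval = 1000.0 / PANEL_MIN_HZ  # must refresh before the pixel charge decays
    return min(max(frame_time_ms, min_interval), max_interval)

print(refresh_interval_ms(20.0))  # 20ms frame (50 FPS) -> 50Hz refresh, no waiting
print(refresh_interval_ms(5.0))   # 200 FPS frame held to the 144Hz ceiling (~6.9ms)
print(refresh_interval_ms(40.0))  # 25 FPS frame forced out at the 40Hz floor (25ms)
```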

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Besides the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, NVIDIA went with a proprietary module because Adaptive Sync didn't exist when they started working on G-SYNC, so they had to create their own protocol. Basically, the G-SYNC module controls all the regular core features of the display like the OSD, but it's not as full-featured as a "normal" scaler.

In contrast, Adaptive Sync (the DisplayPort 1.2a feature that AMD uses to enable FreeSync) is part of an open standard and will likely find its way into many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn't have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs, though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn't work with DVI, HDMI, or VGA (D-Sub). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there's mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't support multiple color options like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We'll look at the "Performance Penalty" aspect on the next page.
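
As a side note on the LUT point above, a 1D color LUT is just a per-channel remapping table. Here is a minimal sketch of the idea (Python, with an assumed 2.2 gamma and a 256-entry table; these are illustrative assumptions, not details of NVIDIA's implementation):

```python
# Minimal 1D color LUT sketch; the gamma value and table size are
# illustrative assumptions, not details of the G-SYNC module.
GAMMA = 2.2

# Precompute a 256-entry table mapping each input level to a corrected level.
lut = [round(255 * ((i / 255) ** (1 / GAMMA))) for i in range(256)]

def apply_lut(pixel):
    """Remap an (r, g, b) pixel through the table, one lookup per channel."""
    r, g, b = pixel
    return (lut[r], lut[g], lut[b])

print(apply_lut((128, 128, 128)))  # mid-gray lifted toward an sRGB-like response
```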

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible: the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when frame rates fall outside the supported range. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC, and I'd assume NVIDIA could rework the drivers to change the behavior if needed, but having the choice is never a bad thing.
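
To summarize the two out-of-range policies, here is a hedged sketch of the decision logic as described above; the function, threshold values, and mode names are hypothetical and not anything exposed by AMD's or NVIDIA's drivers:

```python
# Hypothetical sketch of out-of-range behavior; the thresholds and
# names are made up for illustration, not a real driver API.
RANGE_MIN_FPS, RANGE_MAX_FPS = 40, 144

def present_mode(fps: float, freesync: bool, user_vsync: bool) -> str:
    if RANGE_MIN_FPS <= fps <= RANGE_MAX_FPS:
        return "adaptive"  # both technologies: refresh when the frame is ready
    if freesync:
        # FreeSync lets the user pick the fallback outside the range:
        # VSYNC on (no tearing, possible stutter/lag) or
        # VSYNC off (tearing possible, lower input latency).
        return "vsync_on" if user_vsync else "vsync_off"
    return "vsync_on"  # G-SYNC always behaves as though VSYNC is on

print(present_mode(200, freesync=True, user_vsync=False))   # vsync_off
print(present_mode(200, freesync=False, user_vsync=False))  # vsync_on
```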

There's another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, it's unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it's clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync in Intel's processor graphics could certainly help. NVIDIA might partner with Intel to make G-SYNC work (though it's worth pointing out that the ASUS G751 doesn't support Optimus, so it's not a problem for that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There's no clear direction yet, but there's definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • eanazag - Thursday, March 19, 2015 - link

    The AMD and Nvidia haters all come out of the woodwork for these types of articles.

    Intel needs to chime in. I suspect they will go the FreeSync route since it is part of the spec and there are no costs.

    I understand Nvidia has some investment here. I fully expect them to support adaptive sync - at least within 5 years. They really need to do something about PhysX. As a customer I see it as irrelevant. I know it isn't their style to open up their tech.
  • eddman - Thursday, March 19, 2015 - link

    Not to go off-topic too much, but physx as a CPU physics engine, like havok, etc., is quite popular. There are hundreds of titles out there using it and more are coming.

    As for GPU physx, which is what you had in mind, yes, it'd never become widely adopted unless nvidia opens it up, and that would probably not happen, unless someone else comes up with another, open GPU accelerated physics engine.
  • mczak - Thursday, March 19, 2015 - link

    Minor nitpick, intel's solution won't be called FreeSync - this is reserved for AMD certified solutions. Pretty sure though it's going to be technically the same, just using the adaptive sync feature of DP 1.2a.
    (My guess would be that at some point in the future nvidia is going to follow suit, first with notebooks, because gsync modules are more or less impossible there, though even then it will initially be restricted to notebooks that drive the display from the nvidia gpu, which aren't many; everything else is going to require intel to support it first. I'm quite confident they are going to do this with desktop gpus too, though I would suspect they'd continue to call it GSync. Let's face it, requiring a specific nvidia gsync module in the monitor just isn't going to fly with anything but the high-end gaming market, whereas adaptive sync should trickle down to a lot more markets; thus imho there's no way nvidia's position on this doesn't have to change.)
  • anubis44 - Tuesday, March 24, 2015 - link

    @eanazag: nVidia will be supporting FreeSync about 20 minutes after the first hacked nVidia driver to support FreeSync makes it onto the web, whether they like it or not.
  • chizow - Tuesday, March 24, 2015 - link

    Cool, I welcome it, one less reason to buy anything AMD related.
  • chizow - Thursday, March 19, 2015 - link

    There's no need to be disappointed honestly, Jarred just copy/pasted half of AMD's slide deck and then posted a Newegg Review. Nothing wrong with that, Newegg Reviews have their place in the world; it's just unfortunate that people will take his conclusions and actually believe FreeSync and G-Sync are equivalents, when there are already clear indications this is not the case.

    - 40 to 48 minimums are simply unacceptably low thresholds before things start falling apart, especially given many of these panels are higher than 1080p. A 40FPS minimum at 4K, for example, is DAMN hard to accomplish; in fact, the recently launched Titan X can't even do it in most games. CrossFireX isn't going to be an option either until AMD fixes FreeSync + CF, if ever.

    - The tearing/ghosting/blurring issues at low frame rates are significant. AMD mentioned issues with pixel decay causing problems at low refresh rates, but honestly, this alone shows us G-Sync is worth the premium because it is simply better. http://www.pcper.com/files/imagecache/article_max_...
    Jarred has mused multiple times that these panels may use the same panel as the one in the Swift, so why are the FreeSync panels failing so badly at low refresh rates? Maybe that G-Sync module is actually doing something, like actively syncing with the monitor to force overdrive without breaking the kind of guesswork frame syncing FreeSync is using?

    - Input lag? We can show AMD's slide and take their word for it without even bothering to test? High speed camera, a USB input doubler attached to a mouse, scroll and see which one responds faster. FreeSync certainly seems to work within its supported frequency bands in preventing tearing, but that was only half of the problem related to Vsync on/off. The other trade-off with Vsync ON was how much input lag it introduced.

    - A better explanation of Vsync on/off and tearing? Is this something the driver handles automatically? Is Vsync being turned on and off by the driver dynamically, similar to Nvidia's Adaptive Vsync? When it is on, does it introduce input lag?

    In any case, AnandTech's Newegg Review of FreeSync is certainly a nice preview and proof of concept, but I wouldn't take it as more than that. I'd wait for actual reviews to cover the aspects of display technology that actually matter, like input lag, blurring, and image retention, which can only really be captured and quantified with equipment like high speed cameras and a sound testing methodology.
  • at80eighty - Thursday, March 19, 2015 - link

    Waaa
  • chizow - Thursday, March 19, 2015 - link

    Another disappointed AMD user I see, I agree, FreeSync certainly isn't as good as one might have hoped.
  • at80eighty - Friday, March 20, 2015 - link

    had more nvidia cards than amd; so keep trying.
  • chizow - Friday, March 20, 2015 - link

    Doubt it, but keep trying.
