FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost of the module itself, this means any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that Adaptive Sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module handles the core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (D-Sub). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer the multiple color presets – the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) isn't possible – the frame rate is simply capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when your frame rate is too high or too low. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
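To make the difference concrete, here is a minimal sketch of the decision a driver might make when the frame rate lands inside, above, or below the variable refresh window. This is purely illustrative logic, not how either vendor's driver is actually implemented; all names and numbers are made up.

```cpp
#include <cstdio>

// Illustrative only: how a driver might handle presentation relative to a
// display's variable refresh window (e.g. 40-144Hz). Names/values are made up.
enum class OutOfRangePolicy { VSyncOn, VSyncOff };  // FreeSync exposes this choice

struct PresentDecision {
    bool waitForRefresh;  // true = no tearing, at the cost of latency/stutter
    double effectiveHz;   // rate the display will actually refresh at
};

PresentDecision Present(double fps, double minHz, double maxHz, OutOfRangePolicy policy) {
    if (fps >= minHz && fps <= maxHz) {
        // Inside the window: both G-SYNC and FreeSync refresh the panel as
        // soon as the new frame arrives, so refresh rate tracks frame rate.
        return {true, fps};
    }
    if (fps > maxHz) {
        // Above the window: G-SYNC caps at maxHz (VSYNC-on behavior);
        // FreeSync with VSYNC off lets frames through, trading possible
        // tearing for lower input latency.
        return {policy == OutOfRangePolicy::VSyncOn, maxHz};
    }
    // Below the window: G-SYNC behaves like VSYNC on (possible stutter);
    // FreeSync honors the user's VSYNC on (stutter) or off (tearing) choice.
    return {policy == OutOfRangePolicy::VSyncOn, minHz};
}

int main() {
    PresentDecision d = Present(30.0, 40.0, 144.0, OutOfRangePolicy::VSyncOff);
    std::printf("wait=%d effectiveHz=%.0f\n", d.waitForRefresh, d.effectiveHz);
    return 0;
}
```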

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • barleyguy - Thursday, March 19, 2015 - link

    I was already shopping for a 21:9 monitor for my home office. I'm now planning to order a 29UM67 as soon as I see one in stock. The GPU in that machine is an R7/260X, which is on the compatible list. :-)
  • boozed - Thursday, March 19, 2015 - link

    "the proof is in the eating of the pudding"

    Thank you for getting this expression right!

    Oh, and Freesync looks cool too.
  • D. Lister - Thursday, March 19, 2015 - link

    I have had my reservations with claims made by AMD these days, and my opinion of 'FreeSync' wasn't quite in contrast. If this actually works at least just as well as G-Sync (as claimed by this rather brief review) with various hardware/software setups then it is indeed a praiseworthy development. I personally would certainly be glad that the rivalry of two tech giants resulted (even if only inadvertently) in something that benefits the consumer.
  • cmdrdredd - Thursday, March 19, 2015 - link

    I love the arguments about "freesync is an open standard" when it doesn't matter. 80% of the market is Nvidia and won't be using it. Intel is a non-issue because not many people are playing games that benefit from adaptive v-sync. Think about it, either way you're stuck. If you buy a GSync monitor now you likely will upgrade your GPU before the monitor goes out. So your options are only Nvidia. If you buy a freesync monitor your options are only AMD. So everyone arguing against gsync because you're stuck with Nvidia, have fun being stuck with AMD the other way around.

    Best to not even worry about either of these unless you absolutely do not see yourself changing GPU manufacturers for the life of the display.
  • barleyguy - Friday, March 20, 2015 - link

    NVidia is 71% of the AIB market, as of the latest released numbers from Hexus. That doesn't include AMD's APUs, which also support Freesync and are often used by "midrange" gamers.

    The relevance of being an open standard though, is that monitor manufacturers can add it with almost zero extra cost. If it's built into nearly every monitor in a couple of years, then NVidia might have a reason to start supporting it.
  • tsk2k - Thursday, March 19, 2015 - link

    @Jarred Walton
    You disappoint me.
    What you said about G-sync below minimum refresh rate is not correct, also there seems to be issues with ghosting on freesync. I encourage everyone to go to PCper(dot)com and read a much more in-depth article on the subject.
    Get rekt anandtech.
  • JarredWalton - Friday, March 20, 2015 - link

    If you're running a game and falling below the minimum refresh rate, you're using settings that are too demanding for your GPU. I've spent quite a few hours playing games on the LG 34UM67 today just to see if I could see/feel issues below 48 FPS. I can't say that I did, though I also wasn't running settings that dropped below 30 FPS. Maybe I'm just getting too old, but if the only way to quantify the difference is with expensive equipment, perhaps we're focusing too much on the theoretical rather than the practical.

    Now, there will undoubtedly be some that say they really see/feel the difference, and maybe they do. There will be plenty of others where it doesn't matter one way or the other. But if you've got an R9 290X and you're looking at the LG 34UM67, I see no reason not to go that route. Of course you need to be okay with a lower resolution and a more limited VRR range, and willing to go with a slower response time IPS (AHVA) panel rather than dealing with TN problems. Many people are.

    What's crazy to me is all the armchair experts reading our review and the PCPer review and somehow coming out with one or the other of us being "wrong". I had limited time with the FreeSync display, but even so there was nothing I encountered that caused me any serious concern. Are there cases where FreeSync doesn't work right? Yes. The same applies to G-SYNC. (For instance, at 31 FPS on a G-SYNC display, you won't get frame doubling but you will see some flicker in my experience. So that 30-40 FPS range is a problem for G-SYNC as well as FreeSync.)

    I guess it's all a matter of perspective. Is FreeSync identical to G-SYNC? No, and we shouldn't expect it to be. The question is how much the differences matter. Remember the anisotropic filtering wars of last decade where AMD and NVIDIA were making different optimizations? Some were perhaps provably better, but in practice most gamers didn't really care. It was all just used as flame bait and marketing fluff.

    I would agree that right now you can make the case that G-SYNC is provably better than FreeSync in some situations, but then both are provably better than static refresh rates. It's the edge cases where NVIDIA wins (specifically, when frame rates fall below the minimum VRR rate), but when that happens you're already "doing it wrong". Seriously, if I play a game and it starts to stutter, I drop the quality settings a notch. I would wager most gamers do the same. When we're running benchmarks and comparing performance, it's all well and good to say GPU 1 is better than GPU 2, but in practice people use settings that provide a good experience.

    Example:
    Assassin's Creed: Unity runs somewhat poorly on AMD GPUs. Running at Ultra settings or even Very High in my experience is asking for problems, no matter if you have a FreeSync display or not. Stick with High and you'll be a lot happier, and in the middle of a gaming session I doubt anyone will really care about the slight drop in visual fidelity. With an R9 290X running at 2560x1080 High, ACU typically runs at 50-75FPS on the LG 34UM67; with a GTX 970, it would run faster and be "better". But unless you have both GPUs and for some reason you like swapping between them, it's all academic: you'll find settings that work and play the game, or you'll switch to a different game.

    Bottom Line: AMD users can either go with FreeSync or not; they have no other choice. NVIDIA users likewise can go with G-SYNC or not. Both provide a smoother gaming experience than 60Hz displays, absolutely... but with a 120/144Hz panel only the high speed cameras and eagle eyed youth will really notice the difference. :-)
  • chizow - Friday, March 20, 2015 - link

    Haha love it, still feisty I see even in your "old age" there Jarred. I think all the armchair experts want is for you and AT to use your forum on the internet to actually do the kind of testing and comparisons that matter for the products being discussed, not just provide another Engadget-like experience of superficial touch-feely review, dismissing anything actually relevant to this tech and market as not being discernable to someone "your age".
  • JarredWalton - Friday, March 20, 2015 - link

    It's easy to point out flaws in testing; it's a lot harder to get the hardware necessary to properly test things like input latency. AnandTech doesn't have a central location, so I basically test with what I have. Things I don't have include gadgets to measure refresh rate in a reliable fashion, high speed cameras, etc. Another thing that was lacking: time. I received the display on March 17, in the afternoon; sometimes you just do what you can in the time you're given.

    You however are making blanket statements that are pro-NVIDIA/anti-AMD, just as you always do. The only person that takes your comments seriously is you, and perhaps other NVIDIA zealots. Mind you, I prefer my NVIDIA GPUs to my AMD GPUs for a variety of reasons, but I appreciate competition and in this case no one is going to convince me that the closed ecosystem of G-SYNC is the best way to do things long-term. Short-term it was the way to be first, but now there's an open DisplayPort standard (albeit an optional one) and NVIDIA really should do everyone a favor and show that they can support both.

    If NVIDIA feels G-SYNC is ultimately the best way to do things, fine -- support both and let the hardware enthusiasts decide which they actually want to use. With only seven G-SYNC displays there's not a lot of choice right now, and if most future DP1.2a and above displays use scalers that support Adaptive Sync it would be stupid not to at least have an alternate mode.

    But if the only real problem with FreeSync is when you fall below the minimum refresh rate you get judder/tearing, that's not a show stopper. As I said above, if that happens to me I'm already changing my settings. (I do the same with G-SYNC incidentally: my goal is 45+ FPS, as below 40 doesn't really feel smooth to me. YMMV.)
  • Soulwager - Saturday, March 21, 2015 - link

    You can test absolute input latency to sub millisecond precision with ~50 bucks worth of hobby electronics, free software, and some time to play with it. For example, an arduino micro, a photoresistor, a second resistor to make a divider, a breadboard, and a usb cable. Set the arduino up to emulate a mouse, and record the difference in timing between a mouse input and the corresponding change in light intensity. Let it log a couple minutes of press/release cycles, subtract 1ms of variance for USB polling, and there you go, full chain latency. If you have access to a CRT, you can get a precise baseline as well.

    As for sub-VRR behavior, if you leave v-sync on, does the framerate drop directly to 20fps, or is AMD using triple buffering?
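For the curious, the DIY measurement rig described above might look roughly like the sketch below, written as Arduino-flavored C++ for a Micro/Leonardo class board. The pin, threshold, and timing values are illustrative assumptions, and it presumes the on-screen test scene brightens in response to a click (e.g. a muzzle flash); the logged values would still need the ~1ms of USB polling variance subtracted afterwards.

```cpp
// Rough sketch of the rig described above: an Arduino Micro emulating a mouse,
// with a photoresistor/resistor divider on A0 taped to the screen. All pins,
// thresholds, and delays are placeholder values that would need tuning.
#include <Mouse.h>

const int SENSOR_PIN = A0;               // photoresistor divider output
const int LIGHT_THRESHOLD = 600;         // ADC counts; depends on panel and ambient light
const unsigned long TIMEOUT_US = 500000; // give up after 0.5s if no change is seen

void setup() {
  Serial.begin(115200);
  Mouse.begin();
}

void loop() {
  // Fire a click and time how long until the sensor sees the screen respond.
  unsigned long start = micros();
  Mouse.press(MOUSE_LEFT);

  unsigned long latency = 0;
  while (micros() - start < TIMEOUT_US) {
    if (analogRead(SENSOR_PIN) > LIGHT_THRESHOLD) {
      latency = micros() - start;
      break;
    }
  }
  Mouse.release(MOUSE_LEFT);

  // Log full-chain latency in microseconds; "timeout" means the screen never
  // crossed the brightness threshold within the window.
  if (latency > 0) {
    Serial.println(latency);
  } else {
    Serial.println("timeout");
  }
  delay(1000);  // let the scene settle before the next press/release cycle
}
```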
