Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before. Certainly there are hurdles to overcome – e.g. what to do when the frame rate is too low or too high, sourcing a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers – but it was still an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, with games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen shows the previous frame and the bottom part shows the next frame (or frames, in some cases).
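The timing difference described above can be sketched in a few lines of Python. This is purely a toy model with made-up frame times – not either vendor's implementation – but it shows how a fixed refresh delays late frames to the next scanout tick while adaptive refresh presents them immediately:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # one scanout every ~16.7 ms at 60 Hz

def fixed_refresh_display_times(frame_ready_ms):
    """With a fixed refresh (v-sync on), each frame waits for the next tick."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            for t in frame_ready_ms]

def adaptive_display_times(frame_ready_ms):
    """With adaptive refresh, the display scans out as soon as a frame is done."""
    return list(frame_ready_ms)

# Frames finishing at uneven intervals (made-up numbers): the frame ready at
# 35 ms just misses its tick and is held until 50 ms, landing on the same
# tick as the next frame – the source of the stutter described above.
ready = [15, 35, 50, 72, 90]
fixed = fixed_refresh_display_times(ready)
adaptive = adaptive_display_times(ready)
```

Updating the screen the instant a frame is ready (the `adaptive_display_times` path, without waiting for vertical blanking) is what produces tearing on a conventional display; adaptive sync instead moves the refresh itself to match the frame.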

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to implement FreeSync as an open standard that’s now part of DisplayPort 1.2a, and they aren’t collecting any royalties from the technology. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.

FreeSync Displays and Pricing
Comments

  • SleepModezZ - Thursday, March 19, 2015 - link

    Really different reviews between AnandTech and PC Perspective. You conclude that FreeSync performs as well as G-Sync - if not better, because of the option to disable V-sync. PC Perspective, on the other hand, noticed that their FreeSync monitors performed badly compared to the G-Sync monitors when the frame rate dropped below the lowest refresh rate of the monitor.

    You give the impression that they would behave the same – or FreeSync would be potentially better because you could choose your poison: stutter or tearing – when with G-Sync you would always get stuttering. PC Perspective, on the other hand, tells that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames – and as such G-Sync avoids both stutter and tearing at those lower frame rates. Their FreeSync monitors did not do that – and the stuttering or tearing was very noticeable. The frame rate dropping below 48 fps is not uncommon and the display's behavior in those situations is very important. That makes G-Sync the superior technology. Unless – the tearing or stuttering at speeds lower than the display's lowest refresh rate is only a problem with that specific monitor and not with the FreeSync / Adaptive-Sync technology in general. (The LG monitor is incapable of doubling its slowest refresh rate – other monitors that are capable maybe could handle the situation differently. If not, FreeSync is the inferior technology.)

    I don't know how G-Sync and FreeSync actually would handle full screen movies at 24 fps. G-Sync could easily display it at a 48 Hz refresh rate. Your LG monitor would probably also show it at 48 Hz – because it is the lowest it could go. But would the LG monitor with FreeSync be smart enough to show a 25 fps movie at 50 Hz – or would it display it at 48 Hz with unnecessary tearing or stuttering?
  • Gigaplex - Friday, March 20, 2015 - link

    "PC Perspective, on the other hand, tells that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames – and as such G-Sync avoids both stutter and tearing at those lower frame rates."

    That would drastically reduce the effects of tearing, but it would not do much, if anything, for stutter.
  • SleepModezZ - Friday, March 20, 2015 - link

    It would reduce stutter in the sense that if the frame rate were, for example, constantly 30 fps, G-Sync would give you every frame when it is ready – keeping the motion fluid. FreeSync with V-Sync on, on the other hand, would force that into the lowest refresh rate of the monitor. It would double some frames and not others – making the timing of the frames uneven and making a constant 30 fps motion jerky where G-Sync would not. I would call that jerky motion 'stutter' – FreeSync (currently) has it, G-Sync does not.

    In short, G-Sync retains its variable refresh rate technology when going under the display's min refresh rate. FreeSync does not, but switches to a constant refresh rate at the monitor's min refresh rate – introducing either tearing or stutter. Within the display's refresh rate range they perform the same. When going faster than the refresh rate range, FreeSync gives the option of disabling V-Sync and choosing tearing instead of stuttering. There it is better. I just think that the low fps range is probably more important than the high. I would not buy any FreeSync / Adaptive-Sync displays before they demonstrate that they can handle those low fps situations as gracefully as G-Sync does.
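The frame-multiplication behavior debated in this thread reduces to simple arithmetic. The sketch below is a toy model only – the 48–144 Hz range and the multiplier logic are illustrative assumptions, not AMD's or NVIDIA's actual algorithm: when the frame rate falls below the panel's minimum, each frame is scanned out enough times to land the refresh rate back inside the supported range.

```python
def lfc_refresh(fps, min_hz=48, max_hz=144):
    """Return (panel refresh rate, scanouts per frame) for a given frame rate.

    Toy model of low-framerate frame doubling: below the panel's minimum
    refresh, repeat each frame until the effective rate is in range.
    """
    if fps >= min_hz:
        return min(fps, max_hz), 1
    repeats = 2
    while fps * repeats < min_hz:
        repeats += 1
    return fps * repeats, repeats

print(lfc_refresh(30))  # 30 fps shown at 60 Hz, each frame scanned out twice
print(lfc_refresh(24))  # 24 fps film shown at 48 Hz
```

On this model a 25 fps video would land at 50 Hz (25 × 2), which is exactly the behavior asked about above for the LG monitor; the open question is whether a given FreeSync scaler actually does this.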
  • WatcherCK - Thursday, March 19, 2015 - link

    TFTcentral have done a review of the soon to be released Acer monitor:
    http://www.tftcentral.co.uk/reviews/acer_xb270hu.h...

    And as Ryan said it is a beast, but one question: if you buy an XB270HU and plug in your 290X, does the monitor fall back to a standard scaler to display video data, since the video card doesn't support G-SYNC? And if the Acer uses a scaler from one of the main manufacturers listed in the article, is there a chance it would support FreeSync? (Acer wouldn't advertise that, obviously, since the monitor is a G-SYNC branded monitor....)

    So there are a few assumptions above about the operation of G-SYNC, but I'm curious if this will be the case, as it keeps both the red and green camps happy...

    One other question, if anyone is happy to answer: will high refresh rate monitors maintain their peak refresh rate in portrait mode, or are they limited to a lower refresh rate (or G-SYNC, for that matter)? I'm thinking of a triple monitor portrait setup for my next build.

    cheers
  • sonicmerlin - Thursday, March 19, 2015 - link

    Will Freesync work with the current gen consoles?
  • SleepModezZ - Thursday, March 19, 2015 - link

    No.

    Adaptive-Sync is a DisplayPort-specific standard. What current gen console supports DisplayPort? None to my knowledge. HDMI is a different standard, and I don't think there have even been any rumors about putting adaptive sync technology into the HDMI standard. And if it some day did come, would the current HDMI hardware on the consoles be able to support it after a driver update from AMD? Probably not.
  • Murloc - Thursday, March 19, 2015 - link

    It's not likely to happen any time soon, since video sources, STBs, etc. revolve around the usual fixed frame rates and TVs do the same, so there's no need for this kind of flexibility – tearing is not an issue there.

    Too bad that TV standards like HDMI spill over in the computer world (audio, projectors, laptops, etc.) and hamstring progress.
  • sonicmerlin - Friday, March 20, 2015 - link

    Well what if MS and Sony released hardware refreshes (like a slimmed down PS4) that included display port?
  • Gigaplex - Friday, March 20, 2015 - link

    I'm pretty sure that both Xbox One and PS4 use GCN 1.0 hardware, so no, a DisplayPort refresh probably wouldn't help.
  • Norseman4 - Thursday, March 19, 2015 - link

    Can you please verify some information:

    On the specs page for the BenQ XL2730Z (http://gaming.benq.com/gaming-monitor/xl2730z/spec... it states a 54Hz min vertical refresh. This could be a copy/paste issue since it's the same as the min horizontal refresh.
