Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically, applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, with games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows the previous frame and the bottom portion shows the next frame (or frames, in some cases).
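
To make the timing concrete, here's a minimal sketch of when a finished frame actually reaches the screen under a fixed 60Hz refresh with VSYNC versus an adaptive refresh (purely illustrative; the 60Hz and 144Hz figures and the function names are my own assumptions, not anything shipped by NVIDIA or AMD):

    import math

    FIXED_REFRESH_MS = 1000 / 60    # fixed 60Hz panel: one refresh every ~16.7ms
    ADAPTIVE_MIN_MS = 1000 / 144    # assume the adaptive panel tops out at 144Hz

    def fixed_vsync_display_time(render_done_ms):
        # With VSYNC on a fixed-rate panel, a finished frame waits for the next refresh tick.
        return math.ceil(render_done_ms / FIXED_REFRESH_MS) * FIXED_REFRESH_MS

    def adaptive_display_time(render_done_ms):
        # With adaptive refresh, the panel redraws as soon as the frame is ready,
        # limited only by its maximum refresh rate.
        return max(render_done_ms, ADAPTIVE_MIN_MS)

    # A frame that takes 20ms (50 FPS) misses the 16.7ms window:
    print(fixed_vsync_display_time(20))  # ~33.3ms: held for an extra refresh (stutter)
    print(adaptive_display_time(20))     # 20.0ms: shown the moment it's done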

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we've seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher-spec displays like the 1440p144 ASUS ROG Swift cost $759, compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500, whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time to get from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology as DisplayPort Adaptive-Sync, an optional part of the DisplayPort 1.2a specification, and AMD collects no royalties on it. That's the "Free" part of FreeSync, and while it doesn't guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a "normal" 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • marraco - Thursday, March 19, 2015 - link

    I've only ever owned NVIDIA GPUs (not due to fanboyism, but the coincidence of GeForces being the sweet spot each time I needed a new card).

    Still, I will not pay for G-SYNC. I don't want to be tied to a company.

    I also can't buy a FreeSync display, because it's not supported by NVIDIA.

    Also, hardware supported features tend to turn obsolete at a faster rate than software ones, so I do not trust G-Sync.
  • Murloc - Thursday, March 19, 2015 - link

    same here, I can just wait a year or so before upgrading monitor and GPU, their loss. If in the meantime AMD comes up with something competitive (i.e. also not an oven, please), they win.
  • Norseman4 - Friday, March 20, 2015 - link

    But you can buy an Adaptive Sync monitor and use it with any GPU. You won't get the benefits of FreeSync without AMD, but that is all.
  • Tikcus9666 - Thursday, March 19, 2015 - link

    I ain't overly worried; tearing does not bother me. I can't say I really notice it when playing, but then I am only playing at 1080p with a Radeon 280.
  • steve4king - Thursday, March 19, 2015 - link

    Hats off to Nvidia for delivering G-Sync and getting the ball rolling on this thing. They were the first to create a solution for a very real problem.

    Because of NVidia's pioneering, and because NVidia won't license the technology to AMD, AMD had to find their own solution in re-purposing an existing DP1.2a feature to provide the same function.

    It makes sense for NVidia to refuse to support adaptive refresh, until these displays become commonplace. They had the only card and the only display module that could do this, and they needed to sell as many as they could before the competition's technology was viable.

    Soon NVidia needs to reverse that decision, because I'm not going to buy an inferior monitor, just so that I can slap "The Way It's Meant to Be Played" on the side of my computer.

    I fully expect that both will come together on this one. NVidia had a good run with G-Sync. But now it needs to jump on the bandwagon or risk losing out on GPU sales.
  • PPalmgren - Friday, March 20, 2015 - link

    Unfortunately, I doubt it. While they are great first movers, look at their track record of good tech that could be great tech with industry-wide adoption via less proprietary measures: PhysX, CUDA, 3D Surround, Gsync, etc. They also have a poor history of working with more open platforms like Linux. "Our way or the highway" is the vibe I get.
  • Soulwager - Thursday, March 19, 2015 - link

    What about actually testing the fallback cases, where framerate is outside the monitor's range of refresh rates? We need an input lag comparison when both monitors are maxed out in v-sync mode, and a gpu utilization comparison when framerates dip below the monitor's minimum refresh rate.
  • ncsaephanh - Thursday, March 19, 2015 - link

    Finally, some competition up in here.
  • czesiu - Thursday, March 19, 2015 - link

    "One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, as you go beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. "

    Does a 144Hz monitor help when the FPS is ~40?
  • JarredWalton - Thursday, March 19, 2015 - link

    Sure. At ~40 FPS a 144Hz monitor redraws each frame 3-4 times between content updates, so even if half of the frame showed tearing on the first pass it gets cleaned up on the second and third passes. And with VSYNC enabled, you can fall back to 72Hz, 48Hz, and 36Hz before you're down around 29Hz – see the quick sketch below.
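
    A quick sketch of that divisor math (illustrative only, assuming a 144Hz panel):

        panel_hz = 144
        for divisor in range(1, 6):
            # with VSYNC, each frame is held on screen for a whole number of refreshes,
            # so the effective frame rate is the refresh rate divided by an integer
            print(f"{divisor} refresh(es) per frame -> {panel_hz / divisor:.1f} FPS")
        # prints 144.0, 72.0, 48.0, 36.0, 28.8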
