Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.
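To make the "too low or too high" hurdle concrete, here's a rough sketch in Python of how a variable refresh scheme might clamp its timing to a panel's supported range. This is purely illustrative – the 40-144Hz window is a made-up example, and it is not how NVIDIA or AMD actually implement things.

# Illustrative sketch only -- not NVIDIA's or AMD's actual logic. One way a
# variable refresh scheme could handle frame rates outside the panel's range.
# The 40-144Hz window below is a hypothetical example.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def refresh_interval_ms(frame_time_ms: float) -> float:
    """Pick how long to wait before scanning out the next frame."""
    min_interval = 1000.0 / PANEL_MAX_HZ   # fastest the panel can refresh (~6.9ms)
    max_interval = 1000.0 / PANEL_MIN_HZ   # longest the panel can hold an image (25ms)

    if frame_time_ms < min_interval:
        # Frame rate above the panel's ceiling: hold the finished frame until
        # the panel is ready again (behaves like vsync at the max refresh rate).
        return min_interval
    if frame_time_ms > max_interval:
        # Frame rate below the floor: re-scan the previous frame so the panel
        # never goes longer than it can tolerate without a refresh.
        return max_interval
    # In range: refresh the moment the new frame is ready.
    return frame_time_ms

for ft in (5.0, 12.0, 30.0):   # ~200 FPS, ~83 FPS, ~33 FPS
    print(f"frame time {ft:5.1f}ms -> refresh after {refresh_interval_ms(ft):5.1f}ms")

The in-range case is the interesting one: the display simply waits for the GPU instead of the other way around.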

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, when it comes to games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows the previous frame and the bottom portion shows the next frame (or frames, in some cases).
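To put some made-up numbers on that trade-off, the short Python sketch below simulates variable frame times against a fixed 60Hz refresh: with vsync enabled, any frame slower than 16.7ms gets held for the next refresh (a stutter), while with vsync disabled the buffer flip almost never lines up with the scan-out (a tear). The frame times and the 0.5ms tear threshold are hypothetical.

# Made-up numbers, purely to illustrate the trade-off described above.
import random

REFRESH_MS = 1000.0 / 60.0               # fixed 60Hz scan-out interval (~16.7ms)
random.seed(0)
frame_times = [random.uniform(10.0, 25.0) for _ in range(1000)]  # hypothetical GPU frame times

# Vsync on: a frame is only shown on a refresh boundary, so any frame slower
# than 16.7ms misses its slot and the previous image is repeated -- a stutter.
stutters = sum(1 for ft in frame_times if ft > REFRESH_MS)

# Vsync off: the buffer is flipped the instant the frame is ready. That flip
# almost never lines up with a refresh boundary, so one scan-out shows parts
# of two different frames -- a tear. Treat anything more than 0.5ms away from
# a boundary as a visible tear.
clock, tears = 0.0, 0
for ft in frame_times:
    clock += ft
    offset = clock % REFRESH_MS
    if min(offset, REFRESH_MS - offset) > 0.5:
        tears += 1

print(f"vsync on : {stutters} of {len(frame_times)} frames arrive late (stutter)")
print(f"vsync off: {tears} of {len(frame_times)} flips land mid-scan (tearing)")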

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took some time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time to go from the initial announcement to actual shipping hardware, but AMD has worked with VESA to implement the underlying technology as an open standard – DisplayPort Adaptive-Sync, now part of DisplayPort 1.2a – and they aren't collecting any royalties. That's the “Free” part of FreeSync, and while it doesn't necessarily guarantee that FreeSync-enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come in the form of higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • willis936 - Thursday, March 19, 2015 - link

    I would like an actual look at the added input latency from these adaptive sync implementations. Nobody has even mentioned it, but there's a very real possibility that either the graphics TX or the monitor's scaler has to do enough thinking to cause a significant delay from when pixels come in to when they're displayed on the screen. Why isn't the first thing to be scrutinized the very issue these technologies seek to solve?
  • mutantmagnet - Thursday, March 19, 2015 - link

    Acer already posted the MSRP

    http://us.acer.com/ac/en/US/content/model/UM.HB0AA...

    $800
  • mutantmagnet - Thursday, March 19, 2015 - link

    I forgot to mention it's already on sale in Europe.
  • JarredWalton - Thursday, March 19, 2015 - link

    Google was failing me last night, though granted I haven't slept much in the past two days.
  • ezridah - Thursday, March 19, 2015 - link

    It's odd that on their product page they don't mention G-Sync or the refresh rate anywhere... It's like they don't want to sell it or something.
  • eanazag - Thursday, March 19, 2015 - link

    My monitors last longer than 5 years. Basically I keep them till they die. I have a 19" 1280x1024 on the shared home computer that I'm considering replacing. I'd be leaning towards either no adaptive sync or a FreeSync monitor.

    I currently am sporting AMD GPUs, but I am one of those who go back and forth between vendors and I don't think it is as small a minority as was assumed. I bought two R9 290's when AMD last February. If I was buying right now, I'd be getting a GTX 970. I do like the GeForce Experience software. I'm still considering a GTX 750 Ti.

    I'm not totally sold on what AMD has in the market at the moment. I have a lot of heat concerns using them in CrossFire, and the wattage is higher than I like. The original 290 blowers sucked. I'd like quality blower cards again, like NVIDIA's.
  • Dorek - Thursday, March 19, 2015 - link

    Wait, you didn't just say that you use two R9 290s on a 1280x1024 monitor, right?
  • medi03 - Thursday, March 19, 2015 - link

    I don't get how the 970 is better than the 290X. It is slower and more expensive:
    http://www.anandtech.com/show/8568/the-geforce-gtx...

    And total system power consumption is lower by about 20-25% (305W on the 970 vs. 365W on the 290X). No big deal.
  • JarredWalton - Thursday, March 19, 2015 - link

    It's not "better" but it is roughly equivalent. I've got benchmarks from over 20 games. The average for the 290X at 2560x1440 "Ultra" across those games is 57.4 FPS, while the average for the 970 is 56.8 FPS. Your link to Crysis: Warhead is one title where AMD wins, but I could counter with GRID 2/Autosport and Lords of the Fallen where NVIDIA wins. And of the two GPUs, the 970 will overclock more than the 290X if you want to do that.
  • TallestJon96 - Thursday, March 19, 2015 - link

    I'm an NVIDIA user, but I'm happy to see the proprietary G-SYNC get beat down. I've got a 1080p144 non-G-SYNC panel, so I won't be upgrading for 3-5 years, and hopefully 4K and FreeSync will both be standard by then.
