Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, in games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen shows the previous frame and the bottom part shows the next frame (or frames, in some cases).
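To make that trade-off concrete, here's a minimal sketch in Python – the frame times are made up for illustration, not measured data – of how a fixed 60Hz display turns any late frame into a repeat of the previous one, while an adaptive display simply refreshes when the frame arrives (within the panel's supported range):

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # a fixed 60Hz display refreshes every ~16.7ms
frame_times_ms = [14, 15, 20, 16, 25, 15]  # hypothetical GPU render times

def fixed_refresh(frame_times):
    """V-Sync on a fixed 60Hz panel: a frame that misses a refresh deadline
    waits for the next one, so the previous frame is shown twice (stutter)."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            for t in frame_times]

def adaptive_refresh(frame_times):
    """Adaptive refresh: the panel refreshes as soon as each frame is ready,
    assuming the rate stays inside the panel's supported range."""
    return list(frame_times)

# The 20ms and 25ms frames become 33.3ms on the fixed display (stutter),
# while the adaptive display shows every frame as soon as it's done.
print("fixed 60Hz:", [round(t, 1) for t in fixed_refresh(frame_times_ms)])
print("adaptive:  ", [round(t, 1) for t in adaptive_refresh(frame_times_ms)])
```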

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, G-SYNC took some time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we've seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher-spec displays like the 1440p144 ASUS ROG Swift cost $759, compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500, whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD worked with VESA to make the underlying technology, Adaptive Sync, an open standard that's now part of DisplayPort 1.2a, and they aren't collecting any royalties. That's the “Free” part of FreeSync, and while it doesn't necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • Oxford Guy - Friday, March 20, 2015 - link

    "Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded."

    If that's the case... I wonder why that is? Could it be the blithe acceptance of ridiculous cases of planned obsolescence like this?

    Manufacturers piddle out increments of tech constantly to try to keep a carrot on a stick in front of consumers. Just like with games and their DLC nonsense, the new mindset is replace, replace, replace... design the product so it can't be upgraded. Fill up the landfills.

    Sorry, but my $800 panel isn't going to just wear out or become obsolete in short order. People who spent even more are likely to say the same thing. And, again, many of these products are still available for purchase right now. The industry is already doing consumers a disservice by not having standards (the incompatible competing G-Sync and FreeSync), but it's far worse to tell people they need to replace otherwise perfectly satisfactory equipment for a minor feature improvement.

    You say it's not feasible to make monitors that can be upgraded in a relatively minor way like this. I say it is. It's not like we're talking about installing DisplayPort into a panel that didn't have it or something along those lines. It's time for the monitor industry to stop spewing out tiny incremental changes and expecting wholesale replacement.

    This sort of product and the mindset that accompanies it are optional, not mandatory. Once upon a time things were designed to be upgradable. I suppose the next thing you'll fully endorse is motherboards with the CPUs, RAM, and everything else soldered on (which Apple likes to do) to replace DIY computing... Why not? Think of how much less trouble it will be for everyone.
  • Oxford Guy - Friday, March 20, 2015 - link

    "it's probable that G1 *couldn't* be properly upgraded to support TRIM" "since you were working at Intel's Client SSD department...oh, wait, you weren't." So, I assume I should use the same retort on you with your "probable", eh?
  • Oxford Guy - Friday, March 20, 2015 - link

    The other thing you're missing is that Intel never told consumers that TRIM could not be added with a firmware patch. It never provided anyone with an actual concrete justification. It just did what is typical for these companies and for publications like yours: told people to buy the latest shiny to "upgrade".
  • Gunbuster - Thursday, March 19, 2015 - link

    So G-Sync has been available to purchase for, what, a year now? And AMD comes to the table with something exactly the same. How impressive.

    Oh, and the Crossfire driver gets the traditional "trust us, Coming Soon™" treatment.
  • chizow - Thursday, March 19, 2015 - link

    18 months later, and not exactly the same, still worse. But yes, we must give it to AMD, at least they brought something to the table this time.
  • Gigaplex - Friday, March 20, 2015 - link

    The troll is strong in this one. You keep repeating how this is technically worse than G-SYNC and have absolutely nothing to back it up. You claim forced V-SYNC is an issue with FreeSync, but it's the other way around – you can't turn V-SYNC off with G-SYNC, but you can with FreeSync. You don't address the fact that G-SYNC monitors need the proprietary scaler that doesn't have all the features of FreeSync capable scalers (e.g. more input ports, OSD functionality). You accuse everyone who refutes your argument of AMD fanboy sentimentality, when you yourself are the obvious NVIDIA fanboy. No doubt you'll accuse me of being an AMD fanboy too. How wrong you are.
  • JarredWalton - Friday, March 20, 2015 - link

    Technically the G-SYNC scaler supports an OSD... the options are just more limited as there aren't multiple inputs to support, and I believe NVIDIA doesn't bother with supporting multiple *inaccurate* color modes -- just sRGB and hopefully close to the correct values.
  • chizow - Friday, March 20, 2015 - link

    Actually you're wrong again: Vsync is always off. There is a frame cap enabled via the driver, but that is not Vsync, as the GPU is still controlling the frame rate.

    Meanwhile, FreeSync is still clearly tied to Vsync, which is somewhat surprising in its own right since AMD has historically had issues with driver-level Vsync.

    I've never once glossed over the fact that G-Sync requires a proprietary module, because I've clearly stated the price and tech are justified if it is a better solution, and as we saw yesterday, it clearly is.

    I've also acknowledged that multiple inputs and an OSD are bonus amenities, but they're certainly not more important than these panels excelling at what they are purchased for. I have 2xU2410 companion panels with TONS of inputs for anything I need beyond gaming.
  • darkfalz - Thursday, March 19, 2015 - link

    I have to give it to AMD here – I was skeptical this could be accomplished without dedicated hardware to buffer the video frames on the display, but they've done it. I still wouldn't buy one of their power hungry video cards, but it's good for AMD fans. This is good news for G-Sync owners too, as it should drive down the artificially inflated price (partly due to lack of competition, partly due to the early adoption premium). After fiddling around with triple buffering and triple buffering overrides for years (granted, it's less of a problem on DX10/11, as many modern engines seem to have some form of "free" triple buffering), it's good to get perfect refresh rates. As a big emulation fan, with many arcade games using various refresh rates from 50 to 65Hz, these displays are also great. Was input lag tested? AMD doesn't claim to have Vsync-off-like input lag reduction. This would be superb in a laptop, where displaying every last frame is important (Optimus provides a sort of "free" triple buffering of its own, but it's not the smoothest and often requires you to set a 60 FPS frame cap).
  • darkfalz - Thursday, March 19, 2015 - link

    By G-Sync owners, I guess I mean NVIDIA fans / prospective G-Sync buyers. G-Sync owners (like me) have already paid the premium.
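
The V-Sync back-and-forth in the thread above largely comes down to what happens outside a panel's variable refresh range. Here's a rough sketch of the behavior being debated; the 40-144Hz window and the exact fallback policies are illustrative assumptions, not taken from either vendor's documentation:

```python
# Illustrative sketch of out-of-range behavior; the 40-144Hz window and
# the exact fallback policies are assumptions, not vendor specifications.
PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144

def displayed_result(fps, tech, vsync_on):
    if PANEL_MIN_HZ <= fps <= PANEL_MAX_HZ:
        # Inside the variable refresh window, both technologies sync the
        # display refresh to frame delivery: no tearing, no repeated frames.
        return "refresh follows each frame"
    if tech == "gsync":
        # Below the window the G-SYNC module redraws the previous frame;
        # above it the driver caps the frame rate.
        return "module redraws last frame / driver caps frame rate"
    # FreeSync falls back to the user's V-Sync setting outside the window.
    return "V-Sync stutter" if vsync_on else "tearing"

for fps in (30, 90, 200):
    print(fps, "fps:", displayed_result(fps, "freesync", vsync_on=False))
```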
