FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is not something you would notice without capturing frame rates. AMD nevertheless did some testing and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they compare a 1.5% performance drop in one specific scenario against a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – show minor differences between runs. So we picked three games with deterministic benchmarks – Alien Isolation, The Talos Principle, and Tomb Raider – and ran each of them three times with and without G-SYNC/FreeSync enabled. Here are the average and minimum frame rates from the three runs:

[Chart: Gaming Performance Comparison – average frame rates]

[Chart: Gaming Performance Comparison – minimum frame rates]

Except for a glitch when testing Alien Isolation at a custom resolution, our results show very little difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
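
For those who want to run a similar sanity check on their own numbers, here is a minimal Python sketch of the arithmetic involved. The file names and the one-frame-time-per-line log format are hypothetical stand-ins for whatever a capture tool (e.g. a FRAPS-style frame time log) produces; this is not the tooling used for this review.

    def load_run(path):
        """Read one benchmark run: one frame time (in ms) per line."""
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    def fps_stats(frametimes_ms):
        """Return (average FPS, minimum FPS) for one run."""
        total_s = sum(frametimes_ms) / 1000.0
        avg_fps = len(frametimes_ms) / total_s   # frames divided by elapsed time
        min_fps = 1000.0 / max(frametimes_ms)    # the single slowest frame
        return avg_fps, min_fps

    # Three runs per configuration, mirroring the methodology above.
    for config in ("gsync_on", "gsync_off"):
        runs = [fps_stats(load_run(f"tombraider_{config}_run{n}.txt"))
                for n in (1, 2, 3)]
        print(config,
              f"avg {sum(r[0] for r in runs) / 3:.1f} FPS,",
              f"min {sum(r[1] for r in runs) / 3:.1f} FPS (mean of 3 runs)")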

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run at the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well in Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we could never get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at its native WQHD resolution, performance was virtually identical, so the problem only seems to affect custom resolutions, and it appears to be game specific.

For those interested in a more detailed look at the three runs (six total per game and setting: three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variation between benchmark runs. And in one of the games, Tomb Raider, even identical settings show a fair amount of variation between runs, though the average FPS is quite consistent.
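
The same hypothetical logs can be overlaid in a few lines of matplotlib to produce this kind of frame-rate-over-time graph; the sketch below reuses the load_run() helper and file naming from the earlier snippet, so the same caveats apply.

    from itertools import accumulate
    import matplotlib.pyplot as plt

    def plot_run(path, label):
        frametimes = load_run(path)  # milliseconds per frame
        elapsed = list(accumulate(ft / 1000.0 for ft in frametimes))
        fps = [1000.0 / ft for ft in frametimes]
        plt.plot(elapsed, fps, linewidth=0.8, label=label)

    for config in ("gsync_on", "gsync_off"):
        for n in (1, 2, 3):
            plot_run(f"tombraider_{config}_run{n}.txt", f"{config} run {n}")

    plt.xlabel("Elapsed time (s)")
    plt.ylabel("Instantaneous FPS")
    plt.legend(fontsize="small")
    plt.show()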

Comments

  • Cerb - Saturday, March 21, 2015 - link

    If it's not working, this is just as wrong. Since it's fairly close, at 24, 25, or almost 30, you will see the tear line creeping up or down the image, if vsync isn't on. It's exceptionally obvious. Usually, you will just see skipped frames on Windows, since the compositor forces vsync for the desktop, and this is generally well-supported by any video player's playback mechanisms. The skipped frames become more noticeable as you watch, but aren't nearly as bad as tearing.
  • looncraz - Saturday, March 21, 2015 - link

    Tearing can happen anytime.

    I'm writing a compositing engine for HaikuOS and I would LOVE to be able to control the refresh timing! When a small update occurs and the frame buffer is ready, I'd swap it, trigger a monitor refresh, and then be on my way right away.

    As it stands, I have to either always be a frame behind, or try to guess how long composing the frame buffer from the update stream will take – before I know anything about what the update stream will be like – so I know when to wake up the composite engine control loop.

    That means that even on normal day-to-day stuff – opening a menu, dragging icons, playing solitaire, browsing the web – FreeSync would be quite useful. As it stands, the best I can do is hope the frame is ready for the next interval, or wait until the next refresh is complete to swap frame buffers – which means that the data on screen is always a frame (or more) out of date.

    At 60 Hz that is a fixed delay multiplier of 16.7 ms, with a minimum multiplicand of 1. Going with higher refresh rates on the desktop is just wasteful (we don't really need 60, except for things to feel smooth due to the delay multiplier effect of the refresh rate).

    If I could use the whole range from 45 Hz to 75 Hz, our (virtual) multiplicand could be 0.75-1.33 instead of exactly 1 or 2. That makes a significant difference in jitter.

    Everything would be smoother – and we could drop down to a 45 Hz refresh interval by default, saving energy in the process, instead of being stuck at a fixed cadence. (See the sketch after the comments for the arithmetic.)
  • Cerb - Saturday, March 21, 2015 - link

    Wrong. It is generating the visuals, and doing so in exactly the same way as far as any of this technology is concerned, and screen tearing does happen, because refresh rates vary from our common ones.
  • soccerballtux - Friday, March 20, 2015 - link

    Considering the power-saving impact it's had in the mobile sector (no sense rendering to pixels that haven't changed, just render to the ones that have), it most definitely would have a significant impact on the laptop market and would be a great 'green' tech in general.
  • erple2 - Friday, March 20, 2015 - link

    No value, except to the consumer who doesn't have to pay the (current) $160+ premium for G-SYNC. Now, if AMD had a graphics card competitor to the GTX 980, it'd be marvelous, and a no-brainer. Given that the cost is apparently minimal to implement, I don't see that as a problem. Even if you think it's not value added, panel manufacturers shoved pointless 3D down everyone's throat, so clearly they're not averse to that behavior.
  • mdriftmeyer - Sunday, March 22, 2015 - link

    It has value for any animated sequence.
  • JonnyDough - Monday, March 23, 2015 - link

    Inside of gaming it has plenty of value – who even cares about the rest? Gaming was a $25.1 billion market in 2010 (ESA annual report). I'd take a billionth of that pie and go out for a nice meal, wouldn't you?
  • dragonsqrrl - Thursday, March 19, 2015 - link

    ... No current or upcoming DP spec ...requires... adaptive sync. It's optional, not sure how else you could interpret that, especially when you take the comment I responded to into consideration.
  • eddman - Friday, March 20, 2015 - link

    Wait a minute; that only applies to monitors, right? It'd suck to buy a DP 1.2a/1.3 video card and find out that it cannot do adaptive-sync.
  • tobi1449 - Friday, March 20, 2015 - link

    Plus FreeSync != Adaptive Sync
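
A quick aside on looncraz's delay-multiplier comment above: the difference between a fixed 60 Hz cadence and a 45-75 Hz adaptive window is easy to put into numbers. Below is a minimal, illustrative Python sketch of that arithmetic; the figures come from the comment, not from measurements, and it ignores the case where a frame takes longer than the 45 Hz maximum interval.

    import math

    FIXED_INTERVAL_MS = 1000.0 / 60.0  # ~16.7 ms between refreshes at 60 Hz
    MIN_INTERVAL_MS = 1000.0 / 75.0    # ~13.3 ms, earliest refresh in a 45-75 Hz window
    MAX_INTERVAL_MS = 1000.0 / 45.0    # ~22.2 ms, latest refresh in that window

    def fixed_display_time(compose_ms):
        """Delay until a frame appears on a fixed 60 Hz display:
        it must wait for the next whole refresh interval."""
        return math.ceil(compose_ms / FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS

    def adaptive_display_time(compose_ms):
        """Delay when the compositor can trigger the refresh itself,
        anywhere within the panel's 45-75 Hz window."""
        return min(max(compose_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

    for ms in (5.0, 14.0, 18.0):
        print(f"compose {ms:4.1f} ms -> fixed {fixed_display_time(ms):5.1f} ms, "
              f"adaptive {adaptive_display_time(ms):5.1f} ms")

A frame that takes 18 ms misses the fixed 60 Hz deadline and is displayed after 33.3 ms, while an adaptive display could show it after 18 ms – the jitter reduction the comment describes.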
