FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is basically not something you would notice without capturing frame rates. AMD nonetheless did some testing and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they contrast a 1.5% performance drop in one specific scenario with a 0.2% performance gain in another, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – will show minor differences between benchmark runs. So we picked three games with deterministic benchmarks – Alien Isolation, The Talos Principle, and Tomb Raider – and ran each three times with and three times without G-SYNC/FreeSync enabled. Here are the average and minimum frame rates from those runs:

[Charts: Gaming Performance Comparison – average and minimum frame rates with and without G-SYNC/FreeSync]

Except for a glitch when testing Alien Isolation at a custom resolution, our results don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD’s slides showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
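
To put deltas like that 2.5% in context, here is a minimal Python sketch of the arithmetic involved: averaging multiple runs and computing the percentage difference between sync-on and sync-off results. The percent_delta helper and the FPS figures are purely illustrative assumptions, not output from our test harness; with only three runs per configuration, differences of a percent or two are easily lost in normal run-to-run variance.

    # Illustrative sketch (not our actual test harness): average the FPS
    # from three runs with adaptive sync enabled and three with it off,
    # then compute the percentage difference between the two means.
    from statistics import mean

    def percent_delta(sync_on_fps, sync_off_fps):
        """Positive = faster with adaptive sync; negative = slower."""
        on_avg = mean(sync_on_fps)
        off_avg = mean(sync_off_fps)
        return (on_avg - off_avg) / off_avg * 100.0

    # Hypothetical numbers purely for illustration.
    runs = {
        "Tomb Raider": ([75.2, 75.6, 75.1], [73.4, 73.5, 73.2]),
        "Alien Isolation": ([88.1, 87.9, 88.3], [88.0, 88.2, 87.8]),
    }

    for game, (on_runs, off_runs) in runs.items():
        print(f"{game}: {percent_delta(on_runs, off_runs):+.1f}% with sync enabled")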

What about that custom resolution problem on G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – for some reason, frame rates plummeted with G-SYNC enabled. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144Hz) the problem went away; we could never get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD resolution, performance was virtually identical, so the issue only seems to affect custom resolutions and appears to be game specific.

For those interested in a more detailed look at the frame rates of the individual runs (six total per game and setting: three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variations between benchmark runs. And in one of the games, Tomb Raider, even using the same settings shows a fair amount of variation between runs, though the average FPS is quite consistent.
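
If you want to generate similar FPS-over-time overlays from your own captures, the following Python sketch shows one way to do it with matplotlib. It assumes a log format of one frame time in milliseconds per line and hypothetical file names; neither reflects our actual capture pipeline.

    # Illustrative sketch (assumed log format): convert per-frame times in
    # milliseconds into instantaneous FPS over elapsed time, then overlay
    # multiple benchmark runs on a single chart.
    import matplotlib.pyplot as plt

    def load_frametimes_ms(path):
        """Read one frame time (in ms) per line from a plain text log."""
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    def plot_run(frametimes_ms, label):
        elapsed, times, fps = 0.0, [], []
        for ft in frametimes_ms:
            elapsed += ft / 1000.0      # running total, in seconds
            times.append(elapsed)
            fps.append(1000.0 / ft)     # instantaneous frames per second
        plt.plot(times, fps, linewidth=0.8, label=label)

    # Hypothetical file names for three runs each with sync on and off.
    for i in range(1, 4):
        plot_run(load_frametimes_ms(f"sync_on_run{i}.txt"), f"Sync on, run {i}")
        plot_run(load_frametimes_ms(f"sync_off_run{i}.txt"), f"Sync off, run {i}")

    plt.xlabel("Elapsed time (seconds)")
    plt.ylabel("Frames per second")
    plt.legend()
    plt.show()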

Comments

  • Crunchy005 - Saturday, March 21, 2015 - link

    Sorry, tried to play nice higher up...
  • chizow - Saturday, March 21, 2015 - link

    Again, there is nothing religious about technology, it seems you are the ones who are clinging to faith despite photographic evidence to the contrary.
  • silverblue - Saturday, March 21, 2015 - link

    Worse, my dear chizow, worse. 'Worst' doesn't go with 'a', but it does go with 'the' as it denotes an extreme.

    HTH. :)
  • silverblue - Saturday, March 21, 2015 - link

    I think what would clear up your misgivings with FreeSync would be if a very high quality panel with a very wide frequency range (especially dealing with those lower frequencies) was tested. I shouldn't blame the underlying technology for too much yet, especially considering there are downsides to GSync as well.
  • chizow - Saturday, March 21, 2015 - link

    There have been numerous reports that the BenQ uses the same high quality AU Optronics panel as the ROG Swift, so again, it is obvious the problem is in the spec/implementation rather than the panel itself. But yes, it is good confirmation that the premium associated with G-Sync is, in fact, worth it if the BenQ shows these problems at $630 while the Swift does not at $780.
  • silverblue - Sunday, March 22, 2015 - link

    FreeSync works down to 9Hz; it'd be nice to see a panel that gets even remotely close to this.
  • chizow - Monday, March 23, 2015 - link

    AMD claims the spec goes as low as 9Hz, but as we have seen, it has a hard enough time working properly at the current minimums of 40 and 48Hz without exhibiting serious issues, so until those problems directly tied to low refresh rates and pixel decay are resolved, it doesn't really matter what AMD claims on a slide deck.
  • soccerballtux - Friday, March 20, 2015 - link

    I disagree, the holy grail to get me to upgrade is going to be a 32" 1440p monitor (since 30" 1600p ones are non-existent), and maybe one that's just 60fps.
  • JarredWalton - Friday, March 20, 2015 - link

    I'd really love a good 34" 3440x1440 display with a 120Hz refresh rate. Too bad that's more bandwidth than DP provides. And too bad gaming at 3440x1440 generally requires more than any single GPU other than the GTX Titan X. I've got CrossFire 290X and SLI 970 incidentally; when I'm not testing anything, it's the GTX 970 GPUs that end up in my system for daily use.
  • Impulses - Friday, March 20, 2015 - link

    I'd totally go for a 34" 3440x1440... Wouldn't be any harder to drive than my 3x 24" 1080p IPS displays in Eyefinity. I've resigned myself to CF/SLI, it's not like I play very many games at launch anyway. Anything lower res, smaller, or not as wide would feel like a side grade.
