FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that G-SYNC carries a performance hit while FreeSync does not. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is not something you would notice without capturing frame rates. AMD nevertheless did some testing of its own and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they compare a 1.5% performance drop in one specific scenario against a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most games – even those with built-in benchmarks that tend to be very consistent – show minor differences between benchmark runs. So we picked three games with deterministic benchmarks and ran each three times with and without G-SYNC/FreeSync enabled. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from three runs:

Gaming Performance Comparison

Gaming Performance Comparison

Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even showed a measurable 2.5% performance increase with G-SYNC and Tomb Raider. But again let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync meanwhile shows results that are well within the margin of error.
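To make the arithmetic behind figures like that 2.5% concrete, here’s a minimal sketch of how averages from repeated benchmark runs translate into a percentage difference. The FPS numbers below are hypothetical, chosen only for illustration – they are not our measured results:

```python
def percent_difference(on_runs, off_runs):
    """Average each set of benchmark runs and return the percentage
    change when the variable-refresh technology is enabled."""
    avg_on = sum(on_runs) / len(on_runs)
    avg_off = sum(off_runs) / len(off_runs)
    return (avg_on - avg_off) / avg_off * 100.0

# Hypothetical average FPS from three runs each (illustrative only)
gsync_on = [61.2, 60.8, 61.0]
gsync_off = [60.1, 59.9, 60.3]
print(f"{percent_difference(gsync_on, gsync_off):+.1f}%")  # → +1.5%
```

A delta this small is exactly the sort of figure that disappears into normal run-to-run variation.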

What about that custom resolution problem on G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we couldn’t ever get Alien Isolation to run well with G-SYNC using our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at its native WQHD resolution the performance was virtually identical, so the issue appears to be limited to custom resolutions and specific to certain games.

For those interested in a more detailed graph of the frame rates of the three runs (six total per game and setting, three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly the top line is visible, but that just proves the point: there’s little difference other than the usual minor variations between benchmark runs. And in one of the games, Tomb Raider, even using the same settings shows a fair amount of variation between runs, though the average FPS is pretty consistent.
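A simple sanity check for "within the margin of error" is to compare the delta between the two averages against the run-to-run spread of the benchmark itself. Here’s a sketch using Python’s statistics module; again, the FPS numbers are made up for illustration:

```python
from statistics import mean, stdev

def within_run_noise(on_runs, off_runs):
    """Return True if the difference between the two averages is smaller
    than the larger of the two run-to-run standard deviations."""
    delta = abs(mean(on_runs) - mean(off_runs))
    noise = max(stdev(on_runs), stdev(off_runs))
    return delta <= noise

# Hypothetical FPS runs (illustrative only)
freesync_on = [74.8, 75.3, 75.1]
freesync_off = [75.0, 74.9, 75.4]
print(within_run_noise(freesync_on, freesync_off))  # → True: within noise
```

When the delta is smaller than the spread between runs at identical settings, as it was in our FreeSync results, there’s no basis for claiming a real performance difference.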

350 Comments

  • JarredWalton - Friday, March 20, 2015 - link

    FYI, ghosting is a factor of the display and firmware, not of the inherent technology. So while it's valid to say, "The LG FreeSync display has ghosting..." you shouldn't by extension imply FreeSync in and of itself is the cause of ghosting.
  • chizow - Friday, March 20, 2015 - link

    So are you saying a firmware flash is going to fix this, essentially for free? Yes that is a bit of a troll but you get the picture. Stop making excuses for AMD and ask these questions to them and panel makers, on record, for real answers. All this conjecture and excuse-making is honestly a disservice to your readers who are going to make some massive investment (not really) into a panel that I would consider completely unusable.

    You remember that Gateway FPD2485W that you did a fantastic review of a few years ago? Would you go back to that as your primary gaming monitor today? Then why dismiss this problem with FreeSync circa 2015?
  • chizow - Friday, March 20, 2015 - link

    Who said no ghosting? lol. There's lots of ghosting, on the FreeSync panels.
  • TheJian - Sunday, March 22, 2015 - link

    You're assuming gsync stays the same price forever. So scalers can improve pricing (in your mind) to zero over time, but NV's will never shrink, get better revs etc...LOL. OK. Also you assume they can't just lower the price any day of the week if desired. Microsoft just decided to give away Windows 10 (only to slow android but still). This is the kind of thing a company can do when they have 3.7B in the bank and no debt (NV, they have debt but if paid off, they'd have ~3.7b left). They could certainly put out a better rev that is cheaper, or subsidize $50-100 of it for a while until they can put out a cheaper version just to slow AMD down.

    They are not equal. See other site reviews besides an AMD portal site like anandtech ;)

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...
    There is no lic fee from NV according to PCper.
    "It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
    Which basically shows VENDORS must be marking things up quite a lot. But that is to be expected with ZERO competition until this week.

    "For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself."
    Not the case on the AMD side as he says. So again not so free if you don't own a card. NV people that own a card already are basically covered, just buy a monitor.

    Specs of this is misleading too, which anandtech just blows by:
    "The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."

    Again, read a site that doesn't lean so heavily to AMD. Don't forget to read about the GHOSTING on AMD. One more point, PCper's conclusion:
    "My time with today’s version of FreeSync definitely show it as a step in the right direction but I think it is far from perfect."
    "But there is room for improvement: ghosting concerns, improving the transition experience between VRR windows and non-VRR frame rates and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate."
    "FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet."

    Ok then...Maybe Freesync rev2 gets it right ;)
  • soccerballtux - Friday, March 20, 2015 - link

    you must be a headcase or, more likely, are paid for by NVidia to publicly shill. Gsync requires a proprietary NVidia chip installed in the monitor that comes from, and only from, NVidia.

    It's much easier to simply set a flag-byte in the DisplayPort data stream that says "ok render everything since the last render you rendered, now". There's nothing closed about that.
  • chizow - Friday, March 20, 2015 - link

    And? Who cares if it results in a better solution? LOL only a headcase or a paid AMD shill would say removing hardware for a cheaper solution that results in a worse solution is actually better.
  • soccerballtux - Friday, March 20, 2015 - link

    wellll, if it's cheaper and a better solution, then the market cares.
  • chizow - Friday, March 20, 2015 - link

    Except it's cheaper and worse, therefore it should be cheaper
  • bloodypulp - Friday, March 20, 2015 - link

    Oh darn... so what you're saying is that I have to purchase the card that costs less, then I have to purchase the monitor that costs less too? Sounds like a raw deal... ROFL!!

    And as far as your bogus openness argument goes: There is nothing preventing Nvidia from supporting Adaptive Sync. NOTHING. Well, unless you count hubris and greed. In fact, Mobile G-Sync already doesn't even require the module! I guess that expensive module really wasn't necessary after all...

    And lastly, Nvidia has no x86 APU to offer, so they can't offer what AMD can with their Freesync-supporting APUs. Nvidia simply has nothing to compete with there. Even gamers on a tight budget can enjoy Freesync! The same simply cannot be said for GSync.
  • Denithor - Friday, March 20, 2015 - link

    ONLY closed because NVIDIA refuses to support freesync. Much like OpenCL. And PhysX.
