FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that G-SYNC incurs a performance hit while FreeSync does not. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, or basically not something you would notice without capturing frame rates. AMD did some testing, however, and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they compare a 1.5% performance drop in one specific scenario against a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – show minor differences between benchmark runs. So we picked three games with deterministic benchmarks and ran each three times with and without G-SYNC/FreeSync enabled. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from the three runs:

[Chart: Gaming Performance Comparison – average FPS]

[Chart: Gaming Performance Comparison – minimum FPS]

Except for a glitch when testing Alien Isolation at a custom resolution, our results basically don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD’s slides showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
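
To make that margin-of-error comparison concrete, here is a minimal sketch (in Python, using placeholder numbers rather than our measured results) of how the average FPS across three runs and the percentage delta between sync-on and sync-off can be computed; a delta of a percent or two is hard to distinguish from ordinary run-to-run variance.

```python
# Sketch: aggregate three benchmark runs per configuration and report the
# percentage delta between sync-on and sync-off average FPS.
# The numbers in __main__ are placeholders, not our measured results.
from statistics import mean

def percent_delta(sync_on_runs, sync_off_runs):
    """Percentage change of the sync-on average FPS relative to sync-off."""
    on_avg = mean(sync_on_runs)
    off_avg = mean(sync_off_runs)
    return (on_avg - off_avg) / off_avg * 100.0

if __name__ == "__main__":
    # Average FPS from three runs each (placeholder values).
    gsync_on = [74.1, 73.8, 74.5]
    gsync_off = [73.6, 74.0, 73.9]
    delta = percent_delta(gsync_on, gsync_off)
    print(f"On: {mean(gsync_on):.1f} FPS, off: {mean(gsync_off):.1f} FPS, delta: {delta:+.1f}%")
```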

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we could never get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the panel's native WQHD resolution performance was virtually identical, so this only seems to affect custom resolutions and it also appears to be game specific.

For those interested in a more detailed look at the frame rates from the three runs (six total per game and setting: three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly the top line is visible, but that just proves the point: there’s little difference other than the usual minor variation between benchmark runs. In one of the games, Tomb Raider, even identical settings show a fair amount of variation between runs, though the average FPS is quite consistent.
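
For reference, this is roughly how such frame-rate-over-time lines can be generated from raw data: the sketch below converts a per-frame frame-time log into one FPS value per second. The file name frametimes.csv and its format (one frame time in milliseconds per line) are assumptions for illustration; adjust the parsing to whatever capture tool actually produced the log.

```python
# Sketch: convert a per-frame frame-time log into FPS per one-second bucket,
# the kind of series used for a frame-rate-over-time graph.
# "frametimes.csv" and its format (one frame time in ms per line) are assumptions.
import csv

def fps_over_time(path, bucket_seconds=1.0):
    elapsed = 0.0        # running total of frame times, in seconds
    frames = 0           # frames counted in the current bucket
    bucket_end = bucket_seconds
    series = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            elapsed += float(row[0]) / 1000.0   # ms -> seconds
            frames += 1
            if elapsed >= bucket_end:
                series.append(frames / bucket_seconds)
                frames = 0
                bucket_end += bucket_seconds
    return series

if __name__ == "__main__":
    series = fps_over_time("frametimes.csv")
    if series:
        print(f"{len(series)} one-second buckets, peak {max(series):.0f} FPS")
```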

Comments

  • imaheadcase - Thursday, March 19, 2015 - link

    The cost means nothing; keep in mind the people buying this stuff already pay through the nose for hardware. Given that most people who buy nvidia cards are going to get a G-Sync monitor, cost has no meaning.

    This is a doubly bad thing for AMD: first, it's still tied to ITS graphics cards (NV has already said no to supporting it), and second, the monitors announced so far are already below the specs FreeSync is supposed to support, and worse than next-gen G-Sync monitors.

    I mean, I love competition like the next person, but this is just PR making it seem like it's a good thing when it's not.
  • SleepyFE - Thursday, March 19, 2015 - link

    Your name says it all. Do you really think manufacturers will beg NVidia to come and mess with their manufacturing process just to include something that only they support? The time will come when phone makers join in, and they mostly don't use NVidia GPUs. So then you have NVidia vs. AMD and Intel (for ultrabooks) and ARM (Mali) and PowerVR. Do you think NVidia can hold them off with overpricing and PR?
  • Murloc - Thursday, March 19, 2015 - link

    uhm no?
    I'd want my next monitor to be GPU agnostic ideally.
    And I'd want to use an nvidia card with it because right now AMD cards are still ovens compared to nvidia.
    Not because I like paying through the nose – a 750 Ti doesn't cost much at all.

    I'll hold out since I'm trusting that this thing will solve itself (in favour of the industry standard, adaptive sync) sooner or later.
  • Ranger101 - Friday, March 20, 2015 - link

    So a difference of 10 degrees Celsius under load makes an AMD GPU an OVEN and an Nvidia GPU presumably a Fridge by comparison... LOL.
  • Lakku - Wednesday, May 6, 2015 - link

    The reported GPU temp means nothing. That is just an indication of the heatsink/fan's ability to remove heat from the GPU. You need to look at power draw. The AMD GPUs draw significantly more power than current nVidia cards for less performance. That power generates heat, and that heat needs to go somewhere. So while the AMD cards may only read 10 degrees Celsius hotter, which isn't trivial in and of itself, they are having to dissipate quite a bit more generated heat. The end result is that AMD GPUs are putting out quite a bit more heat than nVidia GPUs.
  • althaz - Thursday, March 19, 2015 - link

    There are a bunch of 4k monitors announced. Yet there are no 4k G-Sync monitors available - how is that worse specs?

    I'd buy a 4k 27" G-Sync display at a reasonable price in a heartbeat. In fact I'd buy two.
  • thejshep - Thursday, March 19, 2015 - link

    Acer XB280HK http://www.newegg.com/Product/Product.aspx?Item=N8...
  • arneberg - Thursday, March 19, 2015 - link

    FreeSync today is only open to people with Radeon cards. AMD made the better deal: they let the monitor builders take on the cost of FreeSync. Nvidia made the hardware themselves.
  • chizow - Thursday, March 19, 2015 - link

    Huh? Who cares if it's open/closed/upside down/inside out? G-Sync is better because it is BETTER at what it set out to do. If it is the better overall solution, as we have seen today it is, then it can and should command a premium. This will just be another bullet point pro/con for Nvidia vs. AMD. You want better, you have to pay for it, simple as that.
  • lordken - Thursday, March 19, 2015 - link

    Better? Did we read the same article? I can't find where it says that G-Sync is better than FreeSync. In what aspect is it better?
    And the answer to your first question is: anyone with a brain. The thing is that nvidia could enable support for FreeSync if it wouldn't hurt their pride, which would be a big benefit for their customers (you wouldn't be restricted in monitor selection), but they chose what is better for them: pushing G-Sync and milking more money from you.
    This is pretty stupid. While you may be some average gamer who thinks it's fine to have your monitor selection restricted to 2%, normal people probably wouldn't be that happy. The way it should be is that every monitor supports FreeSync (or whatever you call it), since this is a display feature and should have been developed by the LCD makers in the first place, but they don't give a shit about providing excellent displays as long as they can sell shit that people are buying like crazy (not referring to G-Sync monitors now).
    Vendor lock-in is always a bad thing.
    Oh, and the article says the G-Sync monitor doesn't provide an advanced OSD like common panels today... so yeah, G-Sync is clearly better.
