FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is not something you would notice without capturing frame rates. AMD nonetheless did some testing and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they contrast a 1.5% performance drop in one specific scenario with a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – show minor differences between benchmark runs. So we picked three games with deterministic benchmarks and ran each three times with and three times without G-SYNC/FreeSync enabled. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from the three runs:

[Graphs: Gaming Performance Comparison – average and minimum frame rates for each game, with and without G-SYNC/FreeSync enabled]
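For readers who want to run a similar comparison themselves, the arithmetic is straightforward. Below is a minimal sketch – not the tooling used for this article – that computes average and minimum FPS per run from per-frame frame-time logs and reports the percentage difference between the two modes; the CSV file names and layout are assumptions for illustration.

```python
# Minimal sketch (assumed file names/layout, not this article's tooling):
# compare average and minimum FPS with and without adaptive sync enabled.
import csv
from statistics import mean

def run_fps(path):
    """Return (average FPS, minimum FPS) for one benchmark run.

    Each file is assumed to hold one frame time in milliseconds per line.
    """
    with open(path, newline="") as f:
        frame_times_ms = [float(row[0]) for row in csv.reader(f) if row]
    fps = [1000.0 / t for t in frame_times_ms]
    return mean(fps), min(fps)

def summarize(paths):
    """Average the per-run results across repeated runs."""
    runs = [run_fps(p) for p in paths]
    return mean(r[0] for r in runs), mean(r[1] for r in runs)

on_avg, on_min = summarize([f"tombraider_sync_on_run{i}.csv" for i in (1, 2, 3)])
off_avg, off_min = summarize([f"tombraider_sync_off_run{i}.csv" for i in (1, 2, 3)])

# A positive delta means the runs with adaptive sync enabled were faster.
delta_pct = (on_avg - off_avg) / off_avg * 100.0
print(f"Avg FPS: {on_avg:.1f} (on) vs {off_avg:.1f} (off) -> {delta_pct:+.1f}%")
print(f"Min FPS: {on_min:.1f} (on) vs {off_min:.1f} (off)")
```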

Except for a glitch when testing Alien Isolation at a custom resolution, our results don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD’s slides showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we were never able to get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD resolution the performance was virtually identical, so this only seems to affect custom resolutions, and it appears to be game specific.

For those interested in a more detailed look at the frame rates, we’ve created a gallery of frame rates over time for all six runs per game and setting (three with and three without G-SYNC/FreeSync). There’s so much overlap that mostly the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variation between benchmark runs. In one of the games, Tomb Raider, even identical settings show a fair amount of variation between runs, though the average FPS is quite consistent.
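If you want to generate similar frame-rate-over-time plots from your own logs, the sketch below overlays the traces from repeated runs so that any run-to-run variation (or the lack of it) is immediately visible; again, the file names and log format are assumptions, not the tooling used here.

```python
# Minimal sketch (hypothetical file names, assumed one frame time in ms per
# line): overlay FPS-over-time traces from repeated runs to visualize overlap.
import csv
import matplotlib.pyplot as plt

def fps_trace(path):
    """Convert a per-frame frame-time log into (elapsed seconds, FPS) lists."""
    with open(path, newline="") as f:
        frame_times_ms = [float(row[0]) for row in csv.reader(f) if row]
    elapsed, times, fps = 0.0, [], []
    for ft in frame_times_ms:
        elapsed += ft / 1000.0
        times.append(elapsed)
        fps.append(1000.0 / ft)
    return times, fps

runs = [
    ("Sync on, run 1", "tombraider_sync_on_run1.csv"),
    ("Sync on, run 2", "tombraider_sync_on_run2.csv"),
    ("Sync on, run 3", "tombraider_sync_on_run3.csv"),
    ("Sync off, run 1", "tombraider_sync_off_run1.csv"),
    ("Sync off, run 2", "tombraider_sync_off_run2.csv"),
    ("Sync off, run 3", "tombraider_sync_off_run3.csv"),
]
for label, path in runs:
    times, fps = fps_trace(path)
    plt.plot(times, fps, label=label, linewidth=0.8)

plt.xlabel("Benchmark time (s)")
plt.ylabel("FPS")
plt.title("Frame rates over time, repeated runs")
plt.legend()
plt.savefig("fps_over_time.png", dpi=150)
```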

Comments

  • Welsh Jester - Friday, March 20, 2015 - link

    To add to my last post, I think 1440p FreeSync screens will be good for people with a single higher-end GPU, since they won't have to deal with the micro-stuttering that multi-card setups bring – a smooth experience at 40+ fps. Good news.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    I was just about ready to praise AMD but then I see "must have and use DisplayPort"...
    Well, at least it appears AMD got it to work on their crap, and without the massive Hz restrictions seen in earlier reviews.
    So, heck, AMD might have actually done something right for once?
    I am cautious – I do expect some huge frikkin problem we can't see right now – then we will be told to ignore it, then we will be told it's nothing, then there's a workaround fix, then after a couple of years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware".
    Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
  • cykodrone - Saturday, March 21, 2015 - link

    I actually went to the trouble to make an account to say sometimes I come here just to read the comments; some of the convos have me rolling on the floor busting my guts laughing. Seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean, is there any game out there that would absolutely not run well enough on a slightly lesser card at half the price? When I read this card alone requires 250W, my eyes popped out of my head – holy electric bill batman – but I guess if somebody has a grand to throw away on an e-penis, they don't have electric bill worries. One more question: what kind of CPU/motherboard would you need to back this sucker up? I think this card would be wasted without at least the latest i7 Extreme(ly overpriced); can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise you'd just be wasting your time and money.
  • cykodrone - Saturday, March 21, 2015 - link

    This got posted to the wrong story, was meant for the NVIDIA GeForce GTX Titan X Review, my humble apologies.
  • mapesdhs - Monday, March 23, 2015 - link

    No less amusing though. ;D

    Btw, I've tested 1/2/3x GTX 980 on a P55 board; it works a lot better than one might think.
    Also tested 1/2x 980 with an i5 760 – again, works quite well. Plus, the heavier the game, the
    less it tends to rely on main CPU power, especially as the resolution/detail rises.

    Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search; my
    whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and 4.8GHz
    3930K as well; the latter is quicker of course, but not that much quicker – less than most
    would probably assume.

    Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects.
    However, there's always a market for the very best, and I know normal high street stores make
    their biggest profit margins on premium items (and the customers who buy them), so it's an
    important segment - it drives everything else in a way.

    Ian.
  • mapesdhs - Monday, March 23, 2015 - link

    (Damn, still no edit, I meant to say the 3-way testing was with an i7 870 on a P55)
  • Vinny DePaul - Sunday, March 22, 2015 - link

    I am a big fan of open standards. The more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs but the drivers were so buggy. nVidia updates their drivers quickly and the support is just a lot better. I like G-sync. It's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
  • Teknobug - Sunday, March 22, 2015 - link

    Now if only I could find a 24" monitor with these features; anything bigger than 24" is too large for me.
  • gauravnba - Monday, March 23, 2015 - link

    A lot of G-Sync versus AMD bashing here. Personally, it all comes down to whether or not I am being confined to an ecosystem when going for either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it.
    However, I wonder whether adopting FreeSync requires much additional or modified hardware on the GPU end. That might be one reason nVidia didn't have to change much of their GPU architecture at the G-Sync launch and confined the changes to the scaler.
    AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different – one is heavily reliant on the scaler while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
  • PixelSupreme - Monday, March 23, 2015 - link

    To be honest I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB/strobed backlight. An IPS display (or, well, OLED, but...), WQHD, and strobing that works over a range of refresh rates, including some that are multiples of 24 – that would be MY holy grail. The announced Acer XB270HU comes close, but ULMB apparently only works at 85Hz and 100Hz.
