Closing Thoughts

It took a while to get here, but if the proof of the pudding is in the eating, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, current FreeSync displays are priced very close to equivalent displays without adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync, the price disparity should only shrink over time.

The short summary is that FreeSync works just as you’d expect, and at least in our limited testing so far there have been no problems. That isn’t to say FreeSync will work with every possible AMD setup right now, however. As noted last month, the initial FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only enables FreeSync on single-GPU configurations. Another driver, due next month, will add FreeSync support for CrossFire setups.

Besides the driver and a FreeSync display, you also need a GPU built on AMD’s GCN 1.1 or later architecture. The list at present consists of the R7 260/260X, R9 285, and R9 290/290X/295X2 discrete GPUs, along with the Kaveri APUs: A6-7400K, A8-7600/7650K, and A10-7700K/7800/7850K. First-generation GCN 1.0 cards (HD 7950/7970, R9 280/280X, and similar) are not supported.
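
For those who like to script such checks, here’s a minimal sketch in Python that encodes the support list above. The supports_freesync helper and the list it consults are purely our own illustration of the compatibility matrix described in this article, not an AMD API:

# Hypothetical compatibility check; the list mirrors the GPUs named in
# this article (GCN 1.1 and later) and is not an official AMD list.
FREESYNC_CAPABLE = {
    # GCN 1.1+ discrete GPUs
    "R7 260", "R7 260X", "R9 285", "R9 290", "R9 290X", "R9 295X2",
    # Kaveri APUs
    "A6-7400K", "A8-7600", "A8-7650K", "A10-7700K", "A10-7800", "A10-7850K",
}

def supports_freesync(gpu_name: str) -> bool:
    """Return True if the GPU appears on the FreeSync-capable list above."""
    return gpu_name in FREESYNC_CAPABLE

print(supports_freesync("R9 290X"))  # True
print(supports_freesync("HD 7970"))  # False - GCN 1.0 is not supported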

All is not sunshine and roses, however. Part of the problem with reviewing something like FreeSync is that we're inherently tied to the hardware we receive, in this case the LG 34UM67 display. With an R9 290X driving the display at its native resolution, the vast majority of games will run at 48 FPS or above even at maximum detail settings, though of course there are exceptions, and at those frame rates they look and feel smooth. But what happens with more demanding games or with lower-performance GPUs? Below 48 FPS you'd get tearing without VSYNC, or stuttering with VSYNC.
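
To make those failure modes concrete, here’s a toy model in Python of how a FreeSync panel behaves at various frame rates, assuming the LG 34UM67’s 48-75Hz variable refresh window. The display_behavior helper is our own simplification of the rules described above, not actual driver logic:

def display_behavior(fps: float, vsync: bool,
                     vrr_min: float = 48.0, vrr_max: float = 75.0) -> str:
    """Inside the variable refresh window the panel tracks the GPU;
    outside it, the classic VSYNC tradeoffs return (simplified model)."""
    if vrr_min <= fps <= vrr_max:
        return "smooth: refresh rate follows the frame rate"
    if fps < vrr_min:
        return "stutter (VSYNC on)" if vsync else "tearing (VSYNC off)"
    return "capped at max refresh (VSYNC on)" if vsync else "possible tearing (VSYNC off)"

for fps in (30, 48, 60, 90):
    print(f"{fps} FPS, no VSYNC -> {display_behavior(fps, vsync=False)}")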

Neither is ideal, but how much this impacts your experience will depend on the game and the individual. G-SYNC handles dropping below its minimum refresh rate more gracefully than FreeSync, though if you're routinely falling below the minimum FreeSync refresh rate we'd argue you should lower your settings. Mostly, what FreeSync and G-SYNC give you is smooth gaming at 40-60 FPS rather than only at 60+ FPS.

Other sites are reporting ghosting on FreeSync displays, but that's not inherent to the technology; rather, it's a display-specific problem (just as the amount of ghosting on normal LCDs is display-specific). The solution is higher quality panels and hardware designed to reduce or eliminate ghosting. The FreeSync displays so far appear not to match the level of anti-ghosting in the currently available G-SYNC panels, which is unfortunate if true. (Note that we've only looked at the LG 34UM67, so we can't speak for all FreeSync displays.) Again, ghosting is a panel/scaler/firmware problem rather than a FreeSync issue, so we'll hold off on further commentary until we get to the monitor reviews.
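
As a rough illustration of why ghosting varies from panel to panel, the sketch below compares a pixel's response time to the frame duration at different refresh rates. This is a deliberately crude model of our own devising (ghosting_fraction is not a real metric), and it ignores overdrive and the other factors the scaler/firmware control:

def ghosting_fraction(response_ms: float, refresh_hz: float) -> float:
    """Fraction of each frame a pixel spends still transitioning (toy model)."""
    frame_ms = 1000.0 / refresh_hz
    return min(response_ms / frame_ms, 1.0)

# A sluggish panel (say 14 ms gray-to-gray) versus a fast one (5 ms):
for response_ms in (14.0, 5.0):
    for hz in (48, 75):
        pct = 100 * ghosting_fraction(response_ms, hz)
        print(f"{response_ms:.0f} ms panel at {hz} Hz: mid-transition {pct:.0f}% of the frame")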

One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. Factor in that LCD pixel response times are not instantaneous, along with the way our eyes and brain process motion, and for all the hype I still think a high refresh rate with VSYNC disabled gets you 98% of the way to smooth gaming with no noticeable visual artifacts (at least for those of us without superhuman eyesight).
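
The arithmetic behind that intuition is straightforward: a tear artifact can persist for at most one refresh interval, so higher refresh rates shrink its on-screen lifetime. The quick calculation below ignores pixel response times, which blur things further:

# A torn frame segment stays on screen for at most one refresh interval,
# so higher refresh rates make tearing shorter-lived and harder to spot.
for hz in (60, 75, 120, 144):
    print(f"{hz:>3} Hz -> refresh interval {1000 / hz:.1f} ms")
# At 144Hz a tear lasts at most ~6.9 ms versus 16.7 ms at 60Hz,
# giving the eye less than half the time to catch it.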

Overall, I’m impressed with what AMD has delivered so far with FreeSync. AMD gamers in particular will want to keep an eye on the new and upcoming FreeSync displays. They may not be the “must have” upgrade right now, but if you’re in the market and the price premium is less than $50, why not get FreeSync? For NVIDIA users, on the other hand, things just got more complicated. Assuming you haven’t already jumped on the G-SYNC train, there’s now the question of whether NVIDIA will support non-G-SYNC displays that implement DisplayPort Adaptive-Sync. I have little doubt that NVIDIA can support FreeSync panels; whether they will is far less certain. Given the current price premium on G-SYNC displays, it’s probably a good time to sit back and wait a few months to see how things develop.

There is one G-SYNC display that I’m still waiting to see, however: Acer’s 27” 1440p 144Hz IPS (AHVA) XB270HU. It was teased at CES and it could very well be the holy grail of displays. It’s scheduled to launch next month, with official pricing of $799 (and some pre-orders now online at higher prices). We might see a FreeSync variant of the XB270HU in the coming months as well, if not from Acer then likely from some other manufacturer. For those who work with images and video as well as play games, IPS/AHVA displays with G-SYNC or FreeSync support are definitely needed.

Wrapping up, if you haven’t upgraded your display in a while, now is a good time to take stock of the various options. IPS and other wide viewing angle displays have come down quite a bit in price, and there are overclockable 27” and 30” IPS displays that don’t cost much at all. Unfortunately, if you want a guaranteed high refresh rate, there’s a good chance you’re going to have to settle for TN. The new UltraWide LG displays with 75Hz IPS panels at least deliver a moderate improvement, though, and they now come with FreeSync as an added bonus.

Considering a good display can last five or more years, making a larger investment isn’t a bad idea, but by the same token rushing into a new display isn’t advisable either, as you don’t want to end up stuck with a “lemon” or a dead technology. Take some time, read the reviews, and then find the display that you will be happy to use for the next half decade. At least by then we should have a better idea of which display technologies will stick around.

Comments


  • Welsh Jester - Friday, March 20, 2015 - link

    To add to my last post, I think 1440p FreeSync screens will be good for people with a single higher-end GPU, since they won't have to deal with the micro-stuttering that multi-card setups bring: a smooth experience at 40+ FPS. Good news.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    I was just about ready to praise AMD but then I see "must have and use display port"...
    Well, at least it appears AMD got it to work on their crap, and without massive Hz restrictions as they were on earlier reviews.
    So, heck, AMD might have actually done something right for once ?
    I am cautious - I do expect some huge frikkin problem we can't see right now --- then we will be told to ignore it, then we will be told it's nothing, then there's a workaround fix, then after a couple of years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware".
    Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
  • cykodrone - Saturday, March 21, 2015 - link

    I actually went to the trouble to make an account to say that sometimes I come here just to read the comments; some of the convos have me rolling on the floor busting my guts laughing. Seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean, is there any game out there that absolutely would not run well enough on a slightly lesser card at half the price? When I read that this card alone requires 250W, my eyes popped out of my head - holy electric bill, Batman - but I guess if somebody has a 1G to throw away on an e-penis, they don't have electric bill worries. One more question: what kind of CPU/motherboard would you need to back this sucker up? I think this card would be wasted without at least the latest i7 Extreme(ly overpriced); can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise you'd just be wasting your time and money.
  • cykodrone - Saturday, March 21, 2015 - link

    This got posted to the wrong story, was meant for the NVIDIA GeForce GTX Titan X Review, my humble apologies.
  • mapesdhs - Monday, March 23, 2015 - link

    No less amusing though. ;D

    Btw, I've tested 1/2/3x GTX 980 on a P55 board, and it works a lot better than one might think. I've also tested 1/2x 980 with an i5 760; again it works quite well. Plus, the heavier the game, the less it tends to rely on main CPU power, especially as the resolution/detail rises.

    Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search, and my whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and a 4.8GHz 3930K as well; the latter are quicker of course, but not that much quicker - less than most would probably assume.

    Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects. However, there's always a market for the very best, and I know normal high street stores make their biggest profit margins on premium items (and the customers who buy them), so it's an important segment - it drives everything else in a way.

    Ian.
  • mapesdhs - Monday, March 23, 2015 - link

    (Damn, still no edit, I meant to say the 3-way testing was with an i7 870 on a P55)
  • Vinny DePaul - Sunday, March 22, 2015 - link

    I am a big fan of open standards - the more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs, but the drivers were so buggy; nVidia updates their drivers quickly and the support is just a lot better. I like G-Sync. It's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
  • Teknobug - Sunday, March 22, 2015 - link

    Now if only I could find a 24" monitor with these features - anything bigger than 24" is too large for me.
  • gauravnba - Monday, March 23, 2015 - link

    A lot of G-Sync versus AMD bashing here. Personally, it all comes down to whether I'm being confined to an ecosystem by going with either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it.
    However, I wonder whether adapting to FreeSync requires much additional or modified hardware on the GPU end. That might be one reason nVidia didn't have to change much of their GPU architecture at the G-Sync launch and confined the changes to the scaler.
    AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different: one is heavily reliant on the scaler, while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
  • PixelSupreme - Monday, March 23, 2015 - link

    To be honest, I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB/strobed backlight: an IPS display (or, well, OLED but...), WQHD, and strobing that works at a range of refresh rates, including some that are multiples of 24. That would be MY holy grail. The announced Acer XB270HU comes close, but ULMB apparently only works at 85Hz and 100Hz.
