Closing Thoughts

It took a while to get here, but if the proof of the pudding is in the eating, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, current FreeSync displays are priced very close to equivalent displays that lack adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync the price disparity should only shrink over time.

The short summary is that FreeSync works just as you’d expect, and at least in our limited testing so far there have been no problems. That isn’t to say FreeSync will work with every possible AMD setup right now, however. As noted last month, the initial FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only allows FreeSync to work with single GPU configurations. Another driver due next month should add support for FreeSync with CrossFire setups.

Besides needing a driver and FreeSync display, you also need a GPU that uses AMD’s GCN 1.1 or later architecture. The list at present consists of the R7 260/260X, R9 285, R9 290/290X/295X2 discrete GPUs, as well as the Kaveri APUs – A6-7400K, A8-7600/7650K, and A10-7700K/7800/7850K. First generation GCN 1.0 cards (HD 7950/7970 or R9 280/280X and similar) are not supported.

All is not sunshine and roses, however. Part of the problem with reviewing something like FreeSync is that we're inherently tied to the hardware we receive, in this case the LG 34UM67 display. With an R9 290X driving the display at its native resolution, the vast majority of games will run at 48 FPS or above even at maximum detail settings, though of course there are exceptions. That means they look and feel smooth. But what happens with more demanding games or with lower performance GPUs? Below 48 FPS you get tearing with VSYNC off, or stuttering with VSYNC on.

Neither is ideal, but how much this impacts your experience will depend on the game and the individual. G-SYNC handles dropping below the minimum refresh rate more gracefully than FreeSync, though if you're routinely falling below the FreeSync minimum we'd argue that you should lower your settings. Mostly what you get with FreeSync/G-SYNC is the ability to have smooth gaming at 40-60 FPS and not just 60+ FPS.
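
To make the behavior at the edges of the variable refresh window concrete, here is a minimal sketch of the decision a variable refresh setup effectively makes every frame. This is purely illustrative Python pseudologic, not AMD's or NVIDIA's actual scaler firmware, and it assumes the LG 34UM67's 48-75Hz window; other panels have different ranges.

    # Illustrative sketch only; neither AMD nor NVIDIA publishes their scaler logic.
    # The 48-75Hz window below matches the LG 34UM67.
    MIN_HZ, MAX_HZ = 48, 75
    MIN_INTERVAL = 1.0 / MAX_HZ   # ~13.3 ms: fastest the panel will refresh
    MAX_INTERVAL = 1.0 / MIN_HZ   # ~20.8 ms: longest the panel can wait for a new frame

    def handle_frame(render_time_s, frame_doubling=True):
        """Decide how the display refreshes for a frame that took render_time_s to render."""
        if render_time_s <= MIN_INTERVAL:
            # GPU outruns the panel: behaves like VSYNC at the maximum refresh rate
            return "cap at %.1f ms per refresh (75Hz)" % (MIN_INTERVAL * 1000)
        if render_time_s <= MAX_INTERVAL:
            # Inside the variable window: refresh the moment the frame is ready; no tearing, no stutter
            return "adaptive refresh after %.1f ms" % (render_time_s * 1000)
        # Below the minimum refresh rate the panel has to be refreshed with something
        if frame_doubling:
            # G-SYNC-style handling: redisplay the previous frame while waiting for the new one
            return "repeat previous frame, then show the new one"
        # FreeSync as shipped reverts to fixed refresh: tearing with VSYNC off, stutter with VSYNC on
        return "fixed refresh fallback: tearing or stuttering"

    for fps in (120, 60, 50, 30):
        print("%3d FPS -> %s" % (fps, handle_frame(1.0 / fps)))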

Other sites are reporting ghosting on FreeSync displays, but that's not inherent to the technology. Rather, it's a display specific problem (just as the amount of ghosting on normal LCDs is display specific). Using higher quality panels and hardware designed to reduce/eliminate ghosting is the solution. The FreeSync displays so far appear to not have the same level of anti-ghosting as the currently available G-SYNC panels, which is unfortunate if true. (Note that we've only looked at the LG 34UM67, so we can't report on all the FreeSync displays.) Again, ghosting shouldn't be a FreeSync issue so much as a panel/scaler/firmware problem, so we'll hold off on further commentary until we get to the monitor reviews.

One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40-75 FPS range, beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. Pixel response times for LCDs are not instantaneous, and when you combine that with the way our eyes and brain process the world, for all the hype I still think a high refresh rate with VSYNC disabled gets you 98% of the way to smooth gaming with no noticeable visual artifacts (at least for those of us without superhuman eyesight).
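
Some quick arithmetic helps explain why. A tear line can only persist until the next refresh overwrites it, so the refresh interval puts an upper bound on how long the artifact is visible. A rough illustration, ignoring scanout timing and pixel response:

    # Rough numbers only: how long a single refresh, and any tear line within it, stays on screen.
    for hz in (60, 75, 120, 144):
        print("%3d Hz -> %.1f ms per refresh" % (hz, 1000.0 / hz))
    # 60 Hz -> 16.7 ms; 144 Hz -> 6.9 ms. A tear at 144Hz persists for well under half as long,
    # and that's before factoring in the panel's own pixel response time.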

Overall, I’m impressed with what AMD has delivered so far with FreeSync. AMD gamers in particular will want to keep an eye on the new and upcoming FreeSync displays. They may not be the “must have” upgrade right now, but if you’re in the market and the price premium is less than $50, why not get FreeSync? On the other hand, for NVIDIA users things just got more complicated. Assuming you haven’t already jumped on the G-SYNC train, there’s now this question of whether or not NVIDIA will support non-G-SYNC displays that implement DisplayPort’s Adaptive Sync technology. I have little doubt that NVIDIA can support FreeSync panels, but whether they will support them is far less certain. Given the current price premium on G-SYNC displays, it’s probably a good time to sit back and wait a few months to see how things develop.

There is one G-SYNC display that I’m still waiting to see, however: Acer’s 27” 1440p 144Hz IPS (AHVA) XB270HU. It was teased at CES and it could very well be the holy grail of displays. It’s scheduled to launch next month, and official pricing is $799 (with some pre-orders now online at higher prices). We might see a FreeSync variant of the XB270HU in the coming months as well, if not from Acer then likely from some other manufacturer. For those who work with images and movies as well as play games, IPS/AHVA displays with G-SYNC or FreeSync support are definitely needed.

Wrapping up, if you haven’t upgraded your display in a while, now is a good time to take stock of the various options. IPS and other wide viewing angle displays have come down quite a bit in pricing, and there are overclockable 27” and 30” IPS displays that don’t cost much at all. Unfortunately, if you want a guaranteed high refresh rate, there’s a good chance you’re going to have to settle for TN. The new UltraWide LG displays with 75Hz IPS panels at least deliver a moderate improvement though, and they now come with FreeSync as an added bonus.

Considering a good display can last 5+ years, making a larger investment isn’t a bad idea, but by the same token rushing into a new display isn’t advisable either as you don't want to end up stuck with a "lemon" or a dead technology. Take some time, read the reviews, and then find the display that you will be happy to use for the next half decade. At least by then we should have a better idea of which display technologies will stick around.

Comments

  • chizow - Saturday, March 21, 2015 - link

    You seem to be taking my comments pretty seriously Jarred, as you should, given they raise a lot of questions about your credibility and capability in writing a competent "review" of the technology being discussed. But it's np, no one needs to take me seriously, this isn't my job, unlike yours even if it is part time. The downside is, reviews like this make it harder for anyone to take you or the content on this site seriously, because as you can see, there are a number of other individuals who have taken issue with your Engadget-like review. I am sure there are a number of people who will take this review as gospel, go out and buy FreeSync panels, discover ghosting issues not covered in this "review" and ultimately lose trust in what this site represents. Not that you seem to care.

    As for being limited in equipment, that's just another poor excuse and asterisk you've added to the footnotes here. It takes at most a $300 camera, far less than a single performance graphics card, and maybe $50 in LEDs, diodes, and USB input doublers (hell, you can even make your own if you know how to splice wires) at Digikey or RadioShack to test this. Surely Ryan and your new parent company could foot the bill for a new test methodology if there was actually interest in conducting a serious review of the technology. Numerous sites have already published their methodology for input lag and ghosting with a FAR smaller budget than AnandTech; all you would have to do is mimic their test setup with a short acknowledgment, which I am sure they would appreciate from the mighty AnandTech.

    But it's OK; like the FCAT issue, it's obvious AT had no intention of actually covering the problems with FreeSync. I guess if it takes a couple of Nvidia "zealots" to get to the bottom of it and draw attention to AMD's problems to ultimately force them to improve their products, so be it. It's obvious the actual AMD fans and spoon-fed press aren't willing to tackle them.

    As for blanket statements, lol, that's a good one. I guess we should just take your unsubstantiated points of view, which are, unsurprisingly, perfectly aligned with AMD's, at face value without any amount of critical thinking or skepticism?

    It's frankly embarrassing to read some of the points you've made, coming from someone who actually works in this industry. For example:

    1) One shred of confirmation that G-Sync carries royalties. Your "semantics" mean nothing here.
    2) One shred of confirmation that existing, pre-2015 panels can be made compatible with a firmware upgrade.
    3) One shred of confirmation that G-Sync somehow faces the uphill battle compared to FreeSync, given known market indicators and factual limitations on FreeSync graphics card support.

    All points you have made in an effort to show FreeSync in a better light, while downplaying G-Sync.

    As for the last bit, again, if you have to sacrifice your gaming quality in an attempt to stay above FreeSync's minimum supported refresh rate, the solution has already failed, given that one of the major benefits of VRR is the ability to crank up settings without having to resort to VSYNC On and the input lag associated with it. For example, in your example, if you have to drop settings from Very High to High just so that your FPS doesn't drop below 45 for the 10% of the time it otherwise would, you've already sacrificed image quality for the other 90% of the time it stays above that. That is a failure of a solution if the alternative is to simply repeat frames for that 10% as needed. But hey, to each their own; this kind of testing and information would be SUPER informative in an actual comprehensive review.

    As for your own viewpoints on competition, who cares!?!?!? You're going to color your review and outlook in an attempt to paint FreeSync in a more favorable light simply because it aligns with your own viewpoints on competition? Thank you for confirming your reasoning for posting such a biased and superficial review. You think this is going to matter to someone who is trying to make an informed decision, TODAY, on which technology to choose? Again, if you want to get into the socioeconomic benefits of competition and why we need AMD to survive, post this as an editorial, but to put "Review" in the title is a disservice to your readers and the history of this website, hell, even your own previous work.
  • steve4king - Monday, March 23, 2015 - link

    Thanks Jarred. I really appreciate your work on this. However, I do disagree to some extent on the low-end FPS issue. The biggest potential benefit of adaptive refresh is smoothing out the tearing and judder that happen when the frame rate is inconsistent and drops. I also would not play at settings where my average frame rate fell below 60 FPS. However, my settings take into account the average FPS, where most scenes may be glassy-smooth while in a specific area the frame rate may drop substantially. That's where I really need adaptive sync to shine, and from most reports, that's where G-Sync does shine. I expect low-end flicker could be solved by doubling frames, and I understand you cannot completely solve judder if the frame rate is too low.
  • tsk2k - Friday, March 20, 2015 - link

    Thanks for your reply Jarred.
    I was just throwing a tantrum because I wanted a more in-depth article.
  • 5150Joker - Friday, March 20, 2015 - link

    I own a G-Sync ASUS ROG PG278Q display and while it's fantastic, I'd prefer NVIDIA just give up on G-Sync, go with the flow, and adopt Adaptive-Sync/FreeSync. It's clearly working as well (which was my biggest hesitation), so there's no reason to continue forcing users to pay a premium for proprietary technology that more and more display manufacturers will not support. If LG or Samsung push out a 34" widescreen display that is AHVA/IPS with low response time and 144 Hz support, I'll probably sell my ROG Swift and switch, even if it is a FreeSync display. Like Jarred said in his article, you don't notice tearing with a 144 Hz display, so G-Sync/FreeSync make little to no impact.
  • chizow - Friday, March 20, 2015 - link

    And what if going to Adaptive Sync results in a worse experience? Personally I have no problem if Nvidia supports an inferior Adaptive Sync based solution, but I would still certainly want them to continue developing and investing in G-Sync, as I know for a fact I would not be happy with what FreeSync has shown today.
  • wira123 - Friday, March 20, 2015 - link

    Inferior / worse experience?
    Meanwhile the reviews from AnandTech, Guru3D, HotHardware, Overclock3D, Hexus, TechSpot, HardwareHeaven, and the list could go on forever, clearly state that the experiences with both FreeSync and G-Sync are equal/comparable, but FreeSync costs less as an added bonus.
    Before you accuse me of being an AMD fanboy, I own an Intel Haswell CPU and a Zotac GTX 750 (should I post pics as proof?).
    Based on the reviews from numerous tech sites, my conclusion is: either G-Sync will end up just like Betamax, or Nvidia will be forced to adopt Adaptive Sync.
  • chizow - Friday, March 20, 2015 - link

    These tests were apparently done in some kind of limited/closed test environment, so yes, all of these reviews are what I would consider incomplete and superficial. There are a few sites, however, that delve deeper and notice significant issues. I have already posted links to them in the comments; if you made it this far to comment you would have come upon them already and either chose to ignore them or missed them. Feel free to look them over and come to your own conclusions, but it is obvious to me that the high floor refresh rate and the ghosting make FreeSync worse than G-Sync without a doubt.
  • wira123 - Friday, March 20, 2015 - link

    Yeah, since the pcper gospel review was apparently made by Jesus.
    And the 95% of reviewers around the world who praised FreeSync are heretics, isn't that right?

    V-sync can be turned on or off as you wish if the FPS exceeds the monitor's refresh range, unlike G-Sync. And nobody has reported a ghosting effect SO FAR; are you daydreaming or what?
    Still waiting for the tomshardware review, even though I already know what their verdict will be.
  • chizow - Saturday, March 21, 2015 - link

    Who needs god when you have actual screenshots and video of the problems? This is tech we are talking about, not religion, but I am sure to AMD fans they are one and the same.
  • Crunchy005 - Saturday, March 21, 2015 - link

    @chizow anything that doesn't say Nvidia is God is incomplete and superficial to you. You are putting down a lot of hard work that went into a lot of reviews over one review that pointed out a minor issue.

    Show us more than one and maybe we will look at it, but your paper gospel means nothing when there are a ton more articles that contradict it. Also, what makes you the expert here when all these reviews say it is the same/comparable and you yourself have not seen FreeSync in person? If you think you can do a better job, start a blog and show us. Otherwise stop with your anti-AMD, pro-Nvidia campaign and get off your high horse. In these comments you even attacked Jarred, who works hard in the short time he gets with hardware to give us as much relevant info as he can. You don't show respect for others' work here and you make blanket statements with nothing to support yourself.
