FreeSync Displays

There are four FreeSync displays launching today: one each from Acer and BenQ, and two from LG. Beyond those, seven additional displays should show up in the coming weeks (months?). Here’s the current list of FreeSync-compatible displays, with pricing where it has been disclosed.

FreeSync Compatible Displays
Manufacturer    Model     Diagonal  Resolution  Refresh   Panel  Price
Acer            XG270HU   27"       2560x1440   40-144Hz  TN     $499
BenQ            XL2730Z   27"       2560x1440   40-144Hz  TN     $599
LG Electronics  34UM67    34"       2560x1080   48-75Hz   IPS    $649
LG Electronics  29UM67    29"       2560x1080   48-75Hz   IPS    $449
Nixeus          NX-VUE24  24"       1920x1080   144Hz     TN     ?
Samsung         UE590     28"       3840x2160   60Hz      TN     ?
Samsung         UE590     23.6"     3840x2160   60Hz      TN     ?
Samsung         UE850     31.5"     3840x2160   60Hz      TN?    ?
Samsung         UE850     28"       3840x2160   60Hz      TN?    ?
Samsung         UE850     23.6"     3840x2160   60Hz      TN?    ?
Viewsonic       VX2701mh  27"       1920x1080   144Hz     TN     ?

The four displays launching today cover two primary options. For those that want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable they’re using the same panel, perhaps even the one we’ve seen in the ASUS ROG Swift. The two LG displays meanwhile venture into new territory as far as adaptive refresh rates are concerned. LG has both a smaller 29” and a larger 34” 2560x1080 (UW-UXGA) display, and both sport IPS panels.
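To put those refresh ranges in context, here’s a rough sketch of how a variable refresh rate driver has to treat frames relative to a panel’s supported window. This is purely illustrative Python using the 34UM67’s 48-75Hz range as an example, not AMD’s actual driver logic, and the behavior outside the window in particular varies by implementation:

```python
# Illustrative sketch only -- not AMD's (or NVIDIA's) actual driver logic.
# Maps a frame's render time onto a panel's variable refresh window.

def effective_refresh_hz(frame_time_ms, panel_min_hz, panel_max_hz):
    """Return the rate a VRR panel would refresh at for a given frame time."""
    instantaneous_hz = 1000.0 / frame_time_ms
    if panel_min_hz <= instantaneous_hz <= panel_max_hz:
        # Inside the window: the panel refreshes exactly when the frame is
        # ready, so scanout tracks the frame rate and tearing/judder vanish.
        return instantaneous_hz
    if instantaneous_hz > panel_max_hz:
        # Above the window: cap at the maximum (V-Sync on) or tear
        # (V-Sync off); we model the capped case here.
        return panel_max_hz
    # Below the window: the panel falls back to fixed-rate behavior at its
    # minimum, and the usual tearing or stuttering returns.
    return panel_min_hz

# LG 34UM67 (48-75Hz window): a 60FPS frame (16.7ms) syncs perfectly...
print(effective_refresh_hz(16.7, 48, 75))  # ~59.9Hz
# ...but a 40FPS frame (25ms) drops below the 48Hz floor.
print(effective_refresh_hz(25.0, 48, 75))  # 48Hz fallback
```

That below-minimum fallback is why the bottom of the range matters so much: a 48Hz floor on the LG displays leaves less headroom for demanding games than the 40Hz floor on the Acer and BenQ.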

The other upcoming displays all appear to use TN panels, though it's possible Samsung might offer PLS. The UE590 is almost certainly TN, with 170/160 degree viewing angles listed according to DigitalTrends. The UE850 on the other hand is targeted more at imaging professionals, so PLS might be present; we'll update if we can get confirmation of the panel type.

One of the big benefits of FreeSync is going to be support for multiple video inputs; the G-SYNC displays so far are all limited to a single DisplayPort connection. The LG displays come with DisplayPort, HDMI, and DVI-D inputs (along with audio in/out), and the Acer is similarly equipped. None of those three has any USB ports, though the BenQ does have a built-in USB hub with ports on the side.

Our testing was conducted on the 34UM67, and let me just say that it’s quite the sight sitting on my desk. I’ve been bouncing between the ASUS ROG Swift and Acer XB280HK for the past several months, and both displays have their pros and cons. I like the high resolution of the Acer at times, but I have to admit that my aging eyes often struggle when running it at 4K, and I have to resort to DPI scaling (which introduces other problems). The ASUS on the other hand is great with its high refresh rates, and the resolution is more readable without scaling. The big problem with both displays is that they’re TN panels, and having come from using a 30” IPS display for the past eight years, that’s a pretty painful compromise.

Plopping the relatively gigantic 34UM67 on my desk is in many ways like seeing a good friend again after a long hiatus. “Dear IPS, I’ve missed having you on my desktop. Please don’t leave me again!” For the old and decrepit folks like me, dropping to 2560x1080 on a 34” display also means reading text at 100% zoom is not a problem. But when you’re only a couple of feet away, the relatively low DPI does make the pixels much more visible to the naked eye. It even has built-in speakers (though they’re not going to compete with any standalone speakers in terms of audio quality).

The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though that model is now being sold at $599). Going by the current $599 street price, then, adding the technology for a FreeSync display costs at most $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.

Pricing vs. G-SYNC

It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though there aren’t direct comparisons available for some of the models. The least expensive FreeSync displays start at just $449, which matches the least expensive G-SYNC display (AOC G2460PG) on price but with generally better specs (29” 2560x1080 and IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).
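To spell out that arithmetic (using the launch MSRPs quoted above; street prices will of course drift over time):

```python
# Price gaps between comparable WQHD 144Hz displays, per the MSRPs above.
rog_swift_gsync = 759  # ASUS ROG Swift PG278Q (G-SYNC)
freesync_wqhd = {"Acer XG270HU": 499, "BenQ XL2730Z": 599}

for model, price in freesync_wqhd.items():
    print(f"{model}: ${rog_swift_gsync - price} less than the ROG Swift")
# Acer XG270HU: $260 less than the ROG Swift
# BenQ XL2730Z: $160 less than the ROG Swift
```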



Based on pricing alone, FreeSync looks poised to give G-SYNC some much needed competition. And it’s not just about the price, as there are other advantages to FreeSync that we’ll cover more on the next page. But for a moment let’s focus just on the AMD FreeSync vs. NVIDIA G-SYNC ecosystems.

Right now NVIDIA enjoys a performance advantage over AMD in terms of GPUs, and along with that their GPUs currently carry a price premium, particularly at the high end. While the R9 290X and GTX 970 are pretty evenly matched, the GTX 980 tends to lead by a decent amount in most games. Any users willing to spend $200 extra per GPU to buy a GTX 980 instead of an R9 290X might also be willing to pay $200 more for a G-SYNC compatible display. After all, it’s the only game in town for NVIDIA users right now.

AMD and other companies can support FreeSync, but until (and unless) NVIDIA supports the standard, users will be forced to choose between AMD + FreeSync or NVIDIA + G-SYNC. That’s unfortunate for any users who routinely switch between AMD and NVIDIA GPUs, though the number of people outside of hardware reviewers who regularly go back and forth is minuscule. Ideally we’d see one standard win out and the other fade away (think Betamax or HD DVD), but with a one year lead and plenty of money invested, it’s unlikely NVIDIA will abandon G-SYNC any time soon.

Prices meanwhile are bound to change, as up to now there has been no competition for NVIDIA’s G-SYNC monitors. With FreeSync finally available, we expect prices for G-SYNC displays to start coming down, and in fact we’re already seeing $40-$125 off the original MSRP for most of the G-SYNC displays. Will that be enough to keep NVIDIA’s proprietary G-SYNC technology viable? Most likely, as both FreeSync and G-SYNC are gamer-focused more than anything; if a gamer prefers NVIDIA, FreeSync isn’t likely to get them to switch sides. But if you don’t have any GPU preference, you’re in the market for a new gaming PC, and you’re planning on buying a new monitor to go with it, R9 290X + FreeSync could save a couple hundred dollars compared to GTX 970 + G-SYNC.

There's something else to consider with the above list of monitors: four FreeSync displays are shipping on the official day of launch, and Samsung alone has five more scheduled for release in the near future. Eleven FreeSync displays in the near term might not seem like a huge deal, but compare that with G-SYNC: even with a one year lead (more or less), NVIDIA currently lists only six displays with G-SYNC support, and the upcoming Acer XB270HU makes seven. AMD also claims there will be 20 FreeSync compatible displays shipping by the end of the year. In terms of numbers, then, DP Adaptive Sync (and by extension FreeSync) looks to be winning this war.

Comments

  • chizow - Saturday, March 21, 2015

    You seem to be taking my comments pretty seriously Jarred, as you should, given they raise a lot of questions about your credibility and your ability to write a competent "review" of the technology being discussed. But it's np, no one needs to take me seriously, this isn't my job, unlike yours even if it is part time. The downside is, reviews like this make it harder for anyone to take you or the content on this site seriously, because as you can see, there are a number of other individuals that have taken issue with your Engadget-like review. I am sure there are a number of people that will take this review as gospel, go out and buy FreeSync panels, discover ghosting issues not covered in this "review" and ultimately lose trust in what this site represents. Not that you seem to care.

    As for being limited in equipment, that's just another poor excuse and asterisk you've added to the footnotes here. It takes at most a $300 camera, far less than a single performance graphics card, and maybe $50 in LEDs, diodes, and USB input doublers (hell, you can even make your own if you know how to splice wires) from Digikey or RadioShack to test this. Surely Ryan and your new parent company could foot the bill for a new test methodology if there was actually interest in conducting a serious review of the technology. Numerous sites have already published methodologies for input lag and ghosting on a FAR smaller budget than AnandTech's; all you would have to do is mimic their test setup with a short acknowledgment, which I am sure they would appreciate from the mighty AnandTech.

    But it's OK; like the FCAT issue, it's obvious AT had no intention of actually covering the problems with FreeSync. I guess if it takes a couple of Nvidia "zealots" to get to the bottom of it and draw attention to AMD's problems, to ultimately force them to improve their products, so be it. It's obvious the actual AMD fans and spoon-fed press aren't willing to tackle them.

    As for blanket statements, lol, that's a good one. I guess we should just take your unsubstantiated points of view, which are, unsurprisingly, perfectly aligned with AMD's, at face value without any amount of critical thinking or skepticism?

    It's frankly embarrassing to read some of the points you've made, coming from someone who actually works in this industry. For example:

    1) One shred of confirmation that G-Sync carries royalties. Your "semantics" mean nothing here.
    2) One shred of confirmation that existing, pre-2015 panels can be made compatible with a firmware upgrade.
    3) One shred of confirmation that G-Sync somehow faces the uphill battle compared to FreeSync, given known market indicators and factual limitations on FreeSync graphics card support.

    All points you have made in an effort to show FreeSync in a better light, while downplaying G-Sync.

    As for the last bit, again, if you have to sacrifice your gaming quality in an attempt to stay above FreeSync's minimum refresh rate, the solution has already failed, given one of the major benefits of VRR is the ability to crank up settings without having to resort to V-Sync On and the input lag associated with it. In your example, if you have to drop settings from Very High to High just so that your FPS doesn't drop below 45FPS for 10% of the time, you've already had to sacrifice your image quality for the other 90% where it stays above that. That is a failure of a solution if the alternative is to just repeat frames for that 10% as needed. But hey, to each their own; this kind of testing and information would be SUPER informative in an actual comprehensive review.

    As for your own viewpoints on competition, who cares!?!?!? You're going to color your review and outlook in an attempt to paint FreeSync in a more favorable light, simply because it aligns with your own viewpoints on competition? Thank you for confirming your reasoning for posting such a biased and superficial review. You think this is going to matter to someone who is trying to make an informed decision, TODAY, on which technology to choose? Again, if you want to get into the socioeconomic benefits of competition and why we need AMD to survive, post this as an editorial, but to put "Review" in the title is a disservice to your readers and the history of this website, hell, even your own previous work.
  • steve4king - Monday, March 23, 2015

    Thanks Jarred. I really appreciate your work on this. However, I do disagree to some extent on the low-end FPS issue. The biggest potential benefit of Adaptive Refresh is smoothing out the tearing and judder that happen when the frame rate is inconsistent and drops. I also would not play at settings where my average frame rate fell below 60fps. However, my settings will take into account the average FPS, where most scenes may be glassy-smooth while in a specific area the frame rate may drop substantially. That's where I really need adaptive sync to shine. And from most reports, that's where G-Sync does shine. I expect low-end flicker could be solved with a doubling of frames, and I understand you cannot completely solve judder if the frame rate is too low.
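The frame doubling steve4king mentions is easy to sketch. Purely as an illustration (hypothetical logic for a 48-75Hz panel, not any shipping driver's implementation), a driver could repeat each frame until the effective scanout rate lands back inside the panel's window:

```python
# Illustration of low-framerate frame doubling on a hypothetical 48-75Hz
# panel -- not any vendor's actual implementation.

def scanout(fps, panel_min_hz=48, panel_max_hz=75):
    """Return (scanout_hz, repeats): show each frame `repeats` times so the
    effective scanout rate stays inside the panel's VRR window."""
    repeats = 1
    while fps * repeats < panel_min_hz:
        repeats += 1  # double, triple, ... the frame as needed
    return min(fps * repeats, panel_max_hz), repeats

print(scanout(30))  # (60, 2): 30FPS content scanned out at 60Hz
print(scanout(20))  # (60, 3): 20FPS content scanned out at 60Hz
```

Because each repeated scan shows the same finished frame, this avoids the flicker and judder of dropping to the panel's fixed minimum refresh; the tradeoff is added driver (or module) complexity in predicting when the next real frame will arrive.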
  • tsk2k - Friday, March 20, 2015

    Thanks for your reply Jarred.
    I was just throwing a tantrum cause I wanted a more in-depth article.
  • 5150Joker - Friday, March 20, 2015

    I own a G-Sync ASUS ROG PG278Q display and while it's fantastic, I'd prefer NVIDIA just give up on G-Sync, go with the flow, and adopt ASync/FreeSync. It's clearly working well (which was my biggest hesitation), so there's no reason to continue forcing users to pay a premium for proprietary technology that more and more display manufacturers will not support. If LG or Samsung push out a 34" widescreen display that is AHVA/IPS with low response time and 144 Hz support, I'll probably sell my ROG Swift and switch, even if it is a FreeSync display. Like Jarred said in his article, you don't notice tearing with a 144 Hz display, so G-Sync/FreeSync make little to no impact.
  • chizow - Friday, March 20, 2015

    And what if going to Adaptive Sync results in a worse experience? Personally I have no problem if Nvidia uses an inferior Adaptive Sync based solution, but I would still certainly want them to continue developing and investing in G-Sync, as I know for a fact I would not be happy with what FreeSync has shown today.
  • wira123 - Friday, March 20, 2015

    inferior / worse experience?
    meanwhile the reviews from anandtech, guru3d, hothardware, overclock3d, hexus, techspot, hardwareheaven (and the list could go on forever) clearly stated that the experiences with both freesync & g-sync are equal / comparable, but freesync costs less as an added bonus.
    Before you accuse me of being an AMD fanboy, i own an intel haswell cpu & zotac GTX 750 (should i take pics as proof?).
    Based on reviews from numerous tech sites, my conclusion is: either g-sync will end up just like betamax, or nvidia will be forced to adopt adaptive sync.
  • chizow - Friday, March 20, 2015

    These tests were done in some kind of limited/closed test environment apparently, so yes, all of these reviews are what I would consider incomplete and superficial. There are a few sites however that delve deeper and notice significant issues. I have already posted links to them in the comments; if you made it this far to comment you would've come upon them already and either chose to ignore them or missed them. Feel free to look them over and come to your own conclusions, but it is obvious to me that the high refresh floor and the ghosting make FreeSync worse than G-Sync without a doubt.
  • wira123 - Friday, March 20, 2015

    Yeah, since the pcper gospel review was apparently written by jesus.
    And the 95% of reviewers around the world who praised freesync are heretics, isn't that right?

    V-sync can be turned off or on as you wish if the fps surpasses the monitor's refresh cap range, unlike G-sync. And none report any ghosting effect SO FAR; are you daydreaming or what?
    Still waiting for the tomshardware review, even though i already know what their verdict will be.
  • chizow - Saturday, March 21, 2015

    Who needs god when you have actual screenshots and video of the problems? This is tech we are talking about, not religion, but I am sure to AMD fans they are one and the same.
  • Crunchy005 - Saturday, March 21, 2015

    @chizow anything that doesn't say Nvidia is God is incomplete and superficial to you. You are putting down a lot of hard work that went into a lot of reviews over one review that pointed out a minor issue.

    Show us more than one and maybe we will look at it, but your pcper gospel means nothing when there are a ton more articles that contradict it. Also, what makes you the expert here when all these reviews say it is the same/comparable and you yourself have not seen FreeSync in person? If you think you can do a better job, start a blog and show us. Otherwise stop with your anti-AMD, pro-Nvidia campaign and get off your high horse. In these comments you even attacked Jarred, who works hard in the short time that he gets with hardware to give us as much relevant info as he can. You don't show respect to others' work here and you make blanket statements with nothing to support yourself.
