FreeSync Displays

There are four FreeSync displays launching today, one each from Acer and BenQ, and two from LG. Besides the displays launching today, seven additional displays should show up in the coming weeks (months?). Here’s the current list of FreeSync compatible displays, with pricing where it has been disclosed.

FreeSync Compatible Displays

Manufacturer    | Model    | Diagonal | Resolution | Refresh  | Panel | Price
Acer            | XG270HU  | 27"      | 2560x1440  | 40-144Hz | TN    | $499
BenQ            | XL2730Z  | 27"      | 2560x1440  | 40-144Hz | TN    | $599
LG Electronics  | 34UM67   | 34"      | 2560x1080  | 48-75Hz  | IPS   | $649
LG Electronics  | 29UM67   | 29"      | 2560x1080  | 48-75Hz  | IPS   | $449
Nixeus          | NX-VUE24 | 24"      | 1920x1080  | 144Hz    | TN    | ?
Samsung         | UE590    | 28"      | 3840x2160  | 60Hz     | TN    | ?
Samsung         | UE590    | 23.6"    | 3840x2160  | 60Hz     | TN    | ?
Samsung         | UE850    | 31.5"    | 3840x2160  | 60Hz     | TN?   | ?
Samsung         | UE850    | 28"      | 3840x2160  | 60Hz     | TN?   | ?
Samsung         | UE850    | 23.6"    | 3840x2160  | 60Hz     | TN?   | ?
Viewsonic       | VX2701mh | 27"      | 1920x1080  | 144Hz    | TN    | ?

The four displays launching today cover two primary options. For those who want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable that they’re using the same panel, perhaps even the same panel we’ve seen in the ASUS ROG Swift. The two LG displays, meanwhile, venture into new territory as far as adaptive refresh rates are concerned. LG has both a smaller 29” and a larger 34” 2560x1080 (UW-UXGA) display, and both sport IPS panels (technically AU Optronics' AHVA, but it's basically the same as IPS).

The other upcoming displays all appear to use TN panels, though it's possible Samsung will offer PLS. The UE590 is almost certainly TN, with 170/160 degree viewing angles according to DigitalTrends. The UE850, on the other hand, is targeted more at imaging professionals, so PLS might be present; we'll update if we can confirm the panel type.

One of the big benefits of FreeSync is going to be support for multiple video inputs – the G-SYNC displays so far are all limited to a single DisplayPort connection. The LG displays come with DisplayPort, HDMI, and DVI-D inputs (along with audio in/out), and the Acer is similarly equipped. Neither the LG displays nor the Acer has any USB ports, though the BenQ does include a built-in USB hub with ports on the side.

Our testing was conducted on the 34UM67, and let me just say that it’s quite the sight sitting on my desk. I’ve been bouncing between the ASUS ROG Swift and Acer XB280HK for the past several months, and both displays have their pros and cons. I like the high resolution of the Acer at times, but I have to admit that my aging eyes often struggle when running it at 4K and I have to resort to DPI scaling (which introduces other problems). The ASUS, on the other hand, is great with its high refresh rates, and the resolution is more readable without scaling. The big problem with both displays is that they’re TN panels, and having come from using a 30” IPS display for the past eight years, that’s a pretty painful compromise.

Plopping the relatively gigantic 34UM67 on my desk is in many ways like seeing a good friend again after a long hiatus. “Dear IPS (AHVA), I’ve missed having you on my desktop. Please don’t leave me again!” For the old and decrepit folks like me, dropping to 2560x1080 on a 34” display also means reading text at 100% zoom is not a problem. But when you’re only a couple feet away, the relatively low DPI does make the pixels much more visible to the naked eye. It even has built-in speakers (though they’re not going to compete with any standalone speakers in terms of audio quality).

The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though it’s now being sold at $599). So at most, it looks like putting in the new technology to make a FreeSync display costs $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.

Pricing vs. G-SYNC

It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though direct comparisons aren’t available for every model. The least expensive FreeSync display, the LG 29UM67, starts at $449, matching the least expensive G-SYNC display (the AOC G2460PG) on price while offering arguably better specs (29” 2560x1080 IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). For a direct comparison, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz displays, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).
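
For reference, a trivial back-of-the-envelope calculation using the list prices quoted above (street prices will of course vary):

```cpp
#include <cstdio>

int main() {
    // List prices quoted in this article; street prices will vary.
    const double asusRogSwiftGsync = 759.0;  // ASUS ROG Swift, WQHD 144Hz, G-SYNC
    const double acerXG270HU       = 499.0;  // Acer XG270HU, WQHD 144Hz, FreeSync
    const double benqXL2730Z       = 599.0;  // BenQ XL2730Z, WQHD 144Hz, FreeSync

    std::printf("FreeSync advantage: $%.0f (vs. BenQ) to $%.0f (vs. Acer)\n",
                asusRogSwiftGsync - benqXL2730Z,   // $160
                asusRogSwiftGsync - acerXG270HU);  // $260
    return 0;
}
```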

Based on pricing alone, FreeSync looks poised to give G-SYNC some much needed competition. And it’s not just about the price, as there are other advantages to FreeSync that we’ll cover more on the next page. But for a moment let’s focus just on the AMD FreeSync vs. NVIDIA G-SYNC ecosystems.

Right now NVIDIA enjoys a performance advantage over AMD in terms of GPUs, and along with that they currently carry a price premium, particularly at the high end. While the R9 290X and GTX 970 are pretty evenly matched, the GTX 980 tends to lead by a decent amount in most games. Any users willing to spend $200 extra per GPU to buy a GTX 980 instead of an R9 290X might also be willing to pay $200 more for a G-SYNC compatible display. After all, it’s the only game in town for NVIDIA users right now.

AMD and other companies can support FreeSync, but until – unless! – NVIDIA supports the standard, users will be forced to choose between AMD + FreeSync or NVIDIA + G-SYNC. That’s unfortunate for any users who routinely switch between AMD and NVIDIA GPUs, though the number of people outside of hardware reviewers who regularly go back and forth is minuscule. Ideally we’d see one standard win out and the other fade away (think Betamax or HD DVD), but with a one year lead and plenty of money invested, it’s unlikely NVIDIA will abandon G-SYNC any time soon.

Prices, meanwhile, are bound to change, as up to now there has been no competition for NVIDIA’s G-SYNC monitors. With FreeSync finally available, we expect prices for G-SYNC displays to start coming down, and in fact we’re already seeing $40-$125 off the original MSRP on most of the G-SYNC displays. Will that be enough to keep NVIDIA’s proprietary G-SYNC technology viable? Most likely, as both FreeSync and G-SYNC are gamer-focused more than anything; if a gamer prefers NVIDIA, FreeSync isn’t likely to get them to switch sides. But if you don’t have a GPU preference, you’re in the market for a new gaming PC, and you’re planning on buying a new monitor to go with it, an R9 290X + FreeSync setup could save a couple hundred dollars compared to GTX 970 + G-SYNC.

There's something else to consider with the above list of monitors: four FreeSync displays are shipping on the official day of launch, and Samsung alone has five more FreeSync displays scheduled for release in the near future. Eleven FreeSync displays in the near term might not seem like a huge deal, but compare that with G-SYNC: even with a one year lead (more or less), NVIDIA currently lists only six displays with G-SYNC support, and the upcoming Acer XB270HU makes seven. AMD also claims there will be 20 FreeSync compatible displays shipping by the end of the year. In terms of numbers, then, DP Adaptive Sync (and by extension FreeSync) looks to be winning this war.

Comments

  • barleyguy - Thursday, March 19, 2015 - link

    I was already shopping for a 21:9 monitor for my home office. I'm now planning to order a 29UM67 as soon as I see one in stock. The GPU in that machine is an R7/260X, which is on the compatible list. :-)
  • boozed - Thursday, March 19, 2015 - link

    "the proof is in the eating of the pudding"

    Thank you for getting this expression right!

    Oh, and Freesync looks cool too.
  • D. Lister - Thursday, March 19, 2015 - link

    I have had my reservations about claims made by AMD these days, and my opinion of 'FreeSync' was no different. If this actually works at least as well as G-Sync (as claimed by this rather brief review) with various hardware/software setups, then it is indeed a praiseworthy development. I personally would certainly be glad that the rivalry of two tech giants resulted (even if only inadvertently) in something that benefits the consumer.
  • cmdrdredd - Thursday, March 19, 2015 - link

    I love the arguments about "freesync is an open standard" when it doesn't matter. 80% of the market is Nvidia and won't be using it. Intel is a non-issue because not many people are playing games that benefit from adaptive v-sync. Think about it, either way you're stuck. If you buy a GSync monitor now you likely will upgrade your GPU before the monitor goes out. So your options are only Nvidia. If you buy a freesync monitor your options are only AMD. So everyone arguing against gsync because you're stuck with Nvidia, have fun being stuck with AMD the other way around.

    Best to not even worry about either of these unless you absolutely do not see yourself changing GPU manufacturers for the life of the display.
  • barleyguy - Friday, March 20, 2015 - link

    NVidia is 71% of the AIB market, as of the latest released numbers from Hexus. That doesn't include AMD's APUs, which also support Freesync and are often used by "midrange" gamers.

    The relevance of being an open standard though, is that monitor manufacturers can add it with almost zero extra cost. If it's built into nearly every monitor in a couple of years, then NVidia might have a reason to start supporting it.
  • tsk2k - Thursday, March 19, 2015 - link

    @Jarred Walton
    You disappoint me.
    What you said about G-Sync below the minimum refresh rate is not correct, and there also seem to be issues with ghosting on FreeSync. I encourage everyone to go to PCper(dot)com and read a much more in-depth article on the subject.
    Get rekt anandtech.
  • JarredWalton - Friday, March 20, 2015 - link

    If you're running a game and falling below the minimum refresh rate, you're using settings that are too demanding for your GPU. I've spent quite a few hours playing games on the LG 34UM67 today just to see if I could see/feel issues below 48 FPS. I can't say that I did, though I also wasn't running settings that dropped below 30 FPS. Maybe I'm just getting too old, but if the only way to quantify the difference is with expensive equipment, perhaps we're focusing too much on the theoretical rather than the practical.

    Now, there will undoubtedly be some who say they really see/feel the difference, and maybe they do. There will be plenty of others where it doesn't matter one way or the other. But if you've got an R9 290X and you're looking at the LG 34UM67, I see no reason not to go that route. Of course you need to be okay with a lower resolution and a more limited range for VRR, and you need to be willing to go with a slower response time IPS (AHVA) panel rather than dealing with TN problems. Many people are.

    What's crazy to me is all the armchair experts reading our review and the PCPer review and somehow coming out with one or the other of us being "wrong". I had limited time with the FreeSync display, but even so there was nothing I encountered that caused me any serious concern. Are there cases where FreeSync doesn't work right? Yes. The same applies to G-SYNC. (For instance, at 31 FPS on a G-SYNC display, you won't get frame doubling but you will see some flicker in my experience. So that 30-40 FPS range is a problem for G-SYNC as well as FreeSync.)
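
    To make that edge case concrete, here's a very rough sketch (purely illustrative, not AMD's or NVIDIA's actual driver logic) of the decision any variable refresh implementation has to make each frame:

    ```cpp
    #include <algorithm>

    // Purely illustrative; not AMD's or NVIDIA's actual driver logic.
    // Given a panel's VRR window (e.g. 48-75Hz for the LG 34UM67) and how long
    // the last frame took to render, pick an interval to refresh the display.
    struct VRRWindow { double minHz, maxHz; };

    double refreshIntervalMs(const VRRWindow& w, double frameTimeMs) {
        const double fastestMs = 1000.0 / w.maxHz;  // shortest interval the panel allows
        const double slowestMs = 1000.0 / w.minHz;  // longest it can wait before refreshing anyway

        if (frameTimeMs <= slowestMs) {
            // Inside the window: refresh when the frame is ready,
            // clamped to the panel's maximum refresh rate.
            return std::max(frameTimeMs, fastestMs);
        }

        // Below the minimum refresh rate the panel must be refreshed anyway.
        // One option is to redisplay the previous frame one or more times
        // ("frame doubling") so the effective refresh rate climbs back into
        // the window; the fallback is plain v-sync on/off behavior, i.e.
        // judder or tearing. With a narrow window, doubling can overshoot the
        // maximum, in which case the interval is clamped and frames no longer
        // line up perfectly.
        int repeats = 1;
        double intervalMs = frameTimeMs;
        while (intervalMs > slowestMs) {
            ++repeats;
            intervalMs = frameTimeMs / repeats;
        }
        return std::max(intervalMs, fastestMs);
    }
    ```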

    I guess it's all a matter of perspective. Is FreeSync identical to G-SYNC? No, and we shouldn't expect it to be. The question is how much the differences matter. Remember the anisotropic filtering wars of last decade where AMD and NVIDIA were making different optimizations? Some were perhaps provably better, but in practice most gamers didn't really care. It was all just used as flame bait and marketing fluff.

    I would agree that right now you can make the case that G-SYNC is provably better than FreeSync in some situations, but then both are provably better than static refresh rates. It's the edge cases where NVIDIA wins (specifically, when frame rates fall below the minimum VRR rate), but when that happens you're already "doing it wrong". Seriously, if I play a game and it starts to stutter, I drop the quality settings a notch. I would wager most gamers do the same. When we're running benchmarks and comparing performance, it's all well and good to say GPU 1 is better than GPU 2, but in practice people use settings that provide a good experience.

    Example:
    Assassin's Creed: Unity runs somewhat poorly on AMD GPUs. Running at Ultra settings or even Very High in my experience is asking for problems, no matter if you have a FreeSync display or not. Stick with High and you'll be a lot happier, and in the middle of a gaming session I doubt anyone will really care about the slight drop in visual fidelity. With an R9 290X running at 2560x1080 High, ACU typically runs at 50-75FPS on the LG 34UM67; with a GTX 970, it would run faster and be "better". But unless you have both GPUs and for some reason you like swapping between them, it's all academic: you'll find settings that work and play the game, or you'll switch to a different game.

    Bottom Line: AMD users can either go with FreeSync or not; they have no other choice. NVIDIA users likewise can go with G-SYNC or not. Both provide a smoother gaming experience than 60Hz displays, absolutely... but with a 120/144Hz panel only the high speed cameras and eagle eyed youth will really notice the difference. :-)
  • chizow - Friday, March 20, 2015 - link

    Haha love it, still feisty I see even in your "old age" there Jarred. I think all the armchair experts want is for you and AT to use your forum on the internet to actually do the kind of testing and comparisons that matter for the products being discussed, not just provide another Engadget-like experience of superficial touch-feely review, dismissing anything actually relevant to this tech and market as not being discernable to someone "your age".
  • JarredWalton - Friday, March 20, 2015 - link

    It's easy to point out flaws in testing; it's a lot harder to get the hardware necessary to properly test things like input latency. AnandTech doesn't have a central location, so I basically test with what I have. Things I don't have include gadgets to measure refresh rate in a reliable fashion, high speed cameras, etc. Another thing that was lacking: time. I received the display on March 17, in the afternoon; sometimes you just do what you can in the time you're given.

    You however are making blanket statements that are pro-NVIDIA/anti-AMD, just as you always do. The only person that takes your comments seriously is you, and perhaps other NVIDIA zealots. Mind you, I prefer my NVIDIA GPUs to my AMD GPUs for a variety of reasons, but I appreciate competition and in this case no one is going to convince me that the closed ecosystem of G-SYNC is the best way to do things long-term. Short-term it was the way to be first, but now there's an open DisplayPort standard (albeit an optional one) and NVIDIA really should do everyone a favor and show that they can support both.

    If NVIDIA feels G-SYNC is ultimately the best way to do things, fine -- support both and let the hardware enthusiasts decide which they actually want to use. With only seven G-SYNC displays there's not a lot of choice right now, and if most future DP1.2a and above displays use scalers that support Adaptive Sync it would be stupid not to at least have an alternate mode.

    But if the only real problem with FreeSync is when you fall below the minimum refresh rate you get judder/tearing, that's not a show stopper. As I said above, if that happens to me I'm already changing my settings. (I do the same with G-SYNC incidentally: my goal is 45+ FPS, as below 40 doesn't really feel smooth to me. YMMV.)
  • Soulwager - Saturday, March 21, 2015 - link

    You can test absolute input latency to sub millisecond precision with ~50 bucks worth of hobby electronics, free software, and some time to play with it. For example, an arduino micro, a photoresistor, a second resistor to make a divider, a breadboard, and a usb cable. Set the arduino up to emulate a mouse, and record the difference in timing between a mouse input and the corresponding change in light intensity. Let it log a couple minutes of press/release cycles, subtract 1ms of variance for USB polling, and there you go, full chain latency. If you have access to a CRT, you can get a precise baseline as well.
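
    Something along these lines would do it (untested sketch; the pin, threshold, and timeout values are just placeholders to tune for your setup, and the ~1ms of USB polling variance still applies):

    ```cpp
    #include <Mouse.h>

    // Untested sketch for an Arduino Micro (or any ATmega32u4 board that can
    // emulate a USB mouse). A photoresistor in a voltage divider is taped over
    // a region of the screen that changes brightness when the click registers.
    const int SENSOR_PIN = A0;    // photoresistor divider output (placeholder pin)
    const int THRESHOLD  = 600;   // tune for your panel and room lighting

    void setup() {
      Serial.begin(115200);
      Mouse.begin();
    }

    void loop() {
      unsigned long t0 = micros();
      Mouse.press(MOUSE_LEFT);                  // simulated input
      // Wait for the on-screen brightness to cross the threshold (0.5s timeout).
      while (analogRead(SENSOR_PIN) < THRESHOLD && micros() - t0 < 500000UL) { }
      unsigned long latencyUs = micros() - t0;  // full-chain press-to-photon time
      Mouse.release(MOUSE_LEFT);
      Serial.println(latencyUs);                // log one sample; collect a few minutes' worth
      delay(500);                               // let the scene settle before the next press
    }
    ```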

    As for sub-VRR behavior, if you leave v-sync on, does the framerate drop directly to 20fps, or is AMD using triple buffering?
