FreeSync Displays

There are four FreeSync displays launching today, one each from Acer and BenQ, and two from LG. Besides the displays launching today, seven additional displays should show up in the coming weeks (months?). Here’s the current list of FreeSync compatible displays, with pricing where it has been disclosed.

FreeSync Compatible Displays

| Manufacturer   | Model    | Diagonal | Resolution | Refresh  | Panel | Price |
|----------------|----------|----------|------------|----------|-------|-------|
| Acer           | XG270HU  | 27"      | 2560x1440  | 40-144Hz | TN    | $499  |
| BenQ           | XL2730Z  | 27"      | 2560x1440  | 40-144Hz | TN    | $599  |
| LG Electronics | 34UM67   | 34"      | 2560x1080  | 48-75Hz  | IPS   | $649  |
| LG Electronics | 29UM67   | 29"      | 2560x1080  | 48-75Hz  | IPS   | $449  |
| Nixeus         | NX-VUE24 | 24"      | 1920x1080  | 144Hz    | TN    | ?     |
| Samsung        | UE590    | 28"      | 3840x2160  | 60Hz     | TN    | ?     |
| Samsung        | UE590    | 23.6"    | 3840x2160  | 60Hz     | TN    | ?     |
| Samsung        | UE850    | 31.5"    | 3840x2160  | 60Hz     | TN?   | ?     |
| Samsung        | UE850    | 28"      | 3840x2160  | 60Hz     | TN?   | ?     |
| Samsung        | UE850    | 23.6"    | 3840x2160  | 60Hz     | TN?   | ?     |
| ViewSonic      | VX2701mh | 27"      | 1920x1080  | 144Hz    | TN    | ?     |

The four displays launching today cover two primary options. For those that want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable that they’re using the same panel, perhaps even the same panel that we’ve seen in the ASUS ROG Swift. The two LG displays meanwhile venture out into new territory as far as adaptive refresh rates are concerned. LG has both a smaller 29” and a larger 34” 2560x1080 (UW-UXGA) display, and both sport IPS panels (technically AU Optronics' AHVA, but it's basically the same as IPS).

The other upcoming displays all appear to use TN panels, though Samsung might offer PLS. The UE590 at least appears certain to be TN, with 170/160 degree viewing angles according to DigitalTrends. The UE850 on the other hand is targeted more at imaging professionals, so PLS might be present; we'll update if we can get confirmation of the panel type.

One of the big benefits of FreeSync is going to be support for multiple video inputs – the G-SYNC displays so far are all limited to a single DisplayPort connection. The LG displays come with DisplayPort, HDMI, and DVI-D inputs (along with audio in/out), and the Acer is similarly equipped. None of those three has any USB ports, though the BenQ does include a built-in USB hub with ports on the side.

Our testing was conducted on the 34UM67, and let me just say that it’s quite the sight sitting on my desk. I’ve been bouncing between the ASUS ROG Swift and Acer XB280HK for the past several months, and both displays have their pros and cons. I like the high resolution of the Acer at times, but I have to admit that my aging eyes often struggle when running it at 4K and I have to resort to DPI scaling (which introduces other problems). The ASUS on the other hand is great with its high refresh rates, and the resolution is more readable without scaling. The big problem with both displays is that they’re TN panels, and having come from using a 30” IPS display for the past eight years that’s a pretty painful compromise.

Plopping the relatively gigantic 34UM67 on my desk is in many ways like seeing a good friend again after a long hiatus. “Dear IPS (AHVA), I’ve missed having you on my desktop. Please don’t leave me again!” For the old and decrepit folks like me, dropping to 2560x1080 on a 34” display also means reading text at 100% zoom is not a problem. But when you’re only a couple feet away, the relatively low DPI does make the pixels much more visible to the naked eye. It even has built-in speakers (though they’re not going to compete with any standalone speakers in terms of audio quality).

The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though it’s now being sold at $599). So at most, it looks like putting in the new technology to make a FreeSync display costs $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.

Pricing vs. G-SYNC

It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though there aren’t direct comparisons available for some of the models. The least expensive FreeSync displays start at just $449, which matches the least expensive G-SYNC display (AOC G2460PG) on price but with generally better specs (29” 2560x1080 and IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).



Based on pricing alone, FreeSync looks poised to give G-SYNC some much needed competition. And it’s not just about the price, as there are other advantages to FreeSync that we’ll cover more on the next page. But for a moment let’s focus just on the AMD FreeSync vs. NVIDIA G-SYNC ecosystems.

Right now NVIDIA enjoys a performance advantage over AMD in terms of GPUs, and along with that they currently carry a price premium, particularly at the high end. While the R9 290X and GTX 970 are pretty evenly matched, the GTX 980 tends to lead by a decent amount in most games. Any users willing to spend $200 extra per GPU to buy a GTX 980 instead of an R9 290X might also be willing to pay $200 more for a G-SYNC compatible display. After all, it’s the only game in town for NVIDIA users right now.

AMD and other companies can support FreeSync, but until – unless! – NVIDIA supports the standard, users will be forced to choose between AMD + FreeSync or NVIDIA + G-SYNC. That’s unfortunate for any users who routinely switch between AMD and NVIDIA GPUs, though the number of people outside of hardware reviewers who regularly go back and forth is minuscule. Ideally we’d see one standard win out and the other fade away (e.g. Betamax, HD DVD), but with a one year lead and plenty of money invested it’s unlikely NVIDIA will abandon G-SYNC any time soon.

Prices meanwhile are bound to change, as up to now there has been no competition for NVIDIA’s G-SYNC monitors. With FreeSync finally available, we expect prices for G-SYNC displays to start coming down, and in fact we’re already seeing $40-$125 off the original MSRP for most of the G-SYNC displays. Will that be enough to keep NVIDIA’s proprietary G-SYNC technology viable? Most likely, as both FreeSync and G-SYNC are gamer-focused more than anything; if a gamer prefers NVIDIA, FreeSync isn’t likely to get them to switch sides. But if you don’t have any GPU preference, you’re in the market for a new gaming PC, and you’re planning on buying a new monitor to go with it, R9 290X + FreeSync could save you a couple hundred dollars compared to GTX 970 + G-SYNC.

There's something else to consider with the above list of monitors as well: four FreeSync displays are shipping on the official day of launch, and Samsung alone has five more FreeSync displays scheduled for release in the near future. Eleven FreeSync displays in the near term might not seem like a huge deal, but compare that with G-SYNC: even with a one year lead (more or less), NVIDIA currently lists only six displays with G-SYNC support, and the upcoming Acer XB270HU makes seven. AMD also claims there will be 20 FreeSync-compatible displays shipping by the end of the year. In terms of numbers, then, DP Adaptive Sync (and by extension FreeSync) looks to be winning this war.

Comments

  • chizow - Thursday, March 19, 2015 - link

    @lordken and yes, I am well aware G-Sync is tied to Nvidia lol, but like I said, I will bet on the market leader with ~70% market share and installed user base (actually much higher than that, since 100% of Kepler cards are supported vs. maybe 30% of GCN 1.1 cards, over the cards sold since 2012) over the solution that holds a minor share of the dGPU market and an even smaller share of the CPU/APU market.
  • chizow - Thursday, March 19, 2015 - link

    And why don't you set aside your biased preconceptions and actually read some articles that don't just take AMD's slide decks at face value? Read a review that actually tries to tackle the real issues I am referring to, while actually TALKING to the vendors and doing some investigative reporting:

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    You will see there are still some major issues with FreeSync that need to be answered and addressed.
  • JarredWalton - Thursday, March 19, 2015 - link

    It's not a "major issue" so much as a limitation of the variable refresh rate range and how AMD chooses to handle it. With NVIDIA, the display refreshes the same frame at least twice if you drop below 30Hz, and that's fine, but it does have to introduce some lag. (When a frame is being refreshed, there's no way to send the next frame to the screen = lag.) AMD gives two options: VSYNC off or VSYNC on. With VSYNC off, you get tearing but less lag/latency. With VSYNC on, you get stuttering if you fall below the minimum VRR rate.

    The LG displays are actually not a great option here, as a 48Hz minimum is rather high -- 45 FPS, for example, will give you tearing or stutter. So you basically want to choose settings for games such that you can stay above 48 FPS with this particular display. But that's certainly no worse than the classic way of doing things, where people either live with tearing or aim for 60+ FPS -- and 48 FPS is more easily achieved than 60 FPS.

    The problem right now is we're all stuck comparing different implementations. A 2560x1080 IPS display is inherently different than a 2560x1440 TN display. LG decided 48Hz was the minimum refresh rate, most likely to avoid flicker; others have allowed some flicker while going down to 30Hz. You'll definitely see flicker on G-SYNC at 35FPS/35Hz in my experience, incidentally. I can't knock FreeSync and AMD for a problem that is arguably the fault of the display, so we'll look at it more when we get to the actual display review.

    As to the solution, well, there's nothing stopping AMD from just re-sending a frame if the FPS is too low (a minimal sketch of the idea follows this comment). They haven't done this in the current driver, but this is FreeSync 1.0 beta.

    Final thought: I don't think most people looking to buy the LG 34UM67 are going to be using a low-end GPU, and in fact with current prices I suspect most people that upgrade will already have an R9 290/290X. Part of the reason I didn't notice issues with FreeSync is that with a single R9 290X in most games the FPS is well over 48. More time is needed for testing, obviously, and a single LCD with FreeSync isn't going to represent all future FreeSync displays. Don't try and draw conclusions from one sample, basically.
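
To make the frame re-sending idea from the comment above concrete, here is a minimal sketch of a presentation loop that re-scans the previous frame whenever the next one isn't ready before the panel's maximum hold time elapses. The names and structure are hypothetical; this illustrates the general technique, not AMD's actual driver logic.

```python
# Minimal sketch of driver-side frame re-sending below the VRR window.
# All names are hypothetical; this illustrates the technique, not AMD's driver.

VRR_MIN_HZ = 48                  # e.g. the LG 34UM67's lower bound
MAX_HOLD_S = 1.0 / VRR_MIN_HZ    # longest the panel may wait between refreshes

def choose_scanout(new_frame_ready, seconds_since_refresh, last_frame, new_frame):
    """Pick what to send to the panel on one iteration of the present loop."""
    if new_frame_ready:
        return new_frame         # normal adaptive-sync path: scan out immediately
    if seconds_since_refresh >= MAX_HOLD_S:
        return last_frame        # re-send the old frame to stay inside the VRR window
    return None                  # the panel can still hold; keep waiting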
  • chizow - Friday, March 20, 2015 - link

    @Jarred

    How is it not a major issue? You think that level of ghosting is acceptable and comparable to G-Sync!?!?! My, how your standards have dropped. If that is the case, I do not think you are qualified to write this review; at the least, post it under Editorial, or even better, under the AMD-sponsored banner.

    Fact is, below the stated minimum refresh, FreeSync is WORSE than a non-VRR monitor would be, as all the tearing and input lag is there AND you get awful flickering and ghosting too.

    And how do you know it is a limitation of panel technology when Nvidia's solution exhibits none of these issues at refresh rates as low as 20Hz, let alone at the higher refresh rates where AMD starts to experience them? Don't you have access to the sources and players here? I mean, we know you have AMD's side of the story, but why don't you ask these same questions of Nvidia, the scaler makers, and the monitor makers as well? It could certainly be a limitation of the spec, don't you think? If monitor makers are just designing a monitor to AMD's FreeSync spec, and AMD is claiming they can alleviate this via a driver update, it sounds to me like the limitation is in the specification, not the panel technology, especially when Nvidia's solution does not have these issues. In fact, if you had asked Nvidia, as PCPer did, they may very well have explained to you why FreeSync ghosts/flickers and their solution does not. From PCPer, again:

    " But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

    Science and hardware trumps hunches and hearsay, imo. :)

    Also, you might need to get with Ryan to fully understand the difference between G-Sync and FreeSync at low refresh rates. G-Sync simply displays the same frame twice. There is no sense of input lag; input lag would only arise if the next refresh of the panel were tied to a different input. That is not the case with G-Sync, because the held second frame is still tied to the input of the first frame, while the next live frame has live input. All you perceive is low FPS, not input lag. There is a difference; it would be like playing a game at 30FPS on a 60Hz monitor with no Vsync. Still, certainly much better than AMD's solution of having everything fall apart at a framerate that is still quite high and hard to attain for many video cards.

    The LG is a horrible situation; who wants to be tied to a solution that is only effective in such a tight framerate band? If you are actually going to do some "testing", why don't you test something meaningful, like a gaming session that shows the % of frames in any particular game, with a particular graphics card, that fall outside of the "supported" refresh rates? I think you will find the amount of time spent outside of these bands is actually pretty high in demanding games at the above-1080p resolutions on the market today.

    And you definitely see flicker at 35fps/35Hz on a G-Sync panel? Prove it. I have an ROG Swift and there is no flicker even as low as 20FPS, which is common in the CPU-limited MMO games out there. Not any noticeable flicker. You have access to both technologies; prove it. Post a video, post pictures, post the kind of evidence and do the kind of testing you would actually expect from a professional reviewer on a site like AT, instead of addressing the deficiencies in your article with hearsay and anecdotal evidence.

    Last part: again, I'd recommend running the test I suggested on multiple panels with multiple cards and mapping out the frame rates to see the % that fall outside or below these minimum FreeSync thresholds. I think you would be surprised, especially given many of these panels are above 1080p. Even that LG is only ~1.35x 1080p, but most of these panels are premium 1440p panels, and I can tell you for a fact a single 970/290/290X/980 class card is NOT enough to maintain 40+ FPS in many recent demanding games at high settings. And as of now, CF is not an option. So that's another strike against FreeSync: if you want to use it, your realistic options start at a 290/X, or there's the real possibility you are below the minimum threshold.

    Hopefully you don't take this too harshly or personally; while there are some directed comments in there, there's also a lot of constructive feedback. I have been a fan of some of your work in the past, but this is certainly not your best effort or an effort worthy of AT, imo. The biggest problem I have, and we've gotten into it a bit in the past, is that you repeat many of the same misconceptions that helped shape and perpetuate all the "noise" surrounding FreeSync. For example, you mention it again in this article, yet do we have any confirmation from ANYONE that existing scalers and panels can simply be flashed to FreeSync with a firmware update? If not, why bother repeating the myth?
  • Darkito - Friday, March 20, 2015 - link

    @Jarred

    What do you make of this PC Perspective article?

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    "

    G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module."

    Especially those last few sentences. You say AMD can just duplicate frames like G-Sync, but according to this article it's actually something in the G-Sync module that enables it. Is there truth to that? (See the sketch below for the arithmetic behind those measured refresh rates.)
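
For what it's worth, the refresh multiples PCPer reports (29 FPS → 58 Hz, 25 FPS → 50 Hz, 14 FPS → 56 Hz) are all consistent with the module picking the smallest whole-number multiple that keeps the effective refresh at or above roughly 50 Hz without exceeding the panel's maximum. That ~50 Hz floor is an inference from those three data points, not a figure NVIDIA has published; the sketch below just reproduces the arithmetic.

```python
# Sketch of the refresh-multiplier behavior PCPer measured on a 30-144Hz G-SYNC panel.
# The ~50Hz floor is inferred from their three data points, not confirmed by NVIDIA.

PANEL_MAX_HZ = 144
ASSUMED_FLOOR_HZ = 50   # hypothetical flicker-free target inferred from the measurements

def effective_refresh(fps):
    """Return (multiplier, refresh_hz): each frame is drawn `multiplier` times."""
    k = 1
    while fps * k < ASSUMED_FLOOR_HZ and fps * (k + 1) <= PANEL_MAX_HZ:
        k += 1
    return k, fps * k

for fps in (29, 25, 14):
    print(fps, effective_refresh(fps))   # -> (2, 58), (2, 50), (4, 56), matching PCPer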
  • Socketofpoop - Thursday, March 19, 2015 - link

    Not worth the typing effort. Chizow is a well known nvidia fanboy or possibly a shill for them. As long as it is green it is best to him. Bent over, cheeks spread and ready for nvidias next salvo all the time.
  • chizow - Friday, March 20, 2015 - link

    @Socketofpoop, I'm well known among AMD fanboys! I'm so flattered!

    I would ask this of you and the lesser-known AMD fanboys out there: if a graphics card had all the same great features, performance, and support, at the same prices Nvidia offers, but had an AMD logo and red cooler on the box, I would buy the AMD card in a heartbeat. No questions asked. Would you, if the roles were reversed? Of course not, because you're an AMD fan and brand preference obviously matters to you more than what is actually the better product.
  • Black Obsidian - Thursday, March 19, 2015 - link

    I hate to break it to you, but history has not been kind to the technically superior but proprietary and/or higher cost solution. HD-DVD, miniDisc, Laserdisc, Betamax... the list goes on.
  • JarredWalton - Thursday, March 19, 2015 - link

    Something else interesting to note is that there are 11 FreeSync displays already in the works (with supposedly nine more unannounced), compared to seven G-SYNC displays. In terms of numbers, FreeSync on the day of launch has nearly caught up to G-SYNC.
  • chizow - Thursday, March 19, 2015 - link

    Did you pull that off AMD's slide deck too, Jarred? What's interesting to note is that you list the FreeSync displays "in the works" without counting the G-Sync panels "in the works". And 3 monitors is now "nearly caught up to" 7? Right.

    A brand new panel is a big investment (not really), so I guess everyone should place their bets carefully. I'll bet on the market leader that holds a commanding share of the dGPU market, consistently provides the best graphics cards, great support and features, and isn't riddled with billions in debt and a gloomy financial outlook.
