Today at AMD's Future of Compute event in Singapore, AMD announced partnerships with several companies. One of the more noteworthy announcements is that Samsung will be making FreeSync enabled displays that should be available in March 2015. The displays consist of the 23.6" and 28" UD590, and there will be 23.6", 28", and 31.5" variants of the UE850. These are all UHD (4K) displays, and Samsung has stated their intention to support Adaptive-Sync (and thereby FreeSync) on all of their UHD displays in the future.

FreeSync is AMD's alternative to NVIDIA's G-SYNC, with a few key differences. The biggest difference is that AMD proposed an extension to DisplayPort called Adaptive-Sync, and VESA accepted this extension as an amendment to the DisplayPort 1.2a specification. Adaptive-Sync is thus an open standard that FreeSync leverages to enable variable refresh rates. As for system requirements, beyond a display that supports DisplayPort Adaptive-Sync you need a supported AMD GPU with a DisplayPort connection and an AMD driver with FreeSync support.

FreeSync is also royalty free, which should help manufacturers control costs on FreeSync capable displays. There are other costs to creating a display that can support Adaptive-Sync, naturally, so we wouldn't expect price parity with existing LCDs in the near term. On the FreeSync FAQ, AMD notes that the manufacturing and validation requirements to support variable refresh rates without visual artifacts are higher than for traditional LCDs, and thus cost-sensitive markets will likely hold off on adopting the standard for now. Over time, however, if Adaptive-Sync catches on, economies of scale will come into play and we could see widespread adoption.

Being an open standard does have its drawbacks. NVIDIA was able to partner with display manufacturers to develop and deploy G-SYNC about a year ago, and a 4K 60Hz G-SYNC display (Acer's XB280HK) and a QHD 144Hz G-SYNC display (ASUS' ROG Swift PG278Q) have now been shipping for several months. In many ways G-SYNC demonstrated the viability of adaptive refresh rates, but regardless of who gets credit, the technology is quite exciting. If Adaptive-Sync does gain traction, as an open standard there's nothing to stop NVIDIA from supporting it and altering G-SYNC to work with Adaptive-Sync displays, but we'll have to wait and see on that front.

Pricing for the Samsung displays has not been announced, though the existing UD590 models tend to cost around $600 for the 28" version. I'd expect the Adaptive-Sync enabled monitors to carry at least a moderate price premium, but we'll see when they become available sometime around March 2015.

Source: AMD

73 Comments

  • chizow - Friday, November 21, 2014 - link

    Intel sets any price they want? This is nonsense and an internet tech meme that needs to die. What is the most you have ever spent on a CPU? What would compel you to spend considerably more than that over just using the CPU that you already have? What kind of performance gain would you need Intel to show just to buy another CPU at that same price point? There are very simple economic factors that dictate Intel's pricing even in a virtual monopoly, such as discretionary income, substitutes and the price elasticity of demand. You can see it in the comments section every time they release a new CPU, every single consumer is making those decisions in real-time and the discussion doesn't end in: "Intel can make a $1000 CPU and I would be compelled to buy it" given that has been happening for years and that obviously isn't the case.

    And regarding the 8800GT, you're wrong: Nvidia was dominating AMD for over a year with the 8800GTX and 8800GTS while R600 was repeatedly delayed and, ultimately, non-competitive. In the face of no actual competition, Nvidia did the unthinkable: they launched nearly flagship performance at a fraction of the cost, $230, which lifted them to even greater heights. So yes, even in the face of no competition, the statement that competition is always good and necessary is not universally true.
  • dragonsqrrl - Monday, November 24, 2014 - link

    @legokill101: Intel has essentially maintained the same price points since Conroe, price points AMD used to occupy, all while consistently improving performance and efficiency. No one is saying monopoly is a good thing, but your assertion that Intel can and is charging whatever they want for their CPUs is simply baseless. That same argument parroted by AMD fanboys for the past ~8 years is in fact getting cause/effect mixed up.
  • Alexey291 - Sunday, November 23, 2014 - link

    You certainly sound like a preaching moron, i.e. the worst kind.

    A moron who thinks he knows some TRUTH and must spread it to the masses.

    Go away please.
  • chizow - Monday, November 24, 2014 - link

    What am I preaching? Reality? That's the best kind; maybe a bigger dose of it for AMD fanboys will spare us from having to wade through all these nonsensical half-truths and memes in the future.

    Feel free to post something worthwhile, until then I'll be right here thanks.
  • BillyHerrington - Tuesday, November 25, 2014 - link

    Since when did AnandTech turn into the likes of Fudzilla and WCCFTech, where everyone calls each other fanboys and stupid?
  • Horza - Wednesday, November 26, 2014 - link

    @Alex291
    Seems like you've got his number, straight into a rant about giving everyone a dose of reality.
  • DanaGoyette - Friday, November 21, 2014 - link

    23.6 inches and "4K" (assuming 3840x2160) -- that's 186.69 PPI. Sounds amazing!

    Yes, some software breaks under scaling, but I'm okay with that. I've been happy with 144PPI laptops (150% scaling) since 2008, and this 200% scaling should get rid of the blurriness.

    I hope these will also have a 120Hz strobed backlight mode, and 8-bit color depth. My current XL2420TE has amazing motion in strobed mode, but also has horrible banding.
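[Ed: for anyone who wants to double-check the pixel density figure above, here's a quick sketch. It assumes, as the comment does, that the 23.6" model is a 3840x2160 panel and that the quoted size is the diagonal.]

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel-count diagonal divided by physical diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_in

# 23.6" UHD panel, as assumed in the comment above
print(round(ppi(3840, 2160, 23.6), 2))  # → 186.69
```

The same function gives roughly 157 PPI for the 28" UD590, which is why the smaller model is the more interesting one for 200% scaling.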
  • Wolfpup - Friday, November 21, 2014 - link

    From interviews with both companies, it really sounds like G-Sync has some serious advantages...but with it costing much more to implement and not being royalty free, I can't see it going anywhere.

    And where are the non-TN monitors that support it?

    FreeSync at least seems like it might happen... except practically speaking, if Nvidia doesn't support it (I can't remember if they will) it does me no good, as I need to see AMD delivering YEARS of Nvidia-quality drivers before I consider them again... and I need to see them supporting old hardware like Nvidia does.
  • MrSpadge - Friday, November 21, 2014 - link

    FreeSync might really take off if Intel adopted it. Not because their GPUs would be so strong, but because they might be able to combine it with any GPU via Lucid Hydra.
  • chizow - Saturday, November 22, 2014 - link

    Hydra? Really? Haven't heard that name in awhile. Dead tech is dead. I almost thought about using Lucid VirtuMVP on my 4770K, almost.

    But yeah if you want to take the chance, Intel is selling Larrabee Xeon Phis for dirt cheap right now!

    https://software.intel.com/en-us/articles/special-...
