FreeSync Over HDMI to Hit Retail in Q1’16

After pushing DisplayPort FreeSync out the door earlier this year, AMD began demonstrating a further FreeSync proof-of-concept back at Computex 2015: FreeSync over HDMI.

Implemented over a customized version of HDMI 1.4a and utilizing a prototype Realtek timing controller (TCON), AMD was able to demonstrate variable refresh rate technology running over HDMI. At the time AMD was very clear that the purpose of the demonstration was to shop the concept around and influence the various members of the HDMI consortium, but they were equally clear that bringing variable refresh rate technology to HDMI was something the company wanted to see at retail sooner rather than later.

Sooner, as it turns out, was the operative word there. As part of their presentation last week, RTG has announced that FreeSync over HDMI will be heading to retail, and that it will be doing so very soon: Q1’16. This is just a year after the first DisplayPort adaptive sync monitors hit retail, which for a display technology is a rather speedy turnaround from proof of concept to retail product.

Now there are some key technical differences from FreeSync over DisplayPort (FS-DP) that should be noted here. Unlike FS-DP, which was just AMD’s implementation of DisplayPort adaptive sync on their GPUs and software stack, FS-HDMI is not an open standard, at least not at this time. HDMI does not have a variable refresh rate standard, and while RTG is pushing to have one included in a future version of HDMI, the HDMI consortium moves too slowly for RTG’s tastes. As a result RTG is looking to go it alone, and will be implementing FS-HDMI by creating a vendor specific extension for HDMI.

The use of vendor specific extensions is perfectly legal within the HDMI standard, but it does mean that FS-HDMI is proprietary, at least until such time as the HDMI standard adopts a common variable refresh rate standard. This means that FS-HDMI monitors will need to support RTG’s proprietary extensions, which in turn requires TCON/monitor vendors to work a bit more closely with RTG than was necessary with FS-DP. Meanwhile RTG for their part hasn’t yet decided what to do about the proprietary nature of their implementation – they are open to sharing it, but they also want to retain control and avoid any scenario that results in outright balkanization of HDMI variable refresh rate technology. The fact that it’s an RTG-controlled specification calls into question whether any other GPU vendor would want to implement it in the first place – so concerns about openness may prove to be moot – but it does mean that it’s going to be up to RTG to make or break FS-HDMI.
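RTG hasn’t published the details of their extension, but HDMI’s general vendor-specific mechanism works through Vendor-Specific InfoFrames: small packets tagged with a 3-byte IEEE OUI identifying the vendor, followed by a vendor-defined payload. As a rough sketch of that packet format (the OUI and the one-byte refresh payload below are purely illustrative, not anything AMD has disclosed):

```python
def build_vendor_infoframe(oui: int, payload: bytes) -> bytes:
    """Pack a Vendor-Specific InfoFrame: type code 0x81, version 1,
    a length byte, a checksum, then a 3-byte little-endian IEEE OUI
    followed by the vendor-defined payload."""
    body = bytes([oui & 0xFF, (oui >> 8) & 0xFF, (oui >> 16) & 0xFF]) + payload
    header = bytes([0x81, 0x01, len(body)])
    # InfoFrame checksum: all bytes (header + checksum + body) must sum to 0 mod 256
    checksum = (-sum(header) - sum(body)) & 0xFF
    return header + bytes([checksum]) + body

# Hypothetical example: a vendor signaling a 48Hz refresh request in a one-byte payload
frame = build_vendor_infoframe(0x000001, bytes([48]))
```

Because the OUI namespaces the packet, any number of vendors can define their own extensions without colliding – which is exactly how a proprietary feature like FS-HDMI can ride on top of standard HDMI 1.4a hardware, so long as both the GPU and the TCON agree on the payload.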

Perhaps more surprising, and certainly a feather in RTG’s cap, is that RTG has brought so many TCON vendors on-board so early. Along with Realtek, both Novatek and Mstar will be producing TCONs that support FS-HDMI, so TCONs will be available from multiple vendors relatively quickly. With variable refresh rate tech it’s the TCONs that really decide whether the tech is supported, so this is an important set of partnerships for RTG to lock in so soon. Meanwhile traditional AMD/RTG display partners such as Acer, LG, and Samsung will be producing retail monitors with FS-HDMI capabilities.

Meanwhile RTG isn’t yet talking about GPU compatibility in great detail; however, it sounds like FS-HDMI support will be brought over to some of RTG’s current GPUs. Most likely these are the GCN 1.1+ Radeon 300 series cards, GCN 1.1 also being the minimum requirement for FS-DP. AMD’s Carrizo APU should also support the technology, and RTG is specifically promoting that notebooks implementing an APU + dGPU Radeon dual graphics configuration will also support FS-HDMI, an important development especially given the fact that DisplayPort support is non-existent on consumer AMD laptops.

In fact the lack of DisplayPort availability in displays overall is a big part of why RTG has pursued this. According to numbers from RTG, only about 30% of all monitors sold include DisplayPort, while the other 70% implement only HDMI or HDMI + DVI. Consequently FS-DP is an inherently limited market, and the majority of monitor buyers will never be able to use FS-DP. Meanwhile from what I hear the actual cost of implementing variable refresh rate support on a TCON is very low, which means that RTG could get far greater penetration for FreeSync by extending it to support HDMI, not to mention bringing down the overall cost of entry-level FreeSync monitors. We’re still talking about a highly price sensitive commodity market – after all, there’s a reason that most monitors don’t ship with DisplayPort – but if the costs of adding FreeSync are as low as RTG hints, then there is a market among consumers who would spend a bit more on a variable refresh rate monitor but don’t know anything about display I/O standards beyond HDMI.

Finally, along those lines, it should be no surprise that the first FS-HDMI monitors that have been announced are all focused on lower cost and lower resolution displays. That FS-HDMI is being implemented over HDMI 1.4 immediately rules out 4K monitors, so instead all of the announced monitors are either 1080p or ultra-wide 21:9 models at 2560x1080 and 3440x1440. Otherwise there are a few more unknowns here that I expect we’ll see addressed ahead of the Q1 launch, particularly which monitors will support a wide enough range of refresh rates for low framerate compensation to work.
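For reference, low framerate compensation (LFC) works by repeating frames when the game’s framerate falls below the panel’s minimum refresh rate, which only works if the maximum refresh is at least roughly double the minimum. A rough sketch of the idea – not AMD’s actual algorithm – looks like this:

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    # LFC needs the max refresh to be at least ~2x the min, so a repeated
    # frame can always land the effective refresh rate inside the panel's range.
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Pick a refresh rate for the panel given the game's framerate."""
    if fps >= min_hz:
        return min(fps, max_hz)   # in range: native variable refresh
    if not supports_lfc(min_hz, max_hz):
        return min_hz             # below range, no LFC: fall back to fixed refresh
    n = 2
    while fps * n < min_hz:       # repeat each frame n times to get back in range
        n += 1
    return fps * n
```

So a hypothetical 30-144Hz panel can show 25fps content at a smooth 50Hz by doubling frames, whereas a panel with a narrow 40-60Hz range fails the 2x test and simply can’t do LFC – which is why the supported range of each announced monitor matters.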

FreeSync Laptops: Shipping Now

Along with the FreeSync over HDMI announcement, RTG also used their event to announce the first FreeSync-capable laptop, the Lenovo Y700. One of the models of the laptop ships with a variable refresh rate capable 15.6” 1080p IPS panel, and when paired up with a Carrizo APU and R9 M380 GPU, can utilize FreeSync to control the refresh rate. The one notable limitation here is that while this is otherwise a rather typical DisplayPort adaptive sync setup within a laptop, the specific panel being used only supports a range of 40Hz to 60Hz, so the first FreeSync laptop has a narrow effective range and can’t support LFC.

Comments

  • Azix - Wednesday, December 9, 2015 - link

    GCN is just a name. It does not mean there aren't major improvements. Nvidia constantly changing their architecture name is not necessarily an indication it's better; it's usually the same improvements over an older arch.

    Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up.
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Azix: "Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up."

    I wouldn't go that far. nVidia is simply focused differently than ATi at the moment. ATi gave up compute performance for gaming back in the 6xxx series and brought compute performance back with the 7xxx series. Given AMD's HSA initiative, I doubt we'll see them make that sacrifice again.

    nVidia on the other hand decided to do something similar going from Fermi to little Kepler (6xx series). They brought compute back to some extent for big Kepler (high end 7xx series), but dropped it again for Maxwell. This approach does make some sense as the majority of the market at the moment doesn't really care about DP compute. The ones that do can get a Titan, a Quadro, or if the pattern holds, a somewhat more DP capable consumer grade card once every other generation. On the other hand, dropping the DP compute hardware allows them to more significantly increase gaming performance at similar power levels on the same process. In a sense, this means the gamer isn't paying as much for hardware that is of little or no use to gaming.

    At the end of the day, it is nVidia that seems to be ahead in the here and now, though not by as much as some suggest. It is possible that ATi is ahead when it comes to DX12 gaming and their cards may age more gracefully than Maxwell, but that remains to be seen. More important will be how the tech lays out when DX12 games are common. Even then, I don't think that Maxwell will have as much of a problem as some fear given that DX11 will still be an option.
  • Yorgos - Thursday, December 10, 2015 - link

    What are the improvements that Maxwell offers?
    That it can run the binary blobs from crapworks better?
    e.g. lightning effects http://oi64.tinypic.com/2mn2ds3.jpg
    or efficiency
    http://www.overclock.net/t/1497172/did-you-know-th...
    or 3.5 GB vram,
    or obsolete architecture for the current generation of games(which has already started)

    Unless you have money to waste, there is no other option in the GPU segment.
    A lot of GTX 700 series owners say so (amongst others).
  • Michael Bay - Thursday, December 10, 2015 - link

    I love how you conveniently forgot to mention not turning your case into a working oven, and then grasped sadly for muh 3.5 GBs as if it matters in the real world with overwhelming 1080p everywhere.

    And then there is an icing on the cake in the form of hopeless wailing about the "current generation of games (which has already started)". You sorry amdheads really don't see irony even if it hits you in the face.
  • slickr - Friday, December 11, 2015 - link

    Nvidia still pretty much uses the same techniques/technology as they had in their old 600 series graphics. Just because they've named the architecture differently doesn't mean it is different.

    AMD's major architectural change will be in 2016 when they move to 14nm FinFET, as will Nvidia's when they move to 16nm FinFET.

    AMD already has design elements like HBM in their current GCN Fiji architecture that they can more easily implement for their new GPU's which are supposed to start arriving in late Q2 2016.
  • Zarniw00p - Tuesday, December 8, 2015 - link

    FS-HDMI for game consoles, DP 1.3 for Apple who would like to update their Mac Pros with DP 1.3 and 5k displays.
  • Jtaylor1986 - Tuesday, December 8, 2015 - link

    This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups. I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016.
  • bug77 - Tuesday, December 8, 2015 - link

    Well, most of the stuff a video card does depends on action by the display manufacturer...
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Jtaylor1986: "This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups."

    That's how it works when you want a non-proprietary solution that allows your competitors to use it as well. ATi doesn't want to give away their tech any more than nVidia does. However, they also realize that Intel is the dominant graphics manufacturer in the market. If they can get Intel on board with a technology, then there is some assurance that the money invested isn't wasted.

    @Jtaylor1986: "I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016."

    Point of interest: It is hard to get competitors to work together. They don't just come together and do it. Getting Acer, LG, and Samsung to standardize on a tech (particularly a proprietary one) means that there has already been significant effort invested in the tech. Also, getting Mstar, Novatek, and Realtek to make compatible hardware is similar to getting nVidia and ATi or AMD and Intel to do the same. IBM forced Intel's hand. Microsoft's DirectX standard forced ATi and nVidia (and 3DFX for that matter) to work in a compatible way.

    Beyond that, it isn't as if ATi is doing nothing. It is simply that their work requires cooperation from all parties involved. Cooperation that they've apparently obtained. This is what is required when you think about the experience beyond just your hardware. nVidia does similar with their Tesla partners, The Way it's Meant to be Played program, and CUDA support. Sure, they built the chip themselves with gsync, but they still had to get support from monitor manufacturers to get the chip into a monitor.
  • TelstarTOS - Thursday, December 10, 2015 - link

    and the display industry has been EXTREMELY slow at producing BIG, quality displays, not even counting 120 and 144Hz. I don't see them embracing HDR (10-12 bit panels) anytime soon :(
    My monitor bought last year is going to last a while, until they catch up with the video card makers.
