Under The Hood for Displays: Custom Resolutions, Freesync Improvements, & Framerate Target Control

Continuing our look into Crimson’s new features, AMD has also implemented some new & improved functionality specifically targeted at displays. The company has been more aggressive about display technologies and features since embarking on their Freesync project, and that is reflected in some of the changes made here.

Custom Resolution Support

First and foremost, AMD has at long last implemented support for custom resolutions within their control panel. Custom resolution support is something of a niche feature – most users will never need it, let alone go looking for it – but it's extremely useful for those users who do need it. In our own case, for example, we use this feature with our Sharp PN-K321 4K monitor in order to run 1440p@60Hz on it, as the monitor doesn't explicitly support that setting and Windows would rather upscale 1440p to 2160p@30Hz when left to its own devices.
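To put some rough numbers to what such a mode entails, the sketch below derives the pixel clock, the figure a custom mode ultimately asks a display to accept, from the mode's total active-plus-blanking timings. The blanking figures are approximations in the style of CVT reduced blanking, used here purely for illustration; they are not values taken from AMD's tool.

```python
# Rough sketch: deriving the pixel clock a custom mode asks of the display.
# Timing figures approximate CVT reduced-blanking values for 2560x1440@60Hz;
# the exact numbers a driver generates may differ.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

h_total = 2560 + 160   # 2560 active pixels + ~160 pixels of horizontal blanking
v_total = 1440 + 41    # 1440 active lines + ~41 lines of vertical blanking

print(f"{pixel_clock_mhz(h_total, v_total, 60):.1f} MHz")  # ~241.7 MHz
```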

Custom resolution support is another area where AMD is catching up with NVIDIA, as the latter has supported custom resolutions for several years now. In the meantime it has been possible to use third-party utilities such as Custom Resolution Utility with AMD's drivers to force the matter, but bringing support into AMD's drivers proper is still a notable improvement.

AMD has never previously supported this feature, in part due to the very low but nonetheless real risk of damage. If given video settings it can't use, a properly behaving monitor should simply reject the input. Not all monitors are perfect, however, and it is possible (if unlikely) that a monitor could damage itself trying to run with unsupported settings. This is why, for both AMD and NVIDIA, custom resolutions come with a warning and are not covered under their respective warranties.

On a side note, it was interesting to find that this is one of the features not implemented in Radeon Settings. Rather, the custom resolution control panel is part of the pared-down Catalyst Control Center, now called Radeon Additional Settings. Considering that AMD never supported custom resolutions until now, it's a bit surprising that they'd add it to CCC rather than Radeon Settings. But I suspect this has a lot to do with why CCC is still in use to begin with: not all of the necessary monitor controls are available in Radeon Settings at this time.

Freesync Improvements: Low Framerate Compensation

With Omega, AMD included initial support for Freesync, and now with Crimson AMD is rolling out some new Freesync functionality that changes how the technology works at the GPU level.

NVIDIA, never one to shy away from throwing barbs at the competition, has in the past called out AMD for how Freesync handles minimum refresh rates. Specifically, when the framerate falls below the minimum refresh rate of the monitor, Freesync setups would revert to non-Freesync operation, either locking into v-sync operation or tearing in traditional v-sync off fashion, depending on whether v-sync was enabled. Though the v-sync control this offered was in some ways better than what NVIDIA offered at the time, it meant that the benefits of Freesync were lost whenever the framerate fell below the minimum. NVIDIA meanwhile, though not publishing exactly what they do, appears to use some form of frame repeating to keep G-Sync active below the minimum rather than dropping to fixed operation at the monitor's minimum refresh rate.

This is something AMD appears to have taken to heart, and while they don’t specifically name NVIDIA in their presentation, all signs point to it being a reaction to NVIDIA’s barbs and marketing angle. As a result the Crimson driver introduces a new technology for Freesync which AMD is calling Low Framerate Compensation (LFC). LFC is designed to directly address what the GPU and Freesync monitor do when the framerate falls below the minimum refresh rate.

In AMD's slide above, they list out the five refresh scenarios, and the two scenarios that LFC specifically applies to. So long as the framerate is above the minimum refresh rate, Freesync is unchanged. However, when the framerate falls below the minimum, AMD has instituted a series of changes to reduce judder. Unfortunately, not unlike NVIDIA, AMD is treating this as a "secret sauce" and isn't disclosing what exactly they're doing to alleviate the issue. However, based on what we're seeing, AMD's description, and the practical solutions available for the problem, our best guess is that AMD is implementing frame repeating to keep the instantaneous refresh rate above the monitor's minimum.

Frame reuse is simple in concept but tricky in execution. Not unlike CrossFire, there's a strong element of prediction here, as the GPU needs to guess when the next frame may be ready so that it can set the appropriate refresh rate and repeat a frame the appropriate number of times. Hence, in one of the few things they do say about the technology, AMD notes that they are implementing an "adaptive algorithm" to handle low framerate situations. Ultimately, if AMD does this right, then it should reduce judder both when v-sync is enabled and when it is disabled, by aligning frame repeats and the refresh rate such that the next frame isn't unnecessarily delayed.
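Since AMD isn't saying exactly how their adaptive algorithm works, the following is strictly a sketch of the general frame-repeating idea as we understand it: given a predicted framerate below the panel's minimum, pick the smallest repeat count that lands the effective refresh rate back inside the panel's variable refresh window. The function name and structure are our own illustration, not AMD's implementation.

```python
from typing import Optional

def pick_repeat_count(predicted_fps: float, min_hz: float, max_hz: float) -> Optional[int]:
    """Smallest number of times to show each frame so the effective refresh
    rate (predicted_fps x repeats) lands within [min_hz, max_hz]; None if no
    such multiple exists."""
    if predicted_fps <= 0:
        return None
    if predicted_fps >= min_hz:
        return 1  # already inside the variable refresh window
    repeats = 2
    while predicted_fps * repeats <= max_hz:
        if predicted_fps * repeats >= min_hz:
            return repeats
        repeats += 1
    return None  # window too narrow: fall back to fixed-refresh behavior

# e.g. a frame predicted to take 40ms (25fps) on a 30-75Hz panel:
# show it twice and refresh at ~50Hz, keeping variable refresh active.
print(pick_repeat_count(25, 30, 75))  # 2
```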

The good news here is that this is a GPU-side change, so it doesn't require any changes to existing monitors – they simply receive new variable refresh timings. However in revealing a bit more about the technology, AMD does note that LFC is only enabled with monitors whose maximum refresh rate is greater than or equal to 2.5 times the minimum refresh rate (e.g. 30Hz to 75Hz), as AMD needs a wide enough variable refresh range to run at a multiple of framerates that sit right at the edge of the minimum (e.g. 45fps). This means LFC can't be used with Freesync monitors that have a narrow refresh range, such as the 48Hz to 75Hz models. Owners of those monitors don't lose anything, but they also won't gain anything from LFC.
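Running the hypothetical helper from the sketch above against the two ranges illustrates why the ratio matters: for a framerate sitting just under the minimum, a wide range always leaves a usable multiple, while a narrow range can be left with no multiple that fits.

```python
# 29fps on a 30-75Hz panel: doubling lands at 58Hz, inside the range.
print(pick_repeat_count(29, 30, 75))   # 2

# 47fps on a narrow 48-75Hz panel: 47Hz is below the minimum and
# 94Hz is above the maximum, so no multiple fits and LFC can't engage.
print(pick_repeat_count(47, 48, 75))   # None
```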

As it stands we've only had a very limited amount of time to toy with Freesync on the new drivers, but what we're seeing so far looks solid. We're definitely curious to see how daily Freesync users respond to this.

Finally, along with the LFC news, for the Crimson driver release AMD has offered a brief update on the status of Freesync-over-HDMI, reiterating that the company is still working on the technology. AMD first demonstrated the concept at Computex 2015 back in June, and while they still have a long way to go before it can make it into a retail product, the company continues to believe adaptive synchronization is a viable and meaningful addition for HDMI.

Framerate Target Control: Wider Ranges

Back in June, for the launch of the Radeon R9 Fury X, AMD introduced a new frame limiting feature called Framerate Target Control (FRTC). FRTC offered an alternative to v-sync, allowing users to cap a game at an arbitrary framerate selected via AMD's control panel. While FRTC worked, it had one unfortunate limitation: it only operated over a very limited range – 55fps to 95fps. Though this was sufficient to cap the framerate right below 60fps or directly above it, users have been asking for wider ranges to support higher refresh rate monitors or to limit a game to even lower framerates such as 30fps.

For Crimson AMD has gone ahead and widened the range of FRTC. It can now cap a game at anywhere between 30fps and 200fps, a range over four times as wide. At the same time, AMD mentioned in our briefing that they've also done some additional work to better restrict GPU clockspeeds when FRTC is in use, maximizing the power savings from limiting the amount of work the GPU does. The GPU will now operate at a lower clockspeed more often, increasing the amount of power saved versus letting a video card run uncapped.
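The cap itself is enforced inside the driver, but the underlying idea is simple frame pacing. Below is a conceptual sketch, not AMD's implementation, of capping a render loop at a target framerate; render_frame is a hypothetical stand-in for whatever work an application does per frame.

```python
import time

def run_capped(render_frame, target_fps: float = 60.0, frames: int = 300) -> None:
    """Render a fixed number of frames, sleeping out the remainder of each
    frame's time budget so the loop never exceeds target_fps."""
    frame_budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Idle instead of starting the next frame early; this is the window
            # in which the GPU can drop to a lower clockspeed and save power.
            time.sleep(frame_budget - elapsed)
```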
