Under The Hood for Displays: Custom Resolutions, Freesync Improvements, & Framerate Target Control

Continuing our look into Crimson’s new features, AMD has also implemented some new & improved functionality specifically targeted at displays. The company has been more aggressive about display technologies and features since embarking on their Freesync project, and that is reflected in some of the changes made here.

Custom Resolution Support

First and foremost, AMD has at long last implemented support for custom resolutions within their control panel. Custom resolution support is something of a niche feature – most users will never find it, let alone need it – but it's extremely useful for those users who do need it. In our own case, for example, we use this feature with our Sharp PN-K321 4K monitor in order to run 1440p@60Hz on it, as the monitor doesn't explicitly support that setting and Windows would rather upscale 1440p to 2160p@30Hz when left to its own devices.

Custom resolution support is another area where AMD is catching up with NVIDIA, as the latter has supported custom resolutions for several years now. In the meantime it’s been possible to use third-party utilities such as Custom Resolution Utility with AMD’s drivers to force the matter, but bringing support within AMD’s drivers is still a notable improvement.

AMD has never previously supported this feature, in part due to the very low but nonetheless real risk of damage. If given video settings it can't use, a properly behaving monitor should simply reject the input. However, not all devices are perfect, and it is possible (however unlikely) that a monitor could damage itself trying to run with unsupported settings. This is why, for both AMD and NVIDIA, custom resolutions come with a warning and are not covered under their respective warranties.
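To illustrate what's actually at stake with a custom mode, it helps to remember that a "resolution" is really a set of timings, and that the pixel clock a monitor must accept follows from the total raster size (active pixels plus blanking), not just the visible resolution. The sketch below uses illustrative, reduced-blanking-style figures – these are rough approximations for demonstration, not AMD's actual mode validation logic or an official CVT calculation:

```python
# Illustrative only: a custom mode is defined by its timings, and the
# pixel clock is determined by the total raster (active + blanking),
# not just the visible resolution.

def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz):
    """Pixel clock in MHz for a mode with the given timings."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440@60Hz with reduced-blanking-style figures (the blanking
# values here are assumptions for illustration):
clock = pixel_clock_mhz(2560, 1440, h_blank=160, v_blank=41, refresh_hz=60)
print(f"{clock:.1f} MHz")  # roughly 241.7 MHz
```

Asking a display for a pixel clock beyond what it (or the cable standard) supports is exactly the kind of out-of-spec input those driver warnings are about.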

On a side note, one interesting find was that this is one of the features not implemented in Radeon Settings. Rather, the custom resolution controls are part of the pared-down Catalyst Control Center, now called Radeon Additional Settings. Considering that AMD never supported custom resolutions until now, it's a bit surprising that they'd add the feature to CCC rather than Radeon Settings. But I suspect this has a lot to do with why CCC is still in use to begin with: not all of the necessary monitor controls are available in Radeon Settings at this time.

Freesync Improvements: Low Framerate Compensation

With Omega, AMD included initial support for Freesync, and now with Crimson the company is rolling out new Freesync functionality that changes how the technology works at the GPU level.

NVIDIA, never one to shy away from throwing barbs at the competition, has in the past called out AMD for how Freesync handles minimum refresh rates. Specifically, when the framerate falls below the minimum refresh rate of the monitor, Freesync setups revert to non-Freesync operation, either locking into v-sync operation or exhibiting traditional v-sync off style tearing, depending on whether v-sync was enabled. Though this was in some ways better than what NVIDIA offered at the time – users retained control over v-sync behavior – it meant that the benefits of Freesync were lost whenever the framerate fell below the minimum. NVIDIA, meanwhile, though not publishing exactly what they do, would seem to use some form of frame repeating to keep G-Sync active, repeating frames to maintain variable refresh rather than dropping to the monitor's minimum refresh rate.

This is something AMD appears to have taken to heart, and while they don’t specifically name NVIDIA in their presentation, all signs point to it being a reaction to NVIDIA’s barbs and marketing angle. As a result the Crimson driver introduces a new technology for Freesync which AMD is calling Low Framerate Compensation (LFC). LFC is designed to directly address what the GPU and Freesync monitor do when the framerate falls below the minimum refresh rate.

In AMD’s slide above, they list out the five refresh scenarios, including the two scenarios that LFC specifically applies to. So long as the framerate is above the minimum refresh rate, Freesync is unchanged. However, when the framerate falls below the minimum, AMD has instituted a series of changes to reduce judder. Unfortunately, not unlike NVIDIA, AMD is treating this as a “secret sauce” and isn’t disclosing exactly what they’re doing to alleviate the issue. However, based on what we’re seeing, AMD’s description, and the practical solutions available to the problem, our best guess is that AMD is implementing frame repeating to keep the instantaneous refresh rate above the monitor’s minimum.

Frame reuse is simple in concept but tricky in execution. Not unlike CrossFire, there’s a strong element of prediction here, as the GPU needs to guess when the next frame may be ready so that it can set the appropriate refresh rate and repeat a frame the appropriate number of times. Hence, in one of the few things they do say about the technology, AMD notes that they are implementing an “adaptive algorithm” to handle low framerate situations. Ultimately, if AMD does this right, then it should reduce judder both when v-sync is enabled and when it is disabled, by aligning frame repeats and the refresh rate such that the next frame isn’t unnecessarily delayed.
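Since AMD isn't disclosing the algorithm, the following is only our guess at the core idea – a minimal sketch (with a hypothetical function name) of frame multiplication, picking the smallest whole-number repeat count that keeps the instantaneous refresh rate inside the panel's variable refresh window:

```python
def lfc_repeat(content_fps, min_hz, max_hz):
    """Pick the smallest whole-number repeat count that lifts the
    effective refresh rate back inside the panel's variable refresh
    window. Returns (repeats, refresh_hz), or None if no multiple fits."""
    repeats = 1
    while content_fps * repeats < min_hz:
        repeats += 1
    refresh = content_fps * repeats
    if refresh > max_hz:
        return None  # range too narrow; fall back to fixed refresh
    return repeats, refresh

# Content at 20fps on a 30-75Hz panel: show each frame twice at 40Hz
print(lfc_repeat(20, 30, 75))  # (2, 40)
```

The hard part AMD alludes to isn't this arithmetic but the prediction: the repeat schedule has to be set before the next frame's delivery time is known, which is presumably where the "adaptive algorithm" comes in.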

The good news here is that this is a GPU-side change, so it doesn’t require any changes to existing monitors – they simply receive new variable refresh timings. However, in revealing a bit more about the technology, AMD does note that LFC is only enabled on monitors whose maximum refresh rate is greater than or equal to 2.5 times the minimum refresh rate (e.g. 30Hz to 75Hz), as AMD needs a wide enough variable refresh range to run at a multiple of framerates right at the edge of the minimum (e.g. 45fps). This means LFC can’t be used with Freesync monitors that have a narrow refresh rate range, such as the 48Hz to 75Hz models. Owners of those monitors don’t lose anything, but they also won’t gain anything from LFC.
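A quick sketch of AMD's stated 2.5x requirement (a hypothetical helper, not AMD's code) also shows why a narrow range is a problem:

```python
def lfc_supported(min_hz, max_hz):
    # AMD's stated requirement: max refresh >= 2.5x min refresh
    return max_hz >= 2.5 * min_hz

print(lfc_supported(30, 75))  # True:  75 >= 2.5 * 30
print(lfc_supported(48, 75))  # False: 75 <  2.5 * 48 (120)

# Why the narrow range fails: 45fps sits below a 48Hz minimum, but
# doubling it gives 90Hz, which overshoots the 75Hz maximum - there
# is no whole multiple of 45 inside the 48-75Hz window.
```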

As it stands we’ve only had a very limited amount of time to toy with Freesync on the new drivers, but what we’re seeing so far looks solid. We’re definitely curious to see how daily Freesync users respond to this.

Finally, along with the LFC news, for the Crimson driver release AMD has offered a brief update on the status of Freesync-over-HDMI, reiterating that the company is still working on the technology. AMD first demonstrated the concept at Computex 2015 back in June, and while they still have a long way to go before it can make it into a retail product, the company continues to believe adaptive-synchronization is a viable and meaningful addition for HDMI.

Framerate Target Control: Wider Ranges

Back in June for the launch of the Radeon R9 Fury X, AMD introduced a new frame limiting feature called Framerate Target Control (FRTC). FRTC offered an alternative to v-sync, allowing users to cap the framerate of a game at an arbitrary value selected via AMD’s control panel. While FRTC worked, it had one unfortunate limitation: it only operated over a very narrow range of 55fps to 95fps. Though this was sufficient to cap the framerate just below or just above 60fps, users have been asking for wider ranges to support higher framerate monitors or to limit a game to even lower framerates such as 30fps.

For Crimson AMD has gone ahead and widened the range of FRTC. It can now cap a game at anywhere between 30fps and 200fps, a range over four times as wide. At the same time, AMD mentioned in our briefing that they’ve also done some additional work to better restrict GPU clockspeeds when FRTC is in use, maximizing the power savings from limiting the amount of work the GPU does. The GPU will now operate at a lower clockspeed more often, increasing the amount of power saved versus letting a video card run uncapped.
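Conceptually, a framerate cap is just a pacing loop that inserts idle time between frames – and that idle time is exactly where the power savings come from, since the GPU can downclock while waiting. A minimal sketch (not AMD's driver-level implementation):

```python
import time

def run_capped(render, target_fps, frames):
    """Render `frames` frames, sleeping as needed so the loop never
    runs faster than `target_fps`. `render` draws one frame."""
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render()
        next_deadline += frame_time
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)  # idle time the GPU can spend downclocked

# Cap a trivial "frame" at 200fps for 20 frames: takes at least ~0.1s
start = time.perf_counter()
run_capped(lambda: None, target_fps=200, frames=20)
print(time.perf_counter() - start >= 0.095)  # True
```

Using an accumulating deadline rather than a fixed sleep per frame keeps the average rate on target even when individual frames take varying amounts of time to render.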

146 Comments

  • looncraz - Wednesday, November 25, 2015 - link

    The 7970 trashes the 680 in newer games. Shadow of Mordor is nearly twice as fast on a 7970. In addition, the 7970 has been given increased performance with the Crimson driver, whereas the 600 and 700 series have been left to rot.

    I suspect AMD will keep it up for the next few years since they both need the good will and will also still be selling cards based on GCN, so the fixes and improvements should apply almost universally.
  • Refuge - Friday, November 27, 2015 - link

    They just discontinued driver support for all pre-GCN GPUs, so much for that good will, eh? lol

    http://www.maximumpc.com/amd-ends-driver-support-f...
  • Oxford Guy - Thursday, November 26, 2015 - link

    I remember the superiority of two versions of its driver bricking cards, including expensive Fermi models.
  • looncraz - Tuesday, November 24, 2015 - link

    I don't like Adaptive V-sync at all; can't stand screen tearing. That, and with a 144Hz monitor pushing 100+fps, you won't have any lag of which to speak.

    When the system can't keep up with the monitor, you get tearing... not cool. If you're stuck at 60hz or can't push the frames, then you need to reduce settings or buy a better video card and monitor.
  • Zeus Hai - Wednesday, November 25, 2015 - link

    You're off the point, mate.

    144Hz, having constant 100+fps on screen (means a very powerful system), new card, new monitor, those are all hardware. And we're talking about driver optimization, software features.

    Reduce settings? You will still need Vsync if the fps goes over 60Hz, and what if somehow the fps still can't keep up with the Hz at reduced settings? Triple buffering will worsen the burden, and introduce more lag.

    So, adaptive vsync may seem to be simple at first, but actually has some deep thoughts down inside it.
  • looncraz - Wednesday, November 25, 2015 - link

    Why would I have a game fall below 60hz? Not sure I've had that happen since I've bought my R9 290, except for a bugged Crysis mod :p

    Even then, V-Sync on a 120~144hz monitor introduces a maximum 8ms latency, which is nominally 4ms (statistics and all), which is well inside the reaction time window. Games already have input lag built in, as well as compensation algorithms (for fast-paced games, like BF4). If the game falls below 60hz (16.7ms/frame), then the game itself is introducing dramatically more input lag than V-Sync.

    By turning off V-sync, you actually ARE NOT decreasing input lag, either, you are displaying a PARTIAL FRAME. This causes tearing. Monitors STILL only update at their refresh rate, only now you have to deal with tearing. The nominal benefit of turning off V-Sync, on 120hz display, is only 4ms, since that is the only cost from having V-Sync on.

    AMD's FreeSync or nVidia's GSync are more appropriate solutions, since the monitor itself will now respond more quickly. This means that the nominal benefit can be 8ms or more.
  • xdrol - Thursday, November 26, 2015 - link

    They actually do, just for some reason it's not enablable (that's a word from now) in the official interface. Install RadeonPro and tada.wav. Or get a FreeSync monitor.
  • Dalamar6 - Wednesday, November 25, 2015 - link

    Why look at that, brown nosing shills on a tech forum! Whod'a thunk it!

    Try using AMD on Arch linux. Or linux in general. There basically are no drivers.
    NVidia sucks donkey balls on linux too, but at least they HAVE DRIVERS that don't cripple your system entirely.

    AMD has not REALLY improved their drivers in a good 15+ years, why start now?
  • fluxtatic - Wednesday, November 25, 2015 - link

    Why would AMD put any resources into Linux? That sweet 1% marketshare they're missing out on must really be hurting them. Wait, .5% or less, since NVidia and Intel are in it, too.

    The whiny entitlement of the Linux crowd gets really, really tired after a while.
  • Oxford Guy - Thursday, November 26, 2015 - link

    The fanbois are just desperate for any pretext to bash. Apparently Linux is now the low-hanging fruit.
