Under The Hood for Displays: Custom Resolutions, Freesync Improvements, & Framerate Target Control

Continuing our look into Crimson’s new features, AMD has also implemented some new & improved functionality specifically targeted at displays. The company has been more aggressive about display technologies and features since embarking on their Freesync project, and that is reflected in some of the changes made here.

Custom Resolution Support

First and foremost, AMD has at long last implemented support for custom resolutions within their control panel. Custom resolution support is something of a niche feature – most users will never find it, let alone need it – but it's extremely useful for those who do. In our own case, for example, we use this feature with our Sharp PN-K321 4K monitor in order to run 1440p@60Hz on it, as the monitor doesn't explicitly support that setting and Windows would rather upscale 1440p to 2160p@30Hz when left to its own devices.
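Setting up a custom mode ultimately comes down to timing arithmetic: the active resolution plus the blanking intervals, multiplied by the refresh rate, determines the pixel clock the display must accept. As a rough illustration of the math involved – the blanking values below are our own assumptions in the spirit of CVT reduced blanking, not AMD's defaults:

    # Rough sketch of the pixel-clock math behind a custom resolution.
    # The blanking values are illustrative assumptions (in the spirit
    # of CVT reduced blanking), not AMD's actual defaults.

    def pixel_clock_mhz(h_active, v_active, refresh_hz,
                        h_blank=160, v_blank=42):
        """Pixel clock (MHz) for a mode with the given active area,
        refresh rate, and assumed blanking intervals."""
        h_total = h_active + h_blank
        v_total = v_active + v_blank
        return h_total * v_total * refresh_hz / 1e6

    # The 1440p@60Hz mode we run on the PN-K321:
    print(f"{pixel_clock_mhz(2560, 1440, 60):.2f} MHz")  # ~241.86 MHz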

Custom resolution support is another area where AMD is catching up with NVIDIA, as the latter has supported custom resolutions for several years now. In the meantime it has been possible to force the matter with third-party utilities such as Custom Resolution Utility, but bringing support into the drivers themselves is still a notable improvement.

AMD had not previously supported this feature in part due to the very low but nonetheless real risk of damage. If given video settings it can't use, a properly behaving monitor should simply reject the input. However not all devices are perfect, and it is possible (however unlikely) that a monitor could damage itself trying to run with unsupported settings. This is why, for both AMD and NVIDIA, custom resolutions come with a warning and are not covered under their respective warranties.

On a side note, one interesting discovery is that this is among the features not implemented in Radeon Settings. Rather, the custom resolution control panel is part of the pared-down Catalyst Control Center, now called Radeon Additional Settings. Considering that AMD never supported custom resolutions until now, it's a bit surprising that they'd add the feature to CCC rather than Radeon Settings. But I suspect this has a lot to do with why CCC is still in use to begin with: not all of the necessary monitor controls are available in Radeon Settings at this time.

Freesync Improvements: Low Framerate Compensation

With Omega, AMD included initial support for Freesync, and now with Crimson the company is rolling out new Freesync functionality that changes how the technology works at the GPU level.

NVIDIA, never one to shy away from throwing barbs at the competition, has in the past called out AMD for how Freesync handles minimum refresh rates. Specifically, when the framerate falls below the minimum refresh rate of the monitor, Freesync setups revert to non-Freesync operation, either locking into v-sync operation or exhibiting traditional v-sync-off style tearing, depending on whether v-sync is enabled. Though the ability to choose that v-sync behavior was in some ways better than what NVIDIA offered at the time, it meant that the benefits of Freesync were lost whenever the framerate fell below the minimum. Meanwhile NVIDIA, though not publishing exactly what they do, appears to use some form of frame repeating to keep G-Sync active rather than falling back to the minimum refresh rate.

This is something AMD appears to have taken to heart, and while they don’t specifically name NVIDIA in their presentation, all signs point to it being a reaction to NVIDIA’s barbs and marketing angle. As a result the Crimson driver introduces a new technology for Freesync which AMD is calling Low Framerate Compensation (LFC). LFC is designed to directly address what the GPU and Freesync monitor do when the framerate falls below the minimum refresh rate.

In the slide above, AMD lists the five refresh scenarios, including the two that LFC specifically applies to. So long as the framerate is above the minimum refresh rate, Freesync is unchanged. However when the framerate falls below the minimum, AMD has instituted a series of changes to reduce judder. Unfortunately, not unlike NVIDIA, AMD is treating this as a “secret sauce” and isn't disclosing exactly what they're doing to alleviate the issue. However based on what we're seeing, AMD's description, and the practical solutions to the problem, our best guess is that AMD is implementing frame repeating to keep the instantaneous refresh rate above the monitor's minimum.

Frame reuse is simple in concept but tricky in execution. Not unlike CrossFire, there's a strong element of prediction here, as the GPU needs to guess when the next frame will be ready so that it can set the appropriate refresh rate and repeat a frame the appropriate number of times. Hence, in one of the few things they do say about the technology, AMD notes that they are implementing an “adaptive algorithm” to handle low framerate situations. Ultimately, if AMD does this right, it should reduce judder both when v-sync is enabled and when it is disabled, by aligning frame repeats and the refresh rate such that the next frame isn't unnecessarily delayed.
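To make our guess concrete, here is a minimal sketch of the frame-repeating idea described above. Everything in it is our own illustration: the function, its inputs, and the simple multiplier search are hypothetical stand-ins for whatever adaptive algorithm AMD actually uses.

    # Minimal sketch of frame repeating for low framerate compensation.
    # This is our own illustration, not AMD's algorithm; a real driver
    # would also have to predict when the next frame will arrive.

    def lfc_refresh_hz(predicted_fps, min_hz, max_hz):
        """Pick a refresh rate to program for the next frame."""
        if predicted_fps >= min_hz:
            # Normal Freesync range: refresh rate tracks the framerate.
            return predicted_fps
        # Below the panel's minimum: show each frame multiple times so
        # the instantaneous refresh rate lands back inside the range.
        repeats = 2
        while predicted_fps * repeats < min_hz:
            repeats += 1
        return min(predicted_fps * repeats, max_hz)

    # 24fps on a 30Hz-75Hz panel: each frame is shown twice at 48Hz.
    print(lfc_refresh_hz(24, 30, 75))  # 48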

The good news here is that this is a GPU-side change, so it doesn't require any changes to existing monitors – they simply receive new variable refresh timings. However in revealing a bit more about the technology, AMD does note that LFC is only enabled on monitors whose maximum refresh rate is greater than or equal to 2.5 times the minimum refresh rate (e.g. 30Hz to 75Hz), as AMD needs a wide enough variable refresh range to run at a multiple of framerates right on the edge of the minimum (e.g. 45fps). This means LFC can't be used with Freesync monitors that have a narrow refresh rate range, such as the 48Hz to 75Hz models. Owners of those monitors don't lose anything, but they also won't gain anything from LFC.
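The 2.5x requirement is easy to check against any given panel. The threshold itself comes from AMD's disclosure; the little helper below is our own illustration:

    # Check AMD's stated 2.5x range requirement for LFC eligibility.
    # The threshold is from AMD's disclosure; the helper is ours.

    def lfc_supported(min_hz, max_hz):
        return max_hz >= 2.5 * min_hz

    print(lfc_supported(30, 75))  # True:  75 >= 2.5 * 30 = 75
    print(lfc_supported(48, 75))  # False: 75 <  2.5 * 48 = 120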

As it stands we've only had a very limited amount of time to play with Freesync on the new drivers, but what we're seeing so far looks solid. We're definitely curious to see how daily Freesync users respond to the change.

Finally, along with the LFC news, AMD has used the Crimson release to offer a brief update on the status of Freesync-over-HDMI, reiterating that the company is still working on the technology. AMD first demonstrated the concept at Computex 2015 back in June, and while they still have a long way to go before it can make it into a retail product, the company continues to believe that adaptive synchronization is a viable and meaningful addition to HDMI.

Framerate Target Control: Wider Ranges

Back in June, for the launch of the Radeon R9 Fury X, AMD introduced a new frame limiting feature called Framerate Target Control (FRTC). FRTC offered an alternative to v-sync, allowing users to cap a game's framerate at an arbitrary value selected via AMD's control panel. While FRTC worked, it had one unfortunate limitation: it only operated over a very narrow range, 55fps to 95fps. Though this was sufficient to cap the framerate just below or above 60fps, users have been asking for wider ranges to support higher refresh rate monitors or to limit games to even lower framerates such as 30fps.

For Crimson, AMD has gone ahead and widened the range of FRTC: it can now cap a game anywhere between 30fps and 200fps, a range over four times as wide. At the same time, AMD mentioned in our briefing that they've also done additional work to better restrict GPU clockspeeds when FRTC is in use, maximizing the power savings from limiting the amount of work the GPU does. The GPU will now operate at a lower clockspeed more often, increasing the amount of power saved versus letting a video card run uncapped.
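Conceptually, a framerate cap like FRTC simply stalls the render loop so frames are never produced faster than the target, letting the GPU idle (and clock down) in between. The sketch below is our own illustration of that idea, not AMD's driver code; only the 30fps-200fps bounds come from AMD.

    # Illustrative frame limiter in the spirit of FRTC; not AMD's code.
    # The 30-200fps bounds mirror Crimson's supported FRTC range.
    import time

    def run_capped(render_frame, target_fps, frames):
        if not 30 <= target_fps <= 200:
            raise ValueError("target must be 30-200fps, like Crimson's FRTC")
        frame_time = 1.0 / target_fps
        deadline = time.perf_counter()
        for _ in range(frames):
            render_frame()
            deadline += frame_time
            delay = deadline - time.perf_counter()
            if delay > 0:
                time.sleep(delay)  # GPU and CPU sit idle here: power saved
            else:
                deadline = time.perf_counter()  # fell behind; resync

    run_capped(lambda: None, target_fps=60, frames=10)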

Comments

  • psychobriggsy - Tuesday, November 24, 2015 - link

    WCCFTech compared the previous AMD WHQL release to the new one.
    Anandtech compared the most recent AMD beta release to the new one.

    Which one is more useful for users of non-beta drivers?
  • gamervivek - Tuesday, November 24, 2015 - link

    WCCFTech compared these drivers, as listed on the first page of their review:

    Drivers: AMD 15.11.1 Beta / AMD Crimson

    So your question is moot.
  • Teizo - Tuesday, November 24, 2015 - link

    You are going to take WCCF Tech's numbers over Anandtech's? Holy heck, what is the world coming to...
  • iamkyle - Tuesday, November 24, 2015 - link

    Why are we even talking about this? Comparing reviews from a joke of a website like WCCFtech is laughable at best.
  • yannigr2 - Tuesday, November 24, 2015 - link

    Browsers crashing all over the place, Nvidia posting hotfixes every week and then taking a month or more to actually fix things, Nvidia cards using 70-130W more power when the system is sitting idle running at over 144Hz (still not fixed), but you came to the conclusion that AMD drivers had serious problems in 2015 because of a memory leak that was fixed in less than 24 hours?

    YOU HAVE TO BE JOKING.
  • Chaser - Tuesday, November 24, 2015 - link

    Vague, baseless, fanboy FUD. Maxwell mops the floor with AMD's winter space heaters.
  • xthetenth - Tuesday, November 24, 2015 - link

    I bought myself a 970. When I got a friend a 1440p monitor and a card to drive it I got her a 290. I'm getting progressively more jealous as I have to OC my card to keep up, it heats my room while running 2D because apparently more than one 1440p screen is difficult to push from the desktop in tyool 2015 (nobody told my Surface, which does it just fine on integrated graphics with a TDP less than the power my 970 wastes), and best yet, for the past two or so weeks I can't tab out of a game without spending half my time waiting for my computer to stop being locked up. On top of that, I get to watch the past few years of AMD cards get steady performance boosts to the point where early GCN cards have gone up an entire tier, and wonder how my card's going to be supported.

    All that for power savings I could get out of a $2.50 light bulb? That isn't even close to worth it. I've owned an 8800GT, 260, 560, 760 and the 970 and the past year's the worst showing I've seen from NV and the one that's made me decide to get AMD next time unless the performance gap is as big as between Fury and an OC 980 Ti, which I seriously doubt will be the case.
  • psychobriggsy - Tuesday, November 24, 2015 - link

    Until you use a 144Hz monitor, or multiple monitors, it seems.
  • xthetenth - Tuesday, November 24, 2015 - link

    Yeah, unless I run a third party utility, I would probably be using more power overall with my 970 than a 290 because of how it decides it has to clock up to push 2D.
  • looncraz - Tuesday, November 24, 2015 - link

    No doubt you would.

    My R9 290, with two 1080p monitors, one of them 144Hz (though I run it at 120Hz), idles at around 96 to 102 watts, and pulls about 126 to 138 watts while playing video (a nice improvement with the Crimson driver, as I was previously pulling 142 to 152W on the same-ish videos).

    In fact, this video card has only become better and better, whereas every nVidia card I've ever had seemed to age poorly. I have a 7870XT in my HTPC and it is easily ~30% faster playing Hitman and BF4 than when I first played either.
