At the tail-end of 2014 AMD launched their Catalyst 14.12 driver set, better known as the Omega driver set. With Omega, AMD shifted their development process for the Catalyst driver set, focusing on delivering feature updates in fewer, larger updates while interim driver releases would focus on bug fixes, performance improvements, and adding new cards. The Omega release in turn was the first of these major releases, delivering a number of new features for Catalyst such as Virtual Super Resolution, preliminary support for FreeSync, and of course a number of performance improvements.

When briefing the press on Omega, one of AMD’s points was that if it was successful they were intending to make it a yearly release – in essence putting major feature updates on a yearly cadence – and after the reaction to Omega AMD has gone ahead and done just that. So launching today and serving as the cornerstone of AMD’s video driver plans for 2016 is this year’s major update, Radeon Software Crimson Edition.

Meet Radeon Software Crimson Edition 15.11

AMD and the Radeon Technologies Group first announced their plans for Radeon Software Crimson Edition back at the start of the month, in a preview/teaser of what they were working on. AMD’s initial preview focused on the UI aspects of Crimson, namely the new control panel application, Radeon Settings. And while AMD’s preview actually covered Radeon Settings in a fair bit of depth, like all good previews AMD kept everything going on under the hood for Crimson equally under wraps. As a result we have quite a bit to discuss today, with Crimson rolling out with a number of bug fixes and feature additions on top of the control panel overhaul AMD originally announced.

But before diving into matters, let’s talk about AMD’s announced release schedule for Crimson going forward. Like Omega before it, Crimson is an annual release and the cornerstone of AMD’s driver plans for the next year. Along with renaming their driver stack from Catalyst to simply Radeon Software, AMD intends the Crimson branding to last for just this release cycle; come late 2016, the next major feature update will replace Crimson with another red-themed name.

Meanwhile one point of criticism towards AMD in 2015 has been the limited number of WHQL certified driver releases for the year. AMD had plenty of beta releases over the year – averaging once a month despite the fact that the company stopped adhering to a fixed monthly release schedule in 2012 – however they only released 3 WHQL certified releases. WHQL certification is in and of itself a thorny issue – it is an additional layer of quality assurance, which is good, but it doesn’t cover game-specific bugs, which are the bulk of the bugs most gamers are going to run into – so while it’s useful, it alone won’t make a driver good or bad. Nonetheless, AMD will be addressing the lack of WHQL certified releases in 2016.

AMD’s plans call for up to 6 driver releases to be WHQL certified next year, with additional beta releases as necessary as AMD already does today. Frankly the “up to” designation leaves AMD quite a bit of wiggle room in case they fall short, so it’s not a very solid promise. But on the other hand it’s legitimately difficult to plan for a specific number of WHQL releases a year in advance – one can’t predict bugs – so AMD does need some wiggle room in case they can’t meet that schedule. That said, if AMD wants to seriously address the complaints about the lack of WHQL releases in 2015 and retain their integrity, then they need to deliver on those 6 releases for 2016.

Speaking of quality assurance, AMD tells us that they have once again increased their QA testing, and stability is a top focus for 2016. With the Omega driver AMD ramped up both their automated and human testing to cover more test cases and system configurations, and for Crimson AMD has done this again.

As drivers approach (and in some cases exceed) the complexity of an operating system, comprehensive driver QA becomes increasingly invaluable, and Windows 10’s aggressive driver update mechanism will bring driver quality to the forefront. So although AMD has always maintained some focus on driver quality and stability, there is always room for improvement. And particularly in AMD’s case, some Catalyst releases have shipped with major issues despite AMD’s QA process improvements for Omega – the web browser memory leak comes to mind – so AMD definitely needs to improve their processes to prevent future issues.

As for Crimson in particular, AMD notes that they have knocked out a large number of bugs. AMD also notes that a number of these bugs came in and/or were prioritized via user feedback, so they’re asking that we remind everyone that AMD has a bug reporting form and that they’re encouraging anyone experiencing a driver bug to use it.

Under The Hood: DirectX 9, Shader Caching, Liquid VR, and Power Consumption
146 Comments

  • looncraz - Wednesday, November 25, 2015 - link

    I only play BF4 with Mantle, and I've never noticed a single glitch. (I did see some color glitches when Mantle first came out, so I ran DX11 for a while.)

    The resolution actually doesn't dictate how much RAM you need as much as people think. A 1080p frame buffer only weighs in at ~8MB, 4k is ~34MB. You need VRAM to store all of the textures and other game data. Your resolution has an effect on VRAM use only for certain features.
  • i_create_bugs - Wednesday, November 25, 2015 - link

    Except that you also need room for multiple render targets. Not just RGBA. Typically diffuse, normal, stencil, etc. Plus on top of that you need stencil/Z-buffer. Those buffers can also be 64 bits per pixel, if float16 pixel formats are used.

    Additionally sometimes frame buffer width is a bit more than actual resolution due to hardware limitations. So 1920 wide buffer might actually have room for 2048 pixels in real memory layout.

    Lower end guess for 1080P is 32 x 2^ceil(log2(1920)) x 1080. So at least 32 x 2048 x 1080 bytes. 67.5 MB per frame at 1080P. For 4k (3840x2160), 32 x 4096 x 2160 = 270 MB.

    Plus on top of that you need some RGBA frames for double / triple buffering.
  • looncraz - Thursday, November 26, 2015 - link

    You calculated it for BITS, not BYTES.

    Also, we usually end up aligning just a few pixels on the end (as in two or three).

    A 1920x1080 buffer will be allocated as a slightly wider, but no higher, buffer. FOUR bytes per pixel (not 32). That gives 7.91MB per frame buffer.

    As for the z coordinate, we usually use the last 8 bits of the above buffer. Why? Because 8-bits per color channel is what anyone usually ever uses. This is called D24S8.

    When you increase the resolution of your game yourself, you are increasing the size of the frame buffers, including the flip queue, post-processing buffers, and a few others. Basically, you can generally assume there are around 15 framebuffer-sized buffers.

    So at 1080p, you need 118MB of VRAM for the buffers, and at 4k you need 475MB. This is why you can see VSR running so well on video cards with only 2GB of RAM. You do need more RAM, but it isn't drastic. What can make a more drastic difference is the game using resolution-specific textures. THAT can eat up an extra GB or so, depending on the game developer. Older games, or games meant for 1080p, however, will not have 4k texture packs.
  • The_Countess - Friday, November 27, 2015 - link

    you just described the game running in directX as well.
  • dsumanik - Tuesday, November 24, 2015 - link

    Nvidia drivers have been superior for the last decade, end of story. I suspect in many cases when the silicon battle was close, this is how NVIDIA kept the edge.

    If Crimson can fulfill its ambitious vision, things will get mighty interesting next year.

    Got my fingers crossed for ya AMD.
  • Dalamar6 - Wednesday, November 25, 2015 - link

    NVidia's superior drivers are why AMD was bargain binned even during the times when their performance:price ratio was actually significantly better.

    Of course we're talking Windows, AMD literally has NO linux presence at all, and literally cripples rolling distributions, and this rebadged driver won't change that.
  • Gigaplex - Wednesday, November 25, 2015 - link

    AMD has their open source driver presence. For a lot of their hardware, it's very stable and performs well. It's pretty slow to support brand new hardware though.
  • Fallen Kell - Wednesday, November 25, 2015 - link

    You mean 2D support is pretty stable and performs well. 3D is abysmal performance. There is a reason why not a single Steam Machine configuration out there has an AMD graphic card as an option, and it is because they all have HORRIBLE 3D performance.
  • Beany2013 - Wednesday, November 25, 2015 - link

    As a user of Ubuntu and Debian, and AMD GPUs, I have to agree with Fallen Kell; it's not as bad as it was, but major updates (such as GCC updates, as happened with Ubuntu 15.10) utterly, utterly break things.

    It's working now on wily-proposed, but jesus, what a pain in the arse.

    I'm hoping that this, and other pressure (like not having any realistic Steam Machine presence), will force them to up their game. Majorly.

    Performance when it works though, is fine - in some cases though, it's just that you have to force it to work. With hard liquor. And swearing. And Fire.
  • FourEyedGeek - Wednesday, November 25, 2015 - link

    There is one area AMD beats NVIDIA in drivers, old cards. NVIDIA haven't paid as much attention to older cards as AMD has, though it could be because AMD use the same architecture for longer periods of time. At release the NVIDIA 680 was faster than the 7970, but on modern games with new drivers the 7950 can even beat the 680 in some games.
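
The framebuffer arithmetic debated earlier in the thread (looncraz vs. i_create_bugs) can be sketched quickly. This is a hypothetical illustration, not measured data: it assumes 4 bytes per pixel (e.g. RGBA8, with D24S8 depth/stencil at the same size) and adopts looncraz's rough figure of ~15 framebuffer-sized allocations (flip queue, post-processing buffers, etc.).

```python
# Illustrative sketch of per-resolution framebuffer VRAM usage.
# Assumptions (from the comment thread, not measured): 4 bytes/pixel,
# ~15 framebuffer-sized allocations in total.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of a single frame buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

def total_buffer_mib(width: int, height: int, num_buffers: int = 15) -> float:
    """Rough total VRAM consumed by framebuffer-sized allocations."""
    return num_buffers * framebuffer_mib(width, height)

for w, h, label in [(1920, 1080, "1080p"), (3840, 2160, "4K")]:
    print(f"{label}: single buffer {framebuffer_mib(w, h):.2f} MiB, "
          f"~15 buffers {total_buffer_mib(w, h):.0f} MiB")
```

Running this reproduces the figures in the thread: a single 1080p buffer at ~7.9 MiB and a 4K buffer at ~31.6 MiB, with the 15-buffer totals landing near the 118 MB / 475 MB numbers quoted above. The 67.5 MB figure earlier in the thread comes from using 32 as bytes rather than bits per pixel, as the follow-up comment points out.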
