How AMD’s Dynamic Switchable Graphics Works

One of the things we discussed with AMD was the technical details of their dynamic switchable graphics. At a high level, things might appear similar to NVIDIA’s Optimus, but dig a little deeper and you start to find differences. To recap how switchable graphics works, let’s start at the top.

The original switchable graphics technologies treated the IGP and dedicated GPU as discrete devices. Both were connected to the necessary display outputs, with hardware muxes selecting the active device. This approach adds cost to the motherboard, and switching blanks the display as one device deactivates and the other comes online. In the earliest implementations you had to reboot when switching, and the system would start with either the IGP or dGPU active. Later implementations moved to software-controlled muxes and dynamic switching, which required Windows Vista to work properly (the IGP driver would unload, the GPU driver would start, and then the display content would activate on the GPU).
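
The mux-based scheme can be modeled as a small state machine. This is an illustrative sketch (the class and method names are made up, not any vendor's actual switching logic):

```python
# Toy model of mux-based switchable graphics: a hardware mux selects which
# device drives the display, and switching blanks the output while one
# driver unloads and the other loads.

class MuxedSwitchable:
    def __init__(self):
        self.active = "IGP"            # system boots with one device active
        self.display_blanked = False
        self.log = []

    def switch_to(self, device):
        if device == self.active:
            return
        self.display_blanked = True    # display blanks during the handoff
        self.log.append(f"unload {self.active} driver")
        self.log.append(f"load {device} driver")
        self.active = device
        self.display_blanked = False   # display comes back on the new device

mux = MuxedSwitchable()
mux.switch_to("dGPU")                  # software-controlled switch, no reboot
```

The earliest designs were effectively this state machine with a reboot required between states; the Vista-era versions automated the unload/load sequence in the middle.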

NVIDIA’s Optimus changes things quite a bit, as there are no longer any muxes. The display outputs are always wired to the IGP, and NVIDIA’s drivers watch for calls from applications that the dedicated GPU can accelerate. When they detect such an application (and the user can add custom apps to the list), the drivers wake up the GPU and send it the rendering commands. The GPU does all of the necessary work, and the result is copied directly into the IGP framebuffer, avoiding flicker or other undesirable effects since the IGP remains connected to the display outputs at all times. The GPU can wake up in a fraction of a second, and when it’s no longer needed it powers down completely. NVIDIA even demonstrated this by removing the dGPU from a test system while it was powered on. The only catch is that the drivers need some knowledge of the applications/games in order to know when to use the GPU.
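
The Optimus flow amounts to a profile lookup plus a framebuffer copy. The sketch below is a rough approximation (the profile list and all names are invented for illustration, not NVIDIA's actual driver internals):

```python
# Approximation of Optimus-style rendering: applications on the profile
# list wake the dGPU; the finished frame is copied into the IGP
# framebuffer, which always drives the display.

GPU_PROFILES = {"game.exe"}            # driver-supplied application profiles

class OptimusLike:
    def __init__(self):
        self.dgpu_powered = False
        self.igp_framebuffer = None    # display always scans out from here

    def add_profile(self, app):
        GPU_PROFILES.add(app)          # users can add their own custom apps

    def render(self, app):
        if app in GPU_PROFILES:
            self.dgpu_powered = True                  # wake on demand
            frame = f"{app}: rendered on dGPU"
            self.igp_framebuffer = frame              # copy over PCI-E to the IGP
            self.dgpu_powered = False                 # power down completely when idle
        else:
            self.igp_framebuffer = f"{app}: rendered on IGP"
        return self.igp_framebuffer

o = OptimusLike()
result = o.render("game.exe")          # accelerated path: dGPU renders, IGP displays
```

Note that the display never changes sources in this model, which is why there is no blanking; the cost is the extra copy across the PCI-E bus and the dependency on the profile list.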

The details of AMD’s Dynamic Switchable Graphics are similar in practice to Optimus, but with a few differences. First, AMD always keeps both the IGP and GPU drivers loaded, with a proxy driver funneling commands to the appropriate GPU. Where NVIDIA is able to completely power off the GPU under Optimus, AMD has modified their GPUs so that the PCI-E interface is isolated from the rest of the chip. When the GPU isn’t needed, everything powers down except that PCI-E connection, so Windows doesn’t try to load/unload the GPU driver. The PCI-E link state is retained, and a small amount of power (around 50mW) is needed to keep it active, but as far as Windows knows the GPU is still ready and waiting for input. AMD also informed us that their new GPUs use link adapter mode instead of multi adapter mode, and that this plays a role in their dynamic switchable graphics, but we didn’t receive any additional details on this subject.
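
As a rough sketch of that arrangement (all identifiers are hypothetical): both drivers stay resident, a proxy routes each workload, and the idle dGPU keeps only its PCI-E interface alive so Windows never tries to unload the driver.

```python
# Toy model of AMD's proxy-driver approach: both drivers are always loaded,
# and the dGPU core powers down while its PCI-E link state is preserved
# (costing roughly 50mW) so the OS still sees the device as present.

class AmdProxyLike:
    def __init__(self):
        self.loaded_drivers = {"IGP", "dGPU"}   # both are always resident
        self.dgpu_core_powered = False
        self.dgpu_pcie_alive = True             # never dropped; ~50mW to hold

    def submit(self, workload, use_dgpu):
        # The proxy funnels commands to the appropriate GPU.
        if use_dgpu:
            self.dgpu_core_powered = True
            result = f"{workload}: ran on dGPU"
        else:
            self.dgpu_core_powered = False      # core off, link state kept
            result = f"{workload}: ran on IGP"
        return result

proxy = AmdProxyLike()
out = proxy.submit("desktop compositing", use_dgpu=False)
```

The key contrast with the Optimus model is that `dgpu_pcie_alive` never goes false here, whereas NVIDIA can drop the device entirely and re-enumerate it on demand.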

As far as getting content from the dGPU to the display, the IGP always maintains a connection to the display ports, and it appears AMD’s drivers copy data over the PCI-E bus to the IGP framebuffer, similar to Optimus. Where things get interesting is that there are no muxes in AMD’s dynamic switchable graphics implementations, yet there is still an option to fall back to manual switching. For this mode, AMD is able to drive the display through the Intel IGP’s output ports, so their GPU doesn’t need separate outputs or muxes. With the VAIO C, both dynamic and manual switching are supported, and you can set the mode as appropriate. Here are some static shots of the relevant AMD Catalyst Control Center screens.

In terms of the drivers, right now you get a single large driver package that includes a proxy driver, an Intel IGP driver, and AMD’s GPU driver all rolled into one. Long-term, AMD says they have plans to make their GPU driver fully independent from Intel’s IGP driver. They say this will only require some packaging updates and that they should make this change some time in 2012, but for now they continue to offer a monolithic driver package. OEMs apparently get this driver on a monthly basis (or can at least request it), but it’s up to the OEMs to validate the driver for their platform and release it to the public.

In the case of non-switchable graphics, AMD has a monthly, publicly available driver update that we refer to as “reference drivers.” At present, you download a utility that checks your laptop’s GPU ID to see whether the laptop is officially supported by the reference driver. Certain OEMs prefer to maintain control of the drivers, so the AMD utility will refuse to download the full driver suite on their systems; in such cases, users have to wait for the manufacturers to roll out updates (Sony, Toshiba, and Panasonic all fall into this category). In the past, we have been able to download the reference driver using a “sanctioned” laptop (e.g. something from Acer) and then install it on a non-sanctioned laptop. However, this does not work with switchable graphics laptops; those systems need the monolithic driver package.
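
The gatekeeping logic of that download utility can be summarized roughly as follows; the OEM list matches the article, but the function and its behavior are an illustrative guess, not AMD’s actual code:

```python
# Sketch of the reference-driver eligibility check: OEMs that retain
# control of driver releases are excluded, and switchable graphics
# systems always require the monolithic OEM package instead.

LOCKED_OEMS = {"sony", "toshiba", "panasonic"}

def reference_driver_available(oem, has_switchable_graphics):
    if has_switchable_graphics:
        return False                   # needs the monolithic driver package
    return oem.lower() not in LOCKED_OEMS

print(reference_driver_available("Acer", False))   # True
print(reference_driver_available("Sony", False))   # False
```

In this model a sanctioned Acer laptop qualifies while a Sony does not, and a switchable graphics system is rejected regardless of OEM.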

That takes care of the high-level overview of how AMD’s Dynamic Switchable Graphics works, as well as a few other related items. The details are a little light, but that at least gives us an introduction to AMD’s current switchable graphics solutions. With the hardware and software discussions out of the way, let’s turn to our gaming results first and see how the two solutions and GPUs compare in performance as well as compatibility.


  • JarredWalton - Tuesday, September 20, 2011 - link

    Thanks for the input -- I've corrected the 6700M mistakes now. Somehow I got it stuck in my brain that 6700M was rebadged 5700M, but that's only the 6300M and 6800M. Thanks also for the updates on Linux--good to know how it does/doesn't work.
  • rflynn88 - Tuesday, September 20, 2011 - link

    Correction required:

    The 6700M series chips do support dynamic switching. I'm not sure if the 6300M series does. The HP dv6t can be optioned out with a Core i5/i7 Sandy Bridge chip and the 6770M with dynamic switching. There was actually a big issue with the dynamic switching not working on the dv6t for OpenGL applications, which was only recently remedied with a BIOS update.
  • JarredWalton - Tuesday, September 20, 2011 - link

    Corrected -- see above note. I mistakenly confused the 6700M situation with the 6300M/6800M, which are rebadged 5000M parts and would not have the decoupled PCI-E interface to allow for dynamic switching.
  • BernardP - Tuesday, September 20, 2011 - link

    What I would like to do with Optimus is disable it completely and have my notebook always use the Nvidia graphics, even on the desktop. I don't care about battery life, as my notebook is almost never running on battery.

    After searching on the web, I have found no way to disable Optimus. Anybody here have a solution?
  • bhima - Tuesday, September 20, 2011 - link

    Yes. Open up your NVIDIA settings and, under Global Settings, change the preferred graphics processor to the NVIDIA card. Voila!
  • BernardP - Tuesday, September 20, 2011 - link

    As simple as that? I'm hopeful...

    Please allow me a bit of residual doubt, considering, for example, the following discussion where there is a mention of this suggested setting.

    http://superuser.com/questions/282734/how-to-disab...

    However, I'll try your suggestion on a new Alienware laptop with Optimus one of my friends just bought.
  • JarredWalton - Tuesday, September 20, 2011 - link

    The above change won't disable Optimus so much as always try to run programs on the dGPU. To my knowledge, there is no way to disable it completely, since with Optimus there is no direct link between the dGPU and the display outputs. All data has to get copied over the PCI-E bus to the IGP framebuffer.
  • BernardP - Wednesday, September 21, 2011 - link

    That's also my understanding, hence my doubts...
  • seapeople - Tuesday, September 20, 2011 - link

    I can see not caring about battery life, but you don't care about heat and/or noise either? More power use = more heat = constant fan turning on and off vs silent operation using only the integrated GPU.

    Less heat also equates to longer hardware lifetime.

    I just don't understand why you would want a fat GPU idling while you browse AnandTech instead of the low power Intel GPU built into your computer?
  • BernardP - Wednesday, September 21, 2011 - link

    Because I want a not-so-fat NVidia GPU, such as GT520M or GT525M, to pair with a high quality 1920x1080 IPS screen.... and then, use NVidia scaling to set up a custom resolution that will allow my old presbyopic eyes to see what's on the screen. For a 15.6 in. screen, 1440x900 is about right, and with the NVidia custom resolution tools there is no significant loss of quality.

    AMD graphics drivers don't allow setting up custom resolutions. Can Intel graphics drivers do it?

    And I know that one can make things bigger onscreen with Windows settings, but it doesn't work all the time for everything. There is no substitute for setting up custom resolutions.
