How AMD’s Dynamic Switchable Graphics Works

One of the things we discussed with AMD was the technical details of their dynamic switchable graphics. At a high level, things might appear similar to NVIDIA’s Optimus, but dig a little deeper and you start to find differences. To recap how switchable graphics works, let’s start at the top.

The original switchable graphics technologies treated the IGP and dedicated GPU as discrete devices. Both were connected to the necessary display outputs, with hardware muxes selecting the active device. This added cost to the motherboard, and switching blanked the display as one device was deactivated and the other came online. In the earliest implementations you had to reboot when switching, and the system would start with either the IGP or dGPU active. Later implementations moved to software-controlled muxes and dynamic switching, which required Windows Vista to work properly (the IGP driver would unload, the GPU driver would load, and then the display content would reappear on the GPU).
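
For illustration, here's a minimal sketch of that software-controlled mux sequence in Python. The DisplayMux class and the driver load/unload messages are hypothetical stand-ins for the real hardware and OS steps; the sketch only shows the ordering of events and why the screen blanks during a switch.

```python
# Hypothetical sketch of software-controlled mux switching (not a real driver API).
# It illustrates the ordering of steps: the outgoing driver unloads, the mux flips
# the display connection, the incoming driver loads, and the screen blanks in between.

from dataclasses import dataclass

@dataclass
class DisplayMux:
    active: str = "igp"   # which device currently drives the panel

    def select(self, device: str) -> None:
        print(f"[mux] display blanks while the panel is routed to the {device}")
        self.active = device

def switch_to(mux: DisplayMux, target: str) -> None:
    if mux.active == target:
        return
    print(f"[os] unloading {mux.active} display driver")   # screen goes dark here
    mux.select(target)                                      # hardware mux flips the outputs
    print(f"[os] loading {target} display driver")          # Vista-era dynamic switching
    print(f"[os] desktop reappears on the {target}")

if __name__ == "__main__":
    mux = DisplayMux()
    switch_to(mux, "dgpu")   # user selects high performance
    switch_to(mux, "igp")    # back to power saving
```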

NVIDIA’s Optimus changes things quite a bit, as there are no longer any muxes. The display outputs are always connected to the IGP, and NVIDIA’s drivers simply watch for calls from applications that the dedicated GPU can accelerate. When they detect such an application (and the user can add their own custom apps), the drivers wake up the GPU and send it the rendering commands. The GPU does all of the necessary work, and the result is copied directly into the IGP framebuffer, avoiding any flickering or other undesirable effects since the IGP remains connected to the display outputs. The GPU can wake up in a fraction of a second, and when it’s no longer needed it powers down completely. NVIDIA even demonstrated this by removing the dGPU from a test system while it was powered on. The only catch is that the drivers need some knowledge of the applications/games in order to know when to use the GPU.
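
Conceptually, the routing decision works something like the sketch below (hypothetical names, not NVIDIA's actual driver code): a profile lookup decides whether the dGPU gets woken up, and the finished frame is always copied into the IGP framebuffer that owns the display outputs.

```python
# Conceptual sketch of Optimus-style routing (hypothetical names, not NVIDIA's code).
# The driver keeps a list of application profiles; when a profiled app renders, the
# dGPU is powered up, does the work, and the finished frame is copied into the IGP
# framebuffer, which always owns the display outputs.

KNOWN_PROFILES = {"game.exe", "encoder.exe"}   # plus any user-added entries

class DiscreteGPU:
    def __init__(self):
        self.powered = False

    def power_up(self):
        self.powered = True
        print("[dgpu] powered up")

    def power_down(self):
        self.powered = False
        print("[dgpu] powered off completely")

    def render(self, frame_id: int) -> bytes:
        assert self.powered
        return f"frame-{frame_id}".encode()    # stand-in for real rendering work

def present(app: str, frame_id: int, dgpu: DiscreteGPU, igp_framebuffer: list) -> None:
    if app in KNOWN_PROFILES:
        if not dgpu.powered:
            dgpu.power_up()                    # wakes in a fraction of a second
        igp_framebuffer.append(dgpu.render(frame_id))   # copy result into IGP framebuffer
    else:
        igp_framebuffer.append(f"igp-frame-{frame_id}".encode())
        if dgpu.powered:
            dgpu.power_down()                  # no profiled work left, switch off

if __name__ == "__main__":
    fb, gpu = [], DiscreteGPU()
    present("game.exe", 1, gpu, fb)     # accelerated path
    present("notepad.exe", 2, gpu, fb)  # IGP path, dGPU powers off
```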

The details of AMD’s Dynamic Switchable Graphics are similar in practice to Optimus, but with a few differences. First, AMD always keeps both the IGP and GPU drivers loaded, with a proxy driver funneling commands to the appropriate GPU. Where NVIDIA is able to completely power off the GPU under Optimus, AMD has modified their GPUs so that the PCI-E interface is isolated from the rest of the chip. When the GPU isn’t needed, everything powers down except for that PCI-E connection, so Windows doesn’t try to load/unload the GPU driver. The PCI-E link state is retained, and a small amount of power (around 50mW) is needed to keep it active, but as far as Windows knows the GPU is still ready and waiting for input. AMD also informed us that their new GPUs use link adapter mode instead of multi adapter mode, and that this plays a role in their dynamic switchable graphics, but we didn’t receive any additional details on this subject.
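
Here's a rough sketch of that proxy-driver arrangement, again with hypothetical names rather than AMD's real driver interfaces; the point is that both drivers stay resident and only the GPU core power toggles, while the PCI-E link state is preserved.

```python
# Hypothetical sketch of AMD's proxy-driver arrangement (illustrative only).
# Both drivers stay loaded; a proxy forwards work to the appropriate GPU. When the
# dGPU is idle, everything except the PCI-E link logic powers down (~50mW), so
# Windows never sees the device disappear and never reloads a driver.

class RadeonGPU:
    def __init__(self):
        self.core_powered = False
        self.pcie_link_active = True          # always kept alive, roughly 50mW

    def power_core(self, on: bool) -> None:
        self.core_powered = on
        state = "on" if on else "off (PCI-E link still active)"
        print(f"[radeon] core power {state}")

class ProxyDriver:
    """Forwards rendering commands to either the IGP or the dGPU."""

    def __init__(self, dgpu: RadeonGPU):
        self.dgpu = dgpu

    def submit(self, app: str, use_dgpu: bool) -> str:
        if use_dgpu:
            if not self.dgpu.core_powered:
                self.dgpu.power_core(True)
            return f"{app}: rendered on Radeon"
        if self.dgpu.core_powered:
            self.dgpu.power_core(False)       # core off, link state retained
        return f"{app}: rendered on Intel IGP"

if __name__ == "__main__":
    proxy = ProxyDriver(RadeonGPU())
    print(proxy.submit("game.exe", use_dgpu=True))
    print(proxy.submit("browser.exe", use_dgpu=False))
```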

As far as getting content from the dGPU to the display, the IGP always maintains a connection to the display outputs, and it appears AMD’s drivers copy data over the PCI-E bus into the IGP framebuffer, similar to Optimus. Where things get interesting is that there are no muxes in AMD’s dynamic switchable graphics implementations, yet there is still an option to fall back to manual switching. In this mode AMD uses the display output ports of the Intel IGP, so their GPU doesn’t need separate outputs (or muxes). With the VAIO C, both dynamic and manual switching are supported, and you can set the mode as appropriate. Here are some static shots of the relevant AMD Catalyst Control Center screens.
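
Beyond the CCC screens, the two modes reduce to a simple policy choice, illustrated in the sketch below with hypothetical option names: dynamic mode lets the driver pick per application, while manual mode pins the user's choice, and in both cases the Intel IGP keeps the display outputs.

```python
# Hypothetical illustration of the two switching policies (not CCC's real settings).
# In dynamic mode the driver decides per application; in manual mode the user pins
# the active GPU. In both cases the Intel IGP retains the display outputs.

def choose_gpu(mode: str, app_is_profiled: bool, manual_choice: str = "igp") -> str:
    if mode == "dynamic":
        return "dgpu" if app_is_profiled else "igp"
    if mode == "manual":
        return manual_choice          # user-selected, no muxes required
    raise ValueError(f"unknown switching mode: {mode}")

if __name__ == "__main__":
    print(choose_gpu("dynamic", app_is_profiled=True))                       # -> dgpu
    print(choose_gpu("manual", app_is_profiled=True, manual_choice="igp"))   # -> igp
```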

In terms of the drivers, right now you get a single large driver package that includes a proxy driver, an Intel IGP driver, and AMD’s GPU driver all rolled into one. Long-term, AMD says they have plans to make their GPU driver fully independent from Intel’s IGP driver. They say this will only require some packaging updates and that they should make this change some time in 2012, but for now they continue to offer a monolithic driver package. OEMs apparently get this driver on a monthly basis (or can at least request it), but it’s up to the OEMs to validate the driver for their platform and release it to the public.

In the case of non-switchable graphics, AMD provides a publicly available monthly driver update that we refer to as the “reference drivers”. At present, you download a utility that checks your laptop’s GPU ID to see if the laptop is officially supported by the reference driver. Certain OEMs like to maintain control of their drivers, so the AMD utility will refuse to download the full driver suite for their systems; in such cases, users have to wait for the manufacturers to roll out updates (Sony, Toshiba, and Panasonic all fall into this category). In the past, we have been able to download the reference driver using a “sanctioned” laptop (e.g. something from Acer) and then install it on a non-sanctioned laptop. However, this does not work with switchable graphics laptops; you need the monolithic driver package for such systems.
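
The gatekeeping in that utility amounts to a simple eligibility check, roughly like the sketch below. The OEM list mirrors what we describe above, but the function and the switchable-graphics flag are hypothetical simplifications of whatever ID checks the real tool performs.

```python
# Hypothetical sketch of the kind of check AMD's download utility performs
# (illustrative only, not AMD's actual tool or lists). The utility reads the
# laptop's GPU/OEM IDs and only offers the reference driver if the OEM has not
# opted out of AMD's public updates and the system isn't switchable graphics.

OEM_OPT_OUTS = {"Sony", "Toshiba", "Panasonic"}   # OEMs that control their own drivers

def reference_driver_allowed(oem: str, has_switchable_graphics: bool) -> bool:
    if has_switchable_graphics:
        return False          # switchable systems need the monolithic OEM package
    return oem not in OEM_OPT_OUTS

if __name__ == "__main__":
    print(reference_driver_allowed("Acer", has_switchable_graphics=False))  # True
    print(reference_driver_allowed("Sony", has_switchable_graphics=False))  # False
    print(reference_driver_allowed("Acer", has_switchable_graphics=True))   # False
```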

That takes care of the high-level overview of how AMD’s Dynamic Switchable Graphics works, as well as a few other related items. The details are a little light, but that at least gives us an introduction to AMD’s current switchable graphics solutions. With the hardware and software discussions out of the way, let’s turn to our gaming results first and see how the two solutions and GPUs compare in performance as well as compatibility.

Comments

  • Wolfpup - Wednesday, September 21, 2011 - link

    The article mentions it, but probably doesn't make the statement strongly enough: you CAN'T use AMD's drivers with Sandy Bridge + AMD systems, which is why I don't think they should even be on the market, let alone that anyone should actually buy them.

    I'd been looking at HP's Sandy Bridge systems until learning that. AMD + AMD systems SHOULD work, although the A-series stuff isn't listed yet. But the C-series chips work, so I'd be surprised if the A-series doesn't get officially supported.

    Unfortunately there's unsupported stuff using Nvidia too... apparently Sager's systems aren't supported (along with some Sonys, and I've heard the new high-end Dell ones with Floptimus have issues too...)

    Unfortunately it's hard to find out which notebooks can use normal drivers, even though that's a HUGE selling point!
  • inplainview - Tuesday, September 20, 2011 - link

    My 2011 MPB 15 inch supports:

    AMD Radeon HD 6750M:

    Chipset Model: AMD Radeon HD 6750M
    Type: GPU
    Bus: PCIe
    PCIe Lane Width: x8
    VRAM (Total): 1024 MB
    Vendor: ATI (0x1002)
    Device ID: 0x6741
    Revision ID: 0x0000
    ROM Revision: 113-C0170L-573
    gMux Version: 1.9.23
    EFI Driver Version: 01.00.573
    Displays:
    Color LCD:
    Resolution: 1440 x 900
    Pixel Depth: 32-Bit Color (ARGB8888)
    Main Display: Yes
    Mirror: Off
    Online: Yes
    Built-In: Yes

    and:

    Intel HD Graphics 3000:

    Chipset Model: Intel HD Graphics 3000
    Type: GPU
    Bus: Built-In
    VRAM (Total): 512 MB
    Vendor: Intel (0x8086)
    Device ID: 0x0126
    Revision ID: 0x0009
    gMux Version: 1.9.23
  • tipoo - Tuesday, September 20, 2011 - link

    We all care deeply.
  • retrospooty - Tuesday, September 20, 2011 - link

    "My 2011 MPB 15 inch supports:"

    Pretty much all laptops these days support both... The question is whether it properly switches between the two, using the low-power integrated Intel for normal Windows day-to-day use and then automatically switching to the Radeon for 3D gaming... Wait, it's a Mac, why would you even bother with 3D gaming?

    This makes me ask: why does Apple even bother putting an expensive gaming card in a Mac? The few games that run, run like crap... unless it's there for the Windows/Boot Camp portion. I guess that makes sense. But then, why not get a PC? It's cheaper.
  • inighthawki - Tuesday, September 20, 2011 - link

    As someone who doesn't even like Macs, I think you are looking way past the obvious. Many Macs are used for things like Photoshop, which can use hardware-accelerated rendering, and Maya, 3DS Max, etc., which are pretty demanding 3D modeling and CAD programs. Not all higher-end GPUs are JUST for gaming.
  • retrospooty - Tuesday, September 20, 2011 - link

    ???

    That kinda proves my point. It's not for games; it IS good for those things you listed, which are much better suited to pro-series video cards (Nvidia Quadro and AMD FirePro). It doesn't make sense to put gaming cards in.
  • inplainview - Tuesday, September 20, 2011 - link

    Well, I'm not 11, so I actually use my Mac for real work and don't waste SSD space putting games on it. Also, as a semi-pro photographer, I tend to use it to the max. Did I explain it enough for you?
  • sigmatau - Tuesday, September 20, 2011 - link

    You forgot the price tag. How much money does it take to support Apple?
  • inplainview - Tuesday, September 20, 2011 - link

    I make A LOT of money so I buy whatever I want.
  • seapeople - Tuesday, September 20, 2011 - link

    Then why don't you buy a bigger SSD so your games don't load slowly?
