Low Power SLI: HybridPower

It's not just the CPU guys taking power consumption seriously these days; NVIDIA is too. With the nForce 780a, NVIDIA finally introduces a technology it has been talking about for several months: HybridPower.

With all of NVIDIA's 2008 chipsets featuring integrated graphics, HybridPower enables a discrete graphics card to shut off when not in use, relying on the motherboard's integrated graphics (mGPU) to handle display output.

The technology works like this: the discrete GPU (dGPU) is plugged into a standard PCIe slot, but the display is connected to the mGPU. With HybridPower running in Power Savings mode, the mGPU handles all rendering and display output while the dGPU is turned off completely (not idling, but fully powered down; even the fan stops spinning). In Performance Boost mode, the dGPU is turned on and handles all 3D rendering, but its frame buffer is copied to system memory and displayed by the mGPU. In other words, the dGPU renders every frame while the mGPU actually drives the display. Because of that extra copy to system memory, enabling HybridPower carries a small performance hit; thankfully, it is negligible.
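To make that data flow a bit more concrete, here is a minimal sketch in Python-style pseudocode of a single Performance Boost frame. Everything in it (the FrameBuffer class, dgpu_render, mgpu_display, display.present) is a hypothetical illustration of the copy-then-display path described above, not NVIDIA driver code:

```python
# Illustrative sketch of the HybridPower "Performance Boost" frame path.
# All names here are hypothetical; this is not NVIDIA's driver code.

class FrameBuffer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = bytearray(width * height * 4)  # 32-bit RGBA

def dgpu_render(scene, fb):
    """The discrete GPU renders the 3D scene into its local frame buffer."""
    ...  # rendering happens on the dGPU

def copy_to_system_memory(src_fb):
    """The finished frame is copied over PCIe into system memory."""
    dst = FrameBuffer(src_fb.width, src_fb.height)
    dst.pixels[:] = src_fb.pixels  # this extra copy is the HybridPower overhead
    return dst

def mgpu_display(fb, display):
    """The mGPU reads the frame from system memory and drives the attached display."""
    display.present(fb)

def hybridpower_frame(scene, dgpu_fb, display):
    dgpu_render(scene, dgpu_fb)                 # dGPU does all of the 3D work
    shared_fb = copy_to_system_memory(dgpu_fb)  # small, largely negligible cost
    mgpu_display(shared_fb, display)            # mGPU owns the physical outputs
```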

BIOS and driver support for HybridPower is nothing short of outstanding; the install process is virtually seamless. Normally, switching between an mGPU and a dGPU on an integrated-graphics platform requires a couple of reboots and perhaps a BIOS reconfiguration before you get display output. With the nForce 780a we simply plugged in a supported NVIDIA GPU and everything else worked itself out.

One problem we encountered was related to the platform's behavior with its unsigned graphics driver. The nForce 780a's IGP uses the same graphics driver as the GeForce 9800 GX2 we attempted to install, but because that driver is unsigned it won't be installed automatically; if you add a graphics card after the fact, you have to re-run the NVIDIA installation utility to get it installed properly. We assume final drivers will be signed and this won't be a problem once the product reaches retail, but for now it can be confusing: no errors are thrown, and you have to look at Device Manager before you realize the GX2 driver wasn't properly installed.

The real problems with HybridPower arise when attempting to switch between the mGPU and dGPU. The public and reviewers alike were led to believe (by both AMD and NVIDIA) that the platform/driver would intelligently switch between the mGPU and dGPU - this isn't how the platform actually behaves in the real world.

Switching between HybridPower modes must be done manually; while NVIDIA would like the transition to be automated and seamless, this is the first incarnation of the technology and application-sensing support simply isn't there yet.

Luckily, NVIDIA developed a very simple tool that sits in your systray and lets you switch between HybridPower modes: right-click the icon, select the operating mode you want, and the driver enables or disables the appropriate GPU.

There are some limitations. First and foremost, only the GeForce 9800 GTX and GeForce 9800 GX2 are supported by HybridPower. On the chipset side, the nForce 720a, 730a, 750a, and 780a, along with all of the GeForce 8x00 series motherboards, support HybridPower. In other words, most users will need both a new motherboard and a new GPU to take advantage of HybridPower.

Certain 3D applications won't let you change state while they are running, so you may have to quit applications like 3ds Max before you can switch power modes; NVIDIA's utility reminds you of this.

When switching HybridPower modes, the state of one GPU has to be moved to the other, so the process isn't instantaneous. The more windows you have open and the more GPUs you have in the system, the slower the switch. With a single GeForce 9800 GTX it took between 4 and 7 seconds to switch modes, which honestly wasn't too bad.

When we outfitted the system with a GeForce 9800 GX2 and its two GPUs, the process took up to 13 seconds. The time it takes to switch modes depends largely on the number of windows open: with 40 windows open, the GeForce 9800 GTX took a maximum of around 6 seconds to switch modes, compared to 13 seconds for the dual-GPU GeForce 9800 GX2. The transition time would be even higher on a 3- or 4-GPU system.
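For a rough sense of how those numbers scale, here is a back-of-envelope model fitted loosely to our measurements above; the estimated_switch_time function and its constants are illustrative guesses on our part, not figures NVIDIA publishes:

```python
def estimated_switch_time(num_windows, num_gpus,
                          base_seconds=4.0, per_window_seconds=0.05):
    """Crude model: a fixed per-GPU setup cost plus a per-window migration cost,
    repeated for every GPU whose state has to be moved. The constants are
    illustrative guesses loosely matching our 9800 GTX / 9800 GX2 observations."""
    return num_gpus * (base_seconds + per_window_seconds * num_windows)

print(estimated_switch_time(40, 1))  # ~6s:  single-GPU 9800 GTX with 40 windows open
print(estimated_switch_time(40, 2))  # ~12s: dual-GPU 9800 GX2 (we measured up to 13s)
print(estimated_switch_time(40, 4))  # ~24s: why a 3- or 4-GPU system would feel worse
```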

The type of windows open doesn't seem to have an impact on the transition time, only the number of windows (and their associated memory footprint). The problem is that a dual-purpose machine (one used for both work and gaming) can easily have a large number of windows open, and waiting more than 10 seconds for anything to complete makes a system feel sluggish.

The power savings, however, were absolutely worth it; see for yourselves:

                                          Save Power Mode     Boost Performance Mode
                                          (dGPU Disabled)     (dGPU Enabled)
Total System Power Consumption (Idle)     115W                165W
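To put that 50W idle delta in perspective, here is a quick back-of-envelope calculation; the 8 hours of idle time per day and the $0.10/kWh electricity price are purely illustrative assumptions:

```python
idle_savings_watts = 165 - 115   # from the table above
idle_hours_per_day = 8           # assumption: typical desktop duty cycle
price_per_kwh = 0.10             # assumption: rough US residential rate

kwh_per_year = idle_savings_watts * idle_hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year saved, roughly ${kwh_per_year * price_per_kwh:.0f}/year")
# -> about 146 kWh/year, on the order of $15/year under these assumptions
```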


Since the mGPU is just as capable of decoding HD video as the dGPU in this case, it is possible to build a genuine gaming HTPC out of something like the nForce 780a. You no longer have to sacrifice performance to keep power consumption down; you can have a multi-GPU setup and still watch movies, thanks to HybridPower.

30" LCD Owners Need Not Apply

The ASUS M3N-HT Deluxe offers only an analog VGA output and a digital, single-link HDMI output. The problem with this configuration is that while it is possible to convert HDMI to DVI, there is no way to output a dual-link DVI signal. In other words, the resolutions required by 30" displays won't be reachable via the mGPU.
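The bandwidth math behind that claim is straightforward. The sketch below assumes the standard 165 MHz single-link TMDS pixel-clock limit and an approximate 12% blanking overhead (reduced-blanking timings; the exact overhead figure is our assumption) to show why a 30" panel's 2560x1600 resolution needs dual-link:

```python
SINGLE_LINK_MHZ = 165      # max TMDS pixel clock for single-link DVI (and HDMI of this era)
DUAL_LINK_MHZ = 2 * 165    # dual-link DVI doubles the available TMDS pairs

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.12):
    """Approximate pixel clock: active pixels per second plus ~12% blanking
    (reduced-blanking timings; the overhead figure is an assumption)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = required_pixel_clock_mhz(w, h, 60)
    verdict = "fits single-link" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{w}x{h}@60Hz: ~{clk:.0f} MHz -> {verdict}")
# 1920x1200@60Hz: ~155 MHz (fine over single-link)
# 2560x1600@60Hz: ~275 MHz (well past 165 MHz, dual-link only)
```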

If this assumption is correct and there really is no way to output a dual-link DVI signal from the mGPU (the reviewer's guide indicates only a single-channel integrated TMDS transmitter), then it almost entirely negates the point of combining HybridPower with 3-way SLI on this motherboard. Anyone investing a serious amount of money in graphics cards has good reason to invest in a 30" display as well, and as it stands such a display is unsupported by this platform unless it is driven directly off of the graphics cards themselves - in which case HybridPower won't work.

This is absolutely unacceptable and would prevent us from recommending the 780a as anything more than just another SLI motherboard. HybridPower is quite possibly the best feature for a high-end SLI user, and if it won't work with 30" displays its usefulness is severely degraded.

Unfortunately, there's no workaround here: NVIDIA simply chose wrong in leaving out dual-link DVI support, and we won't see the problem fixed until a new revision of the mGPU makes its way into later chipsets.

Comments

  • SiliconDoc - Wednesday, May 7, 2008 - link

    Maybe I'm the only one, but I'm so sick of every new PC component having a global warming psychotic power consumption "feature set", as if any of we end users actually give a d-a- you know what.
    Heck, maybe I'm lone gunman here, but it really makes me sick, over and over again, as if I'd buy their piece of crap because they have some wattage bean counter going gaga about their lower power requirements.
    Hey, here's an idea. STOP GAMING, lower yer rezz, use a tiny 13 inch monitor, and put monitor sleep mode to kick on in one minute.
    Better yet, shut your system off, you're wasting the earth, and get outside for heat from the sun or put on a wool sweater, or dunk ter head in the creek if you're too warm.
    Who are they fooling ? They're pushing 1,000 watt PS's, then we have to hear this beanny watt counter crud. Yeah, right after the Q6600, 4 HD's, 2 DVD's, memory card readers, dual monitor outputs, ohhh.. and make sure you got a 700 watt plus supergigajiggawatt or she might not run.....
    I for one would just like to say, to noone and nobody in particular, go take a green time out.
    PS- this article is no more or less green than any other, so it isn't a target. I guess it's clear this is a power surge and perhaps an overload. Well, good!
  • Donkey2008 - Wednesday, May 7, 2008 - link

    You are absolutely right, especially the application of this technology to notebooks, which is pure insanity. Why would I care if my laptop could switch from discrete to integrated GPU to save battery power and provide me another hour or so of use? I am trying to destroy the earth so I want as little battery life as possible so I can plug it in and use more resources.

    As for desktops, those crazy tree-huggers want you to use less power so that your systems run more efficient and PUT OUT LESS HEAT. This would be a complete waste for those who dropped several hundred dollars for water-cooling and giant, ridiculous, circus clown heatsinks. This isn't even mentioning the enviro-psychos who like to use their computer as a floor heater in winter.

    How about you take your finger out of your nose because it is clearly in too far and blocking your brain from releasing any common sense.
  • SiliconDoc - Wednesday, May 7, 2008 - link

    Why stop at that, you need the wind up power notebook, like the ones selling for the 3rd world. No plugging in and no charging any battery except by turning the crank handle.
    If you're gaming on a battery, it's not just your finger up your nose, but likely your neighbors as well, to hold it up so high. Where are you that you cannot plug in ... up in that airplane ... saving all that jet fuel ?... or did you drive your Yugo to some way out park to hack, away from civilization, also an energy saver, no doubt. Have fun disposing of the polluting battery, too.

    Desktops: If your system is putting out so much heat that you need to run a refrigerator to "cool just the system down", you certainly are not "saving" any power either.. DUH.
    Gigantic heatsinks (and their gargantuan fans) are power-hungry users trying to crank out the last bit of MHz, often with voltage increases, huh ... DUH. Maybe the jet engine they replaced was a cheap sleeve bearing, but they didn't "save power".

    Not sure exactly what the donkey you were trying to say, since you didn't make any sense, but then, that's what it's all about, huh. Preening your green self while wildly flailing about and praising the gigantic power (savings ? lol ) drain you are, anyway - while firing up the 250 watt triple 3d sli maxxed super whomper game.

    I think if you had any common sense, you'd "get it".


  • The Jedi - Wednesday, May 7, 2008 - link

    Jigga-WHAAAT?!
  • zander55 - Wednesday, May 7, 2008 - link

    Why is ESA only available on the highest end model? Nvidia wants the industry to adopt it and implement it in their hardware, but won't even put it into their own stuff?
  • crimsonson - Tuesday, May 6, 2008 - link

    I don't understand why so many pages and charts are devoted to pure performance for motherboards. Unless there is a physical flaw or bad drivers, the performance difference between these motherboards is normally next to nil!
    I understand stability, overclocking, and power consumption. But looking at these charts, a lot of them show minuscule differences that can often be explained by settings, other components, or bad drivers. I am not saying bench testing isn't useful, but I don't think it is necessary to view dozens of charts with little or no difference. In fact, it would make more sense to go into detail where there is a significant difference. I think your attention to detail gets the best of you :D

    My .02

    In general I do think you guys do awesome work.

  • wjl - Tuesday, May 6, 2008 - link

    Right. The benchmarks are not that interesting, and also which IGP runs which game at how many fps more or less is pretty uninteresting - as if the world had only gamers.

    As much as I like the image quality provided by Nvidia products, they're still a no-go if you want open source drivers - and here is much room for improvement. I won't buy (nor sell) any of them unless they follow the good examples of Intel and ATI/AMD.

    So my next mb - which will definitely have an IGP again - will be of the other mentioned makers, depending on whether I need an AMD or an Intel CPU next time.
  • strikeback03 - Thursday, May 8, 2008 - link

    I have to use the restricted drivers on both my desktop (discrete NVIDIA) and laptop (discrete ATi) in Ubuntu.

    And I've never understood the point of windows that wobble.
  • sprockkets - Tuesday, May 6, 2008 - link

    Tru, I love not having to install any drivers for compiz-fusion on my Intel G31 system. It actually runs it better than my 6150 AMD system.

    But, under load with movies and compiz and other stuff graphics wise running, the 6150 doesn't crap out as much.

    Good chipset, waiting for Intel's version. I have been an AMD person for a long time, but, for $70 a 2ghz Pentium Allendale works great for me.

    WTB a gen 2 Shuttle XPC in silver with either the G45 or Intel's. 3ghz Wolfdale will do nicely.
  • wjl - Wednesday, May 7, 2008 - link

    BTW: tried movies (MythTV) together with Compiz, and that really didn't look nice, even on my 6150/430 Nvidia. Only after switching off most or all desktop effects, the picture became more stable...
