Optimus Technology Revisited

In the year or so since NVIDIA’s Optimus Technology first entered the scene, I’ve been very positive about it. The number of titles where things absolutely fail to work is quite small, and even when there are glitches they are generally minor. So far, I have encountered three games where I have some sort of difficulty with Optimus, and there are workarounds of sorts for all three. If anyone has experience with other titles failing to work properly, please leave a comment or send me an email, as I’d love to verify whether Optimus is malfunctioning (and if it can be made to function) on games beyond what I’ve tested. In order of severity (worst to least), the games are:

Civilization V: In my experience so far, Arrandale-based Optimus laptops fail to run the DX10/11 mode properly. I have a Sandy Bridge + GT 540M laptop where the DX11 mode runs without a hitch, so perhaps the problem lies with the Arrandale IGP. Oddly enough, at least one Arrandale laptop managed to run the DX10 mode (slowly and perhaps some of the colors got munged), but that system has an older Intel driver so that might be the culprit. I'll try some different driver combinations next week, but so far the only way I’ve been able to run Civilization V on the U41JF is by choosing the DX9 mode.

3/30/2011 Update: NVIDIA has released a beta version of their 270 series driver, and I did a quick retest with it. The new driver fixed the issues with Civilization V’s DX10/11 mode on the U41JF, and presumably it will work on other laptops as well. This is why I prefer Optimus until we get more power-friendly dGPUs: driver updates can usually fix the problems. It’s also one of my concerns with companies that don’t have a huge compatibility testing group focused on gaming (i.e. Intel), because glitches like this happen on a regular basis. An earlier NVIDIA driver worked, then something got borked with Optimus + Civ5 DX11 in the 265 series rollout AFAICT, and now it’s fixed again.

Empire: Total War: This game appears to detect the Intel IGP and limit the available graphics settings based on its profiling of that GPU. Specifically, it doesn’t allow higher detail settings, limiting you to "Medium" or lower in several areas. The game itself runs properly on the NVIDIA GPU, so this is more a problem of maximum available fidelity.

Left 4 Dead 2: This game previously ran properly on several Optimus laptops (e.g. the Dell XPS L501x), but there’s apparently a newer bug involving Intel’s drivers. At present, if you set details to maximum (with or without AA), the game will exit when you try to load a level. The workaround is to set "Paged Memory Pool Available" to Low, which appears to reduce performance somewhat, but otherwise the game runs fine.

Given that none of the above issues are show-stoppers (unless you insist on using your DX10/11 GPU in Civ5), I still prefer the potentially improved battery life of Optimus over the glitch-free operation of a discrete-only solution. Long-term, I’d still like to see Optimus become unnecessary, but for that to happen we need discrete GPUs that can run light workloads (i.e. the Windows desktop) while using only 1-2W at most. Right now, a GTX 460M appears to idle at close to 10W, so we’re an order of magnitude away from that target. Perhaps even worse is AMD’s new HD 6970M, which idles at around 15W but jumps to more than 30W for basic H.264 playback (about twice what the GTX 460M uses for the same workload).
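To put those idle numbers in perspective, here’s a quick back-of-the-envelope battery-life model. The dGPU idle figures come from the measurements above; the 56Wh battery and 10W "rest of system" draw are round numbers I’ve assumed purely for illustration:

```python
# Back-of-the-envelope idle battery-life model. The dGPU idle-draw numbers
# come from the article; the battery capacity and rest-of-system draw are
# hypothetical round figures chosen for illustration only.
BATTERY_WH = 56.0      # assumed battery capacity (Wh)
SYSTEM_IDLE_W = 10.0   # assumed CPU + screen + chipset idle draw (W)

gpu_idle_w = {
    "IGP only (Optimus, dGPU powered off)": 1.0,   # the 1-2W target
    "GTX 460M idling": 10.0,                       # ~10W per the article
    "HD 6970M idling": 15.0,                       # ~15W per the article
}

def idle_runtime_hours(gpu_watts):
    """Hours of idle runtime given the GPU's contribution to total draw."""
    return BATTERY_WH / (SYSTEM_IDLE_W + gpu_watts)

for name, watts in gpu_idle_w.items():
    print(f"{name}: {idle_runtime_hours(watts):.1f} h")
```

Even with generous assumptions, an always-on 10-15W dGPU roughly halves idle runtime compared to shutting it off entirely, which is exactly the gap Optimus papers over today.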

To get to the point where dGPUs no longer incur a severe power penalty, AMD and NVIDIA will need to add a lot more clock speed options and power gating to their hardware. Frankly, there’s no reason why the HD 6970M should run its memory at 3.6GHz just for basic video playback (or sometimes just surfing the web). Similarly, the Windows desktop doesn’t need 192 or 384 CUDA cores from the 460M/485M, and it doesn’t need 960 Stream Processors either. A look at IGP performance suggests these chips could completely shut down (i.e. power gate) all but eight or so CUDA cores, or 40 Stream Processors, and still have more than sufficient performance for basic Windows and Internet tasks. And if you happen to watch a video where the GPU needs to do some work, there’s still no reason to power up all the RAM and GPU cores; figure out the minimum resources necessary for the task, power up just those areas of the chip, and we’d be set.
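The "power up only what the workload needs" idea above can be sketched in a few lines. Everything here is hypothetical: the state names, core counts, clocks, and wattages are invented for illustration and don’t describe any real GPU’s power management tables:

```python
# Illustrative sketch of minimal-resource power-state selection: given a
# workload's demands, pick the cheapest state that covers them and leave
# everything else power gated. All values below are invented examples.
from dataclasses import dataclass

@dataclass
class GpuState:
    name: str
    active_cores: int   # CUDA cores / Stream Processors left powered
    mem_clock_mhz: int  # memory clock in this state
    watts: float        # approximate board power in this state

STATES = [  # ordered from lowest power to highest
    GpuState("desktop-2d", 8, 135, 2.0),     # enough for the Windows desktop
    GpuState("video-decode", 48, 300, 6.0),  # decode block plus a few cores
    GpuState("3d-full", 192, 1250, 35.0),    # everything powered for gaming
]

def pick_state(cores_needed, mem_mhz_needed):
    """Return the cheapest state that satisfies the workload's demands."""
    for state in STATES:
        if (state.active_cores >= cores_needed
                and state.mem_clock_mhz >= mem_mhz_needed):
            return state
    return STATES[-1]  # nothing suffices: fall back to full power
```

With a table like this, desktop work would land in the 2W state and video playback in the 6W state, instead of today’s behavior where the full-power state (or something near it) handles everything.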

One look at Intel’s Sandy Bridge processors and HD 3000 Graphics has me convinced that all of the above is possible. Now we just need the companies to invest the time and resources into R&D, testing, validation, and drivers. Turbo Boost already does much of what we’re talking about, but dynamically altering GPU core and RAM speeds and shutting off cores/memory (via power gate transistors as opposed to just clock gating) is a complex task. I’m sure AMD’s Llano APU will use a lot more power gating on the GPU portion than what we’ve seen in discrete GPUs; however, until we get a lot more granularity NVIDIA’s Optimus Technology is a good way to completely power down the dGPU when the IGP will suffice.

Comments

  • veri745 - Monday, March 28, 2011 - link

    Now it's about time that they give the LCDs on these a resolution upgrade. I'd like to see at least 1600x900.
  • jrocks84 - Monday, March 28, 2011 - link

    I totally agree on higher resolution LCDs being needed! I haven't searched that hard, but the only two 13" laptops that I know of with a decent res are the Macbook Air and the Sony Vaio Z.
  • lexluthermiester - Tuesday, March 29, 2011 - link

    I have an Asus EEE 1201N with a 1366x768 res. It beats out my old VAIO, which was 1280x800. Now granted, the 1201N is only a dual-core Atom, but with a 12" screen and the fact that it will handle some moderate gaming, it packs a punch for its size. Battery life is far better as well.

    Of course we are talking about a $400 price point with the 1201n. But I guess the point I'm trying to make is that if you look into what it is you want good things can be found. And honestly, the system in this review would tempt me greatly if the 1201n didn't already meet my needs.... but oh so tempting....
  • ImSpartacus - Tuesday, March 29, 2011 - link

    I agree. I know many laptops will have to move to 16:9 for cost reasons, but why can't they just use 1600x900 as a baseline resolution?

    768 vertical pixels are unacceptable on anything but 11.6" displays.
  • blue_falcon - Monday, March 28, 2011 - link

    The industry is trending towards standard resolutions (HD at the moment for most systems). I doubt you'll see a 1600x900 13.3" screen.
  • Penti - Monday, March 28, 2011 - link

    Sony still has some, 13.1" 1600x900 laptops that is. Let's see if they get updated to Sandy Bridge too. If you want it you can have it, even though most use standard displays.
  • DLimmer - Monday, March 28, 2011 - link

    As usual, excellent laptop review. I relied on http://www.anandtech.com/show/2862/dell-studio-14z... a couple years ago when I bought my wife's laptop, and it still does all she asks of it *and* lasts all day on one charge (with intermittent use).
    I also grabbed a Gateway P-6831 based on http://www.anandtech.com/show/2490.

    Minor typos (first page third to last paragraph):
    "One the flipside, ASUS’ Super Hybrid Engine (SHE)" -> *On* the flipside

    (page 5, second paragraph from the bottom):
    "and it doesn’t need 960 Steam" -> *Stream*

    Thank you for providing objective and in-depth reviews we can use when selecting items to purchase.
  • ImSpartacus - Tuesday, March 29, 2011 - link

    I almost bought a 14z instead of my MBP13'09. It was on the thicker side, but had a massive battery and a full voltage processor.

    Eventually, I had to have that big trackpad and disk drive.

    In retrospect the decision was pretty murky.
  • DLimmer - Tuesday, March 29, 2011 - link

    My wife misses the DVD drive occasionally, but we have an external. It's most annoying when you install some software that requires the disc be in the computer to run. Only other time is when she wants to rip a new CD she's bought.

    All-in-all, giving up the drive for more battery life and less weight was a decent trade-off... however, she wants a drive in her next laptop.
  • Beenthere - Monday, March 28, 2011 - link

    How could you get it more wrong: Asus and Intel. It don't get any worse than that.
