NVIDIA Optimus Demonstration

So how well does Optimus actually work in practice? Aside from a few edge cases, which we'll mention in a moment, the experience was awesome. Some technophiles might still prefer manual control, but the vast majority of users will be extremely happy with the Optimus solution. You no longer need to worry about which graphics mode you're currently using, as the Optimus driver switches dynamically. Even when we ran several applications that could benefit from discrete graphics, we didn't encounter any anomalies. Load up a Flash video and the GPU turns on; load a CUDA application and the GPU stays on, even if you then close the Flash video. It's seamless, and it takes the guesswork out of GPU power management.
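That last behavior is effectively reference counting: the driver keeps track of every client that needs the discrete GPU and only cuts power once the last one exits. Here's a minimal sketch of that idea in Python; the class and method names are our own illustration, not NVIDIA's actual driver interface.

```python
# Minimal sketch of reference-counted GPU power gating, mirroring the
# behavior described above. All names are illustrative; this is not
# NVIDIA's driver API.

class GpuPowerManager:
    def __init__(self):
        self.clients = set()  # applications currently using the dGPU

    def acquire(self, app):
        """An app (Flash video, CUDA program, game) starts using the dGPU."""
        if not self.clients:
            self._power_on()
        self.clients.add(app)

    def release(self, app):
        """An app stops using the dGPU; power off only when none remain."""
        self.clients.discard(app)
        if not self.clients:
            self._power_off()

    def _power_on(self):
        print("dGPU powered on")

    def _power_off(self):
        print("dGPU powered off")


mgr = GpuPowerManager()
mgr.acquire("flash_video")  # GPU turns on
mgr.acquire("cuda_app")     # GPU stays on
mgr.release("flash_video")  # still on: the CUDA app remains
mgr.release("cuda_app")     # last client gone, GPU powers off
```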

NVIDIA provided a demonstration video showing second-generation switchable graphics compared to Optimus. We've uploaded the video to our server for your enjoyment. (Please note that QuickTime is required, and the sample video uses the H.264 codec so you'll need a reasonable CPU and/or GPU to view it properly.) At the Optimus Deep Dive, NVIDIA provided two other demonstrations of engineering platforms to show how well Optimus works. Sadly, we couldn't take pictures or record videos, but we can talk about the demonstrations.

The first demonstration was an open testbed notebook motherboard using engineering sample hardware. This definitely wasn't the sort of system you'd run at home: a notebook LCD connected to the motherboard via a standard notebook power/video cable, exposed hardware, etc. The main purpose was to demonstrate how quickly the GPU turns on/off, as well as the fact that the GPU is really OFF. NVIDIA started by booting up Win7, at which point the mobile GPU was off. A small LED on the GPU board would light up when the GPU was on, and the fans would also spin. After Windows finished loading, NVIDIA fired up a simple app on the IGP and nothing changed. Next they started a 3D app and the GPU LED/fan powered up as the application launched; when they shut down the application, the LED/fan powered back off. At one point, with the GPU powered off, NVIDIA removed the GPU module from the system and disconnected its fan; they again loaded a simple application to demonstrate that the system was still fully functional and running off the IGP. (Had they chosen to launch a 3D application at this point, the system would have obviously crashed.) So yes, the GPU in an Optimus laptop is really powered down completely when it's not needed. Very cool!

The second demonstration wasn't quite as impressive, since no one removed a GPU from a running system. This time, Lenovo provided a technology demonstration for NVIDIA showing power draw while running various tasks. The test system was an engineering sample 17" notebook, and we weren't given any details other than the fact that it had an Arrandale CPU and some form of Optimus GPU. The Lenovo notebook had a custom application showing laptop power draw, updating roughly once per second. After loading Windows 7, idle power sat at 17W. NVIDIA launched a 3D app on the IGP and power draw increased to 32W, but rendering performance was quite slow. Then they launched the same 3D app on the dGPU and power use hit 39W, but with much better 3D performance. After closing the application, power draw dropped right back to 17W in a matter of seconds. At present there's no word on if/when this Arrandale-based laptop will ship, but if Lenovo is already providing engineering sample hardware, it's a safe bet they'll ship Optimus laptops in the future.
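For perspective, here's a quick back-of-the-envelope battery life calculation from those readings. The 17W/32W/39W figures come from the demo; the 84Wh battery capacity is purely our assumption for illustration (typical of an 8-cell pack in a thin-and-light notebook), not a spec of the demo unit.

```python
# Rough battery life estimates from the demo's power readings.
# Wattages are from Lenovo's demonstration; the 84Wh capacity is an
# assumed figure for illustration, not a spec of the demo notebook.

BATTERY_WH = 84.0  # assumed battery capacity

readings = {
    "idle (dGPU off)": 17.0,  # watts
    "3D app on IGP":   32.0,
    "3D app on dGPU":  39.0,
}

for scenario, watts in readings.items():
    hours = BATTERY_WH / watts
    print(f"{scenario:16s} {watts:4.1f}W -> ~{hours:.1f} hours")
```

With our assumed 84Wh pack, that works out to roughly five hours at idle versus just over two while gaming on the dGPU; the point of Optimus is that you only pay the 39W cost while the 3D application is actually running.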

The final "demonstration" is more in line with what we like to see. Not only did NVIDIA show us several running Optimus laptops, but they also provided each of the attendees with an ASUS UL50Vf sample for review. The UL50Vf should be available for purchase today, and it sounds like the only reason NVIDIA delayed the Optimus launch until now was so that they could have hardware available for end-user purchase. The final part of our Optimus overview will be a review of the ASUS UL50Vf.

Comments

  • jfmeister - Tuesday, February 9, 2010

    I was anxious to get an M11x, but two things were bothering me:
    1- No DirectX 11 compatibility
    2- No Core i5/i7 platform.

    Now there is another reason to wait for the refresh. But with Arrandale prices dropping, DX11 cards available, and Optimus, I would expect Alienware to get on the bandwagon fast with a new M11x platform and not wait 6 to 8 months for a refresh. This ultraportable is intended for gamers, and we all know that gamers are on top of these things. Optimus in the M11x's case should be a must.

    BTW, what I find funny is that Optimus looks like a revolution, but what about 3dfx 10 years ago with their 3D card add-on (Monster 3D 8MB FTW)? Switching was used back then... This looks like the same thing, except with HD video support! It took that long to come up with this?
  • JarredWalton - Tuesday, February 9, 2010

    Remember that the switching back in the days of 3dfx was just in software and that the 3D GPU was always powered. There was the dongle cable situation as well. So the big deal here isn't just switching to a different GPU, but doing it on-the-fly and powering the GPU on/off virtually instantly. We think this will eventually make its way into desktops, but obviously it's a lot more important for laptops.
  • StriderGT - Tuesday, February 9, 2010

    My take on Optimus:

    Optimus' roots lie in Hybrid SLI.
    Back then it was advertised as an NVIDIA-only chipset feature (NVIDIA IGP + NVIDIA GPU) for both desktops and notebooks.

    Currently NVIDIA is being rapidly phased out of x86 PC chipsets, so Optimus is the only way to at least put an NVIDIA GPU in an Intel IGP-based system, but:

    1. The only real benefit is gaming performance without sacrificing battery life in notebooks.
    2. Higher cost (in the form of the discrete GPU); Intel has 60%+ of the GPU market (as IGPs) because the vast majority either don't care or are uninformed about gaming performance.
    3. CUDA/PhysX are currently, and for the foreseeable future, irrelevant for mobile applications (gaming is much more relevant by comparison).
    4. Video decoding capabilities are already present in most current IGPs (except Pine Trail netbooks, which can acquire them with a cheaper dedicated chip).
    5. Netbooks will not benefit from Optimus because they lack the CPU horsepower to feed the discrete GPU and are very cost sensitive (the same reason ION1/2 isn't the primary choice for netbook builders).
    6. In the desktop space, only niche small form factor PC applications could benefit from such a technology, e.g. an SFF PC would need less cooling/noise during normal (IGP) operation and could become louder and more powerful while gaming (GPU).
    7. Idle/2D power consumption of most modern desktop GPUs is already so low that the added complexity of a simultaneously working onboard IGP and the associated software offers no benefit.
    8. Driver/application software problems might arise from the complexity of profiles and the vastly different application workload scenarios.

    So in the end it boils down to how NVIDIA can convince the world that a discrete GPU and its added cost are necessary in every portable device (netbook-sized and up) out there. On the desktop side it will be even more difficult to push such a thing, with only noise reduction in small form factor PCs being of interest.

    BTW, at least now the manufacturers won't have any more excuses for the lack of a decent GPU in some of the cheaper notebook models ($500-$1000) on battery life grounds.
    Oh well, I'll keep my hopes low after this being a niche market for so long, since they might find some other excuse along the lines of the weight and space required for cooling the GPU during A/C operation... :-(

    PS: Initially posted on the Yahoo Finance forum.
  • Zoomer - Tuesday, February 9, 2010

    Not like it was really necessary; the Voodoo 2 used maybe 25W (probably less) and was meant for desktop use.
  • jfmeister - Tuesday, February 9, 2010

    Good point! I guess I didn't take the time to think about it. I was more into the concept than the whole technical side that you brought up.

    Thanks!

    JF
  • cknobman - Tuesday, February 9, 2010

    Man, the M11x was the biggest disappointment out there. A weak-sauce last-gen processor on a so-called premium, high-end gaming brand? I'll consider it once they get an Arrandale CULV and Optimus, 'cause right now, looking at the notebookreview.com forums, it's manual switchable graphics, not Optimus.
  • crimson117 - Tuesday, February 9, 2010

    Which processor should they have used, in your opinion?
  • cknobman - Tuesday, February 9, 2010

    They should have waited another month to market and used the Core i7 ULV processors. There are already a few vendors using this proc (Panasonic is one).
  • Wolfpup - Tuesday, April 20, 2010

    Optimus is impressive software, but personally I don't want it, ever. I don't want Intel graphics on my CPU. I don't want Intel graphics in my memory controller. I don't want Intel graphics. I want my real GPU to be my real GPU, not a helper device that renders something that gets copied over to Intel's graphics.

    I just do not want this. I don't like having to rely on profiles either; thankfully you can manually add programs, but still.
