Back in December, NVIDIA contacted me to let me know that something "really big" was coming out in the near future. It's January 24 as I write this, and tomorrow is the "Optimus Deep Dive" event, an exclusive event with only 14 or so of the top technology websites and magazines in attendance. When Sean Pelletier from NVIDIA contacted me, he was extra excited this time and said something to the effect of, "This new technology is pretty much targeted at you, Jarred… when I saw it, I said, 'We should just call this the Jarred Edition of our mobile platform.' We can't go into details yet, but basically it's going to do all of the stuff that you've been talking about for the past couple of years." With a statement like that, you can understand how it got the gears in my head churning. What exactly have I been pining for in terms of mobile GPUs of late? So in advance of the unveiling of their latest technologies and products, I thought I'd put down what I really want to see, and then we'll find out how well NVIDIA has matched my expectations.

I've put together my thoughts before getting any actual details from NVIDIA; I'll start with those, but of course NDAs mean that you won't get to read any of this until after the parts are officially announced. Page two will begin the coverage of NVIDIA's Optimus announcement, but my hopes and expectations will serve as a nice springboard into the meat of this article. They set my expectations pretty high back in December, which might come back to haunt them….

First off, if we're talking about a mobile product, we need to consider battery life. Sure, there are some users who want the fastest notebook money can buy—battery life be damned! I'm not that type of user. The way I figure it, the technology has existed for at least 18 months to offer a laptop that provides good performance when you need it, but can also power down unnecessary devices and deliver upwards of six hours of battery life (eight would be better). Take one of the big, beefy gaming laptops with an 85Wh (or larger) battery: if you shut down the discrete GPU and limit the CPU to moderate performance levels, you ought to get a good mobile solution as well as something that can power through tasks when necessary. Why should a 9-pound notebook be limited to just 2 hours (often less) of battery life?
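The battery math above is simple enough to sanity-check. Here's a quick sketch; the 85Wh capacity and the runtime figures come from the paragraph above, while the specific wattage targets are just illustrative:

```python
# Back-of-the-envelope battery math: capacity (Wh) / average draw (W) = runtime (h).
# The 85Wh pack and the runtimes are from the article; the draw figures are
# illustrative targets, not measurements.

def runtime_hours(capacity_wh, avg_draw_w):
    """Estimated battery life given pack capacity and average system draw."""
    return capacity_wh / avg_draw_w

def max_draw_watts(capacity_wh, target_hours):
    """Average draw the whole system must stay under to hit a runtime target."""
    return capacity_wh / target_hours

capacity = 85  # Wh, a typical big gaming-notebook pack

# A 2-hour gaming laptop is averaging around 42.5W...
print(runtime_hours(capacity, 42.5))   # 2.0 hours
# ...but six hours only requires holding the system average near 14W:
print(max_draw_watts(capacity, 6))     # ~14.2W
print(max_draw_watts(capacity, 8))     # ~10.6W for the 8-hour stretch goal
```

In other words, hitting six-plus hours on a gaming laptop isn't about exotic technology; it's about getting the whole platform to idle in the low teens of watts when the discrete GPU and most of the CPU are asleep.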

What's more, not all IGPs are created equal, and it would be nice if only certain features of a discrete GPU could power up when needed. Take video decoding as an example. The Intel Atom N270/280/450 processors are all extremely low power CPUs, but they can't provide enough performance to decode 1080p H.264 video. Pine Trail disappointed us in that respect, though Broadcom's Crystal HD chips are supposed to provide the missing functionality. Well, why can't we get something similar from NVIDIA (and ATI for that matter)? We really expect any Core i3/i5 laptop shipped with a discrete GPU to properly support hybrid graphics, and the faster a system can switch between the two ("instantly" being the holy grail), the better. What we'd really like to see is a discrete GPU that can power up just the video processing engine while leaving the rest of the GPU off (e.g. via power gate transistors or something similar). If the video engine on a GPU can do a better job than the IGP while using only a couple of watts, that would be much better than software decoding on the CPU. Then again, Intel's latest HD Graphics may make this a moot point, provided they can handle 1080p H.264 content properly (including Flash video).

Obviously, the GPU is only part of the equation, and quad-core CPUs aren't an ideal solution for such a product, unless you can fully shut down several of the cores and prevent the OS from waking them up all the time. Core i3/i5/i7 CPUs have power gate transistors that can at least partially accomplish this, but the OS side of things certainly appears to be lagging behind right now. If I unplug and I know all I'm going to be doing for the next couple of hours is typing in Word, why not let me configure the OS to temporarily disable all but one CPU core? What we'd really like to see is a Core i7 type processor that can reach idle power figures similar to Core 2 Duo ULV parts. Incidentally, I'm on a plane writing this in Word on a CULV laptop right now; my estimated battery life remaining is a whopping 9 hours on a 55Wh battery, and I have yet to feel the laptop is "too slow" for this task. We haven't reached this state of technology yet, and NVIDIA isn't going to announce anything that would affect this aspect of laptops, but since they said this announcement was tailored to meet my wish list I thought I'd mention it.
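For what it's worth, the closest thing to "disable all but one core" that you can do from user space today is restricting a process's scheduler affinity. Here's a rough Linux-only sketch using Python's standard library (`os.sched_setaffinity`); note that this only confines a process's threads to one core, it does NOT power-gate the idle cores, which is exactly the missing piece:

```python
# A rough per-process approximation of "all but one core", using Linux's
# scheduler-affinity interface (Python 3.3+, Linux only). This only confines
# a process's threads to a single core; it does NOT power down the other
# cores, which is the part the OS/hardware side still needs to deliver.
import os

def pin_to_one_core(pid=0, core=0):
    """Restrict the given process (0 = the calling process) to one CPU core."""
    os.sched_setaffinity(pid, {core})
    return os.sched_getaffinity(pid)

if __name__ == "__main__":
    before = os.sched_getaffinity(0)
    print("schedulable cores before:", sorted(before))
    print("schedulable cores after: ", sorted(pin_to_one_core()))
    os.sched_setaffinity(0, before)  # restore the original affinity mask
```

Keeping the scheduler off the extra cores at least gives the hardware's C-states a chance to kick in; what we're asking for is an OS that does this, and the accompanying power gating, automatically on battery.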

Another area totally unrelated to power use but equally important for mobile GPUs is the ability to get regular driver updates. NVIDIA first discussed plans for their Verde Notebook Driver Program back at the 8800M launch in late 2007. We first discussed this in early 2008, but it wasn't until December 2008 that we received the first official Verde driver. At that time, the reference driver was only for certain products and operating systems, and it was several releases behind the desktop drivers. By the time Windows 7 launched last fall, NVIDIA managed to release updated mobile drivers for all Windows OSes with support for their 8000M series and newer hardware, and this was done at the same time and with the same version as the desktop driver release. That pattern hasn't held in the months following the Win7 launch, but our wish list for mobile GPUs would definitely include drivers released at the same time as the desktop drivers. With NVIDIA's push on PhysX, CUDA, and other GPGPU technologies, linking the driver releases for both mobile and desktop solutions would be ideal. We can't discuss AMD's plans for their updated ATI Catalyst Mobility just yet, but suffice it to say ATI is well aware of the need for regular mobile driver updates and they're looking to dramatically improve product support in this area. We'll have more to say about this next week.

Finally, the last thing we'd like to see from NVIDIA is less of a gap between mobile and desktop performance. We understand that the power constraints on laptops inherently limit what you can do, and we're certainly not suggesting anyone try to put a 300W (or even 150W) GPU into a laptop. However, right now the gap between desktop and mobile products has grown incredibly wide—not so much for ATI, but certainly for NVIDIA. The current top-performing mobile solution is the GTX 280M, but despite the name this part has nothing to do with the desktop GTX 280. Where the desktop GTX 285 is now up to 240 shader cores (SPs) clocked at 1476MHz, the mobile part is essentially a tweaked version of the old 8800 GTS 512. Mobile parts currently top out at 128 SPs running at up to 1500MHz (1463MHz on the GTX 280M), which works out to a bit more than half the theoretical performance of the desktop part with the same name. The bandwidth side of things isn't any better, with around 159GB/s for the desktop and only 61GB/s for notebooks.
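The "bit more than half" figure is easy to verify using the 3 FLOPs per SP per clock (dual-issue MAD plus MUL) that NVIDIA quotes as the peak for these architectures; treat these as marketing-peak numbers rather than achievable throughput:

```python
# Peak-throughput check for the GTX 285 vs. GTX 280M comparison, using
# NVIDIA's quoted 3 FLOPs per SP per clock (MAD + MUL) for these generations.
def peak_gflops(sps, shader_clock_mhz, flops_per_clock=3):
    return sps * shader_clock_mhz * flops_per_clock / 1000

desktop_gtx285 = peak_gflops(240, 1476)   # ~1063 GFLOPS
mobile_gtx280m = peak_gflops(128, 1463)   # ~562 GFLOPS

print(round(desktop_gtx285), round(mobile_gtx280m))
print(round(mobile_gtx280m / desktop_gtx285, 2))  # ~0.53 -- "a bit more than half"

# Memory bandwidth tells an even worse story:
print(round(61 / 159, 2))  # the notebook part gets ~0.38 of the desktop bandwidth
```

So the mobile flagship offers roughly 53% of the desktop flagship's peak shader throughput and well under 40% of its memory bandwidth, despite carrying nearly the same name.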

As we discussed recently, NVIDIA is all set to release Fermi/GF100 for desktop platforms in the next month or two. Obviously it's time for a new mobile architecture, but what we really want is a mobile version of GF100 rather than a mobile version of GT200. One of the key differences is the support for DirectX 11 on GF100, and with ATI's Mobility Radeon 5000 series already starting to show up in retail products, NVIDIA is behind the eight ball in this area. We don't have a ton of released or upcoming DX11 games just yet, but all things being equal we'd rather have DX11 support than not. Considering Fermi looks to be a beast in terms of power consumption, we're obviously going to need to make some performance sacrifices in order to keep power in check. GF100 looks to have several parts with varying numbers of SPs, so it may be as simple as cutting the number of SPs in half and toning down the clock rates. Another option is that perhaps NVIDIA can take a hybrid approach and tack DX11 features onto the G92 or GT200 architecture rather than reworking GF100 into a mobile product. Whatever route they take, NVIDIA really needs to maintain feature parity with ATI's mobile products, and right now that means DX11 support.

So, that's my wish list right now. I don't ask for much, really: give me mobile performance that has feature parity with desktop parts, with a moderate performance hit in order to keep maximum power requirements in check, and do all that with a chip that's able to switch between 0W power draw and normal power requirements in a fraction of a second as needed. Simple! Now it's time to begin coverage of the actual presentation and find out exactly what NVIDIA is announcing. So turn the page and let's delve into the latest and greatest mobile news from NVIDIA.


  • JarredWalton - Tuesday, February 09, 2010 - link

    You can manually set applications to only use the IGP instead of turning on the dGPU, but to my knowledge there's no way to completely turn off the dGPU and keep it off. Of course, when the GPU isn't needed it is totally powered off, so you don't lose battery life unless you start running apps that want to run on the GPU.
  • macroecon - Tuesday, February 09, 2010 - link

    Well, I was getting ready to pull the trigger over the weekend to buy a UL30Vt, but I'm glad that I waited. While this is not a revolutionary feature, it does make laptops that lack it less valuable in my opinion. The video that Jarred posted toward the end of the article really demonstrates the value of on-the-fly GPU switching. I think I'll wait a bit longer for Optimus, and also a DirectX 11 NVIDIA GPU, to hit the market. Thanks for the coverage Jarred!
  • lopri - Tuesday, February 09, 2010 - link

    Not to rain on NV's parade, but I'd much prefer it if Optimus did its thing 100% in hardware. In an ideal world a software solution can do the same job as a hardware solution, but I've seen some caveats with software solutions - on desktops, admittedly. Instead of trying to 'detect' the apps, it could detect 'loads' and take care of things in hardware.

    Some might know what I'm talking about.
  • JarredWalton - Tuesday, February 09, 2010 - link

    The only problem with this is that software is needed to work between Intel and NVIDIA hardware. There's also a concern about what happens if you want something to NOT run on the dGPU (for testing purposes or to save battery life). With IGPs reaching the point where they can handle most video tasks, you wouldn't want to power up the dGPU to do H.264 decoding, as power requirements would jump several watts.

    Of course, if you could have NVIDIA IGP and dGPU it might be possible to do more on the hardware side, but Arrandale, Pineview, Sandy Bridge, etc. make it very unlikely that we would see another NVIDIA IGP any time soon.
  • acooke - Tuesday, February 09, 2010 - link

    OK, so this is awesome (particularly with Lenovo and CUDA mentioned). But how is the encrypted profile update driver yadda yadda stuff going to work with Linux?

    I'm a software developer, I work with CUDA (OpenCL actually), and I use Linux. NVIDIA should worry about people like me because we're the motor behind the take-up of Fermi, which is going to be a significant source of cash for them. Currently I can do very basic OpenCL development while on the road with my laptop using the AMD CPU driver (despite having Intel/Lenovo hardware), but being able to use a GPU would be a huge improvement (it's not that much fun running GPU code on a CPU!).
  • darckhart - Wednesday, February 10, 2010 - link

    yes, i'm curious about this also.
  • room1oh1 - Tuesday, February 09, 2010 - link

    I hope they don't fit any brakes into a laptop!
  • MonkeyPaw - Tuesday, February 09, 2010 - link

    Yeah, it's rather unfortunate that they said it should work like a hybrid, and they have the picture of a 2010 Prius in the slide. Just goes to show that car analogies don't work! They could have just drawn the parallel to your laptop battery--when you unplug the laptop, it starts using the battery with no user intervention.
  • horseeater - Tuesday, February 09, 2010 - link

    Switchable graphics are nice, but I want external gfx cards (or enclosures for desktop gfx cards) for laptops. Just plug it in when you're home, kill precious time playing useless junk, and use the igp when on the road.

    That being said, UL80-vt is reportedly awesome, and improvements are surely welcome, if they don't up the price.
  • synaesthetic - Wednesday, February 10, 2010 - link

    I want external GPUs also, but I want one that can use the laptop's LCD display rather than forcing me to plug in an external display. After all, external displays aren't portable, but a ViDock isn't terribly large.
