Optimus, Prime?

"The Autobots called… they want their leader back!" Transformers jokes aside, it's amazing how much the computing landscape can change in such a short amount of time. A few months ago, switchable graphics seemed like a great technology that not enough companies were interested in using. Now, switchable graphics tastes like leftovers from last week. All things being equal, who would want to spend the same amount of money for something that doesn't work nearly as well? Certainly I wouldn't, so once more NVIDIA has put their competition in a tough situation. They did this with the Verde notebook driver program, and now just when AMD looks like they're catching up—DirectX 11 and switchable graphics are starting to appear in a few laptops, and AMD is set to announce their own updated mobile driver program—NVIDIA goes and raises the bar.


As far as the ability to switch quickly between discrete and integrated graphics is concerned, it's difficult to imagine anything that still needs improvement. If you're in the market for a new laptop and you want a discrete GPU, then you definitely want an Optimus laptop. Optimus works well and is generally transparent to the end user; during our testing we didn't encounter any serious problems, and certainly nothing that would make us hesitant to recommend purchasing an Optimus laptop. That doesn't mean there aren't other areas that could be improved, however.

One item that we would still like to see NVIDIA address is their driver release schedule. Specifically, we would love to see desktop and mobile drivers released at the same time. The far bigger concern we have right now is that NVIDIA's mobile GPUs are currently "last gen" technology; ATI has already started to ship Mobility Radeon 5000 GPUs, and you can find laptops using them online quite easily. DX11 is out, NVIDIA is ready to begin shipping DX11 desktop GPUs, and we are still using mobile GPUs that are based on the old G9x architecture. We expect NVIDIA to release their next mobile architecture in the not-too-distant future and it will certainly include Optimus, but we don't know when exactly this will happen and we don't know how fast the next-generation parts are going to be. If the new parts are DX11 and they launch sooner rather than later, the initial Optimus laptops are going to have a very short shelf life.

NVIDIA is pitching Optimus as a solution that will be available top to bottom. Whether you want a netbook, a laptop, or a high-end gaming notebook, there will be some form of Optimus available. We remain skeptical about the market for netbooks with discrete graphics; ION made sense when it was an IGP, but a next-generation ION as a discrete GPU paired with Intel's IGP is going to be a tough sell. After all, Atom is already a serious bottleneck for the 9400M, so anything faster goes to waste. Since netbooks are focused primarily on low-cost markets, increasing manufacturing costs by even $25 to $50 is going to be difficult to justify. The best way to sell such a product would be to make sure all the other features are up to snuff: give us great build quality, a high-contrast LCD panel, and a keyboard and touchpad that don't suck, and we'd at least be willing to pay more than $300.

Another area where Optimus may not be necessary is video decoding. Intel's IGPs have been the whipping boy of graphics hardware for some time, but while they certainly aren't a great gaming solution, the GMA 4500MHD generally works well for video decoding and the new HD Graphics improves the situation quite a bit. If Intel can get Flash 10.1 decode acceleration to work, the only people who will need discrete GPUs are gamers and anyone using CUDA apps (and DirectCompute when it starts to gain traction). Do you spend $600 for a decent CULV laptop or $800 for an Optimus laptop where the major advantage is gaming performance? There are going to be plenty of users that don't want to spend the extra money.

Update: There were a few questions raised by readers, and I have asked NVIDIA for comment. I'm putting this here as both are technically negatives, although the likelihood of either being a major concern is quite small—they will only affect a small subset of users. First, if you want to drive an external LCD at 120Hz you will need a laptop with dual-link DVI (or equivalent), and that's not likely to be present on GS45-based implementations like the UL50Vf. 3D Vision isn't likely to get support initially either, though you would want something faster than the G210M for that regardless. Some of you asked about Linux support for Optimus, and if your usage plans fall into that category we have bad news: at present, NVIDIA is focusing on Win7 for Optimus support. We could see Linux support (and perhaps OS X as well) down the road, but right now it doesn't exist. We're not even sure what would happen if you try to install Linux on an Optimus laptop, so we're going to check and post the results. Our guess is that it will either run off the IGP exclusively or simply fail; given that Optimus involves a complex software element, the dGPU certainly won't work under Linux right now.

Update #2: Sorry this took so long, but I did go back and test what happens with Optimus laptops and Linux. Even using an older 9.04 installation of Ubuntu, everything works as expected... which is to say, you can run Linux but you won't get the Optimus GPU. Since no display is connected directly to the dGPU, Optimus is required to move rendered content from the G210M to the GMA 4500MHD display buffer. As noted above, NVIDIA is currently focusing their efforts on making Optimus work on Windows 7; it's possible they will move to other OSes down the road, but they are not committed to doing so. If you want to run Linux and you want to use a discrete GPU, Optimus Technology won't work. Perhaps some skilled Linux community folks can figure out a way to do Optimus-like technology on their own, but given the level of detail required to interface with the GPU and IGP we see that as unlikely.
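For anyone who wants to see what their own Linux install enumerates, filtering lspci output for display adapters is the quickest sanity check: on an Optimus laptop both GPUs show up on the bus even though only the IGP drives the panel. Here's a minimal sketch; the sample output below is illustrative, not captured from the UL50Vf:

```python
# Minimal sketch: filter lspci-style output for display adapters.
# On an Optimus laptop you would typically see both the Intel IGP and
# the NVIDIA dGPU listed, even though only the IGP is wired to a display.
import re

def display_adapters(lspci_output):
    """Return the lines that describe VGA/3D/display controllers."""
    pattern = re.compile(r"vga|3d controller|display", re.IGNORECASE)
    return [line for line in lspci_output.splitlines() if pattern.search(line)]

# Illustrative sample, not actual captured output.
sample = """00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset (GMA 4500MHD)
01:00.0 3D controller: nVidia Corporation GT218 [GeForce G210M]
02:00.0 Ethernet controller: Atheros Communications AR8131"""

for adapter in display_adapters(sample):
    print(adapter)
```

In practice you'd pipe the real `lspci` output into something like this (or just `lspci | grep -i vga`); the point is that the dGPU being visible on the bus doesn't mean the Linux driver stack can actually use it.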

As long as you're in the market for a laptop with a discrete GPU, however, Optimus is definitely where it's at. When we first tried switchable graphics a couple years ago, we said it was an awesome idea and recommended all future laptops try to include the technology. Apparently, manufacturers weren't interested. We liked the improved implementations even more last year, and yet plenty of Core i3/i5 laptops are shipping without switchable graphics. Ugh. With Optimus, there's no excuse left for skipping out on switchable graphics. It's seamless, it doesn't add to the cost or R&D efforts (other than a small fee to NVIDIA that can easily be absorbed into the GPU cost), and even technophobes can benefit—whether they know their laptop is switching between GPU modes or not. So we say again: no NVIDIA-based laptop going forward should ship without Optimus. In fact, if a manufacturer has a design in the works that's almost ready for market and it uses an NVIDIA discrete GPU without Optimus, please do us a favor and rework the product.

If you're interested in the full NVIDIA presentation, here's a complete gallery of the slides.

What about the ASUS UL50Vf?

As far as laptops go, the UL50Vf is a decent design but it's definitely not our favorite product ever. ASUS deserves some praise for being first, and the best part of the laptop is the Optimus technology we just finished praising. However, Optimus is set to launch in numerous other laptops in the near future and we expect some of the upcoming models will surpass the UL50Vf in several areas. Arrandale models will offer improved performance, and we should see 13.3" laptops with the same components as the UL50Vf. We should also see 15.6" laptops with a faster GPU, and Alienware's M11x has a GT335M in an 11.6" chassis—though it appears the M11x uses the older gen2 switchable graphics.

As for the UL50Vf, the keyboard is easy to type on and the construction is decent as well. We just wish ASUS would stop with the glossy black plastic shells, and we also wish the LCD was better. Battery life appears to have dropped somewhat compared to the 14" UL80Vt, but if you want Optimus right now and you're okay with a 15.6" chassis, the ASUS UL50Vf is a good laptop that should serve you well. We're not sure of the MSRP, but we expect it to match the previous generation UL50Vt—after all, Optimus is supposed to make the design and component costs lower, right? If you're looking for something else, here are a few more ASUS Optimus laptops that might suit you better.

Gallery: ASUS UL30Jc

Right now, the ASUS UL30Jc is the laptop we're really looking forward to testing. It's not a ULV design, but it does use Core i3-350M (potentially i5/i7 in some models), it ditches the glossy black plastic, and it has a 13.3" chassis and LCD. The GPU is the G310M, but performance is about the same as G210M (1530MHz SP clock instead of 1500MHz, and still 16 SPs). Some models will even have Blu-ray support—though the price is likely to be $1000+ in that case. Anyway, feast your eyes on the above pictures and tell me that doesn't look a ton better than the UL50Vf. Brushed aluminum on the palm rest as well as the LCD? I think I've reached Nerdvana. It's too bad they still appear to have a glossy black bezel around the display, but it might not be too late to provide a second model with a matte bezel. How about it, ASUS?
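To put that clock bump in perspective, here's a quick back-of-the-envelope calculation of theoretical shader throughput. It assumes the commonly cited 3 FLOPs per SP per clock (MAD + MUL) for G9x-class shaders—treat that as an approximation rather than an official figure:

```python
# Back-of-the-envelope shader throughput for the G210M vs. G310M.
# Assumes 3 FLOPs per SP per clock (MAD + MUL), the figure commonly
# cited for G9x-class shaders -- an approximation, not an official spec.

def shader_gflops(num_sps, sp_clock_mhz, flops_per_clock=3):
    """Theoretical single-precision GFLOPS."""
    return num_sps * sp_clock_mhz * flops_per_clock / 1000.0

g210m = shader_gflops(16, 1500)  # 72.0 GFLOPS
g310m = shader_gflops(16, 1530)  # 73.44 GFLOPS
print(f"G210M: {g210m:.1f} GFLOPS, G310M: {g310m:.1f} GFLOPS "
      f"({(g310m / g210m - 1) * 100:.0f}% faster)")
```

A 2% theoretical difference, in other words—which is why we say performance is "about the same."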

Besides the UL30Jc and the UL50Vf, we know ASUS has at least two other Optimus laptops in the works. The N61JV is a 16" chassis available in brown or white, with an Arrandale (Core i3/i5) processor. We're still waiting for full details on which GPU it uses, but hopefully it's at least a G220M or faster (GT335M would be great!). The N61 should arrive shortly and we'll review it as soon as we can. The N82JV is the other Optimus laptop we know ASUS has coming, and it appears to be a 14" model that presumably builds off the UL80Vt design. It's brown instead of glossy black, but we don't know any details of the remaining hardware. We'd love for it to be an Arrandale CULV design, but we'll have to wait a bit longer to find out.


  • Hrel - Tuesday, February 9, 2010 - link

    Now that I've calmed down a little I should add that I'm not buying ANY gpu that doesn't support DX11 EVER again. We've moved past that; DX11 is necessary; no exceptions.
  • JarredWalton - Tuesday, February 9, 2010 - link

    I'm hoping NVIDIA calls me in for a sooper seekrit meeting some time in the next month or two, but right now they're not talking. They're definitely due for a new architecture, but the real question is going to be what they put together. Will the next gen be DX11? (It really has to be at this point.) Will it be a tweaked version of Fermi (i.e. cut Fermi down to a reasonable number of SPs), or will they tack DX11 functionality onto current designs?

    On a different note, I still wish we could get upgradeable notebook graphics, but that's probably a pipe dream. Consider: NVIDIA makes a new mGPU that they can sell to an OEM for $150 or something. OEM can turn that into an MXM module, do some testing and validation on "old laptops", and sell it to a customer for $300 (maybe even more--I swear the markup on mobile GPUs is HUGE!). Or, the OEM could just tell the customer, "Time for an upgrade" and sell them a new $1500 gaming laptop. Do we even need to guess which route they choose? Grrr....
  • Hrel - Tuesday, February 9, 2010 - link

    It STILL doesn't have a screen with a resolution of AT LEAST 1600x900!!! Seriously!? What do I need to do? Get up on roof tops and scream from the top of my lungs? Cause I'm almost to that point. GIVE ME USEABLE SCREENS!!!!!!!
  • MrSpadge - Wednesday, February 10, 2010 - link

    Not everyone's eyes are as good as yours. When I asked some 40+ people if I got the location right and showed it to them via Google Maps on my HTC Touch Diamond, they refused to even think about it without their glasses.
  • strikeback03 - Thursday, February 11, 2010 - link

    I've never had people complain about using Google Maps on my Diamond. Reading text messages and such yes, and for a lot of people forget about using the internet since they have to zoom the browser so far in, but the maps work fine.
  • GoodRevrnd - Tuesday, February 9, 2010 - link

    Any chance you could add the Macbook / Pro to the LCD quality graphs when you do these comparisons?
  • JarredWalton - Tuesday, February 9, 2010 - link

    Tell Anand to send me a MacBook for testing. :-) (I think he may have the necessary tools now to run the tests, but so far I haven't seen any results from his end.)
  • MrSpadge - Tuesday, February 9, 2010 - link

    Consider this: Fermi and the following high-end chips are going to be beasts, but they might accelerate scientific / engineering apps tremendously. But if I put one into my workstation it's going to suck power even when not in use. It's generating noise, it's heating the room and making the air stuffy. This could easily be avoided with Optimus! It's just that someone has to ditch the old concept of "desktops don't need power saving" once and for all. 20 W for an idle GPU is not OK.

    And there's more: if I run GP-GPU the screen refresh often becomes sluggish (see BOINC etc.) or the app doesn't run at full potential. With Optimus I could have a high-performance card crunching along, on work or BOINC or whatever, and still get a responsive desktop from the IGP!
  • Drizzt321 - Tuesday, February 9, 2010 - link

    Is there a way to set this to specifically only use the IGP? So turn off the discrete graphics entirely? Like if I'm willing to suffer lower performance but need the extra battery life. I imagine if I could, the UL50Vf could equal the UL80Vt pretty easily in terms of battery life. I'm definitely all for the default being Optimus turned on... but let's say the IGP is more efficient at decoding that 720p or 1080p video, yet NVIDIA's profile says you gotta fire up the discrete GPU. There goes quite a bit of battery life!
  • kpxgq - Wednesday, February 10, 2010 - link

    Depending on the scenario, the discrete GPU may use less power than the IGP... i.e., say a discrete GPU working at 10% vs. an IGP working at 90%.

    Kind of like how using a lower gear on inclines uses less fuel than a higher gear going the same speed, since the engine works less hard... the software should automatically do the math.
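The tradeoff kpxgq describes is easy to sketch with a toy model. The wattage figures below are purely hypothetical (not measurements of any real GPU), and the model assumes power scales roughly linearly between idle and full load:

```python
# Toy model of the dGPU-at-low-load vs. IGP-at-high-load tradeoff.
# All wattage figures are hypothetical, for illustration only.

def package_power(idle_w, load_w, utilization):
    """Crude linear interpolation between idle and full-load power."""
    return idle_w + (load_w - idle_w) * utilization

igp_at_90 = package_power(idle_w=1.0, load_w=5.0, utilization=0.90)    # ~4.6 W
dgpu_at_10 = package_power(idle_w=3.0, load_w=14.0, utilization=0.10)  # ~4.1 W
print(f"IGP at 90%: {igp_at_90:.1f} W, dGPU at 10%: {dgpu_at_10:.1f} W")
```

With these made-up numbers the lightly loaded dGPU does come out ahead; with a different idle floor it wouldn't. That's exactly the kind of per-workload math a driver profile would have to get right.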
