Optimus, Prime?

"The Autobots called… they want their leader back!" Transformers jokes aside, it's amazing how much the computing landscape can change in such a short amount of time. A few months ago, switchable graphics seemed like a great technology that not enough companies were interested in using. Now, switchable graphics tastes like leftovers from last week. All things being equal, who would want to spend the same amount of money for something that doesn't work nearly as well? Certainly I wouldn't, so once more NVIDIA has put their competition in a tough situation. They did this with the Verde notebook driver program, and now just when AMD looks like they're catching up—DirectX 11 and switchable graphics are starting to appear in a few laptops, and AMD is set to announce their own updated mobile driver program—NVIDIA goes and raises the bar.


As far as the ability to switch quickly between discrete and integrated graphics is concerned, it's difficult to imagine anything that still needs improvement. If you're in the market for a new laptop and you want a discrete GPU, then you definitely want an Optimus laptop. Optimus works well and is generally transparent to the end user; during our testing we didn't encounter any serious problems, and certainly nothing that would make us hesitant to recommend purchasing an Optimus laptop. That doesn't mean there aren't other areas that could be improved, however.

One item that we would still like to see NVIDIA address is their driver release schedule. Specifically, we would love to see desktop and mobile drivers released at the same time. The far bigger concern we have right now is that NVIDIA's mobile GPUs are currently "last gen" technology; ATI has already started to ship Mobility Radeon 5000 GPUs, and you can find laptops using them online quite easily. DX11 is out, NVIDIA is ready to begin shipping DX11 desktop GPUs, and we are still using mobile GPUs that are based on the old G9x architecture. We expect NVIDIA to release their next mobile architecture in the not-too-distant future and it will certainly include Optimus, but we don't know when exactly this will happen and we don't know how fast the next-generation parts are going to be. If the new parts are DX11 and they launch sooner rather than later, the initial Optimus laptops are going to have a very short shelf life.

NVIDIA is pitching Optimus as a solution that will be available top to bottom. Whether you want a netbook, a laptop, or a high-end gaming notebook, there will be some form of Optimus available. We remain skeptical about the market for netbooks with discrete graphics; ION made sense as an IGP, but a next-generation ION as a discrete GPU paired with Intel's IGP is going to be a tough sell. After all, Atom is already a serious bottleneck for a 9400M, so anything faster goes to waste. Since netbooks are aimed primarily at low-cost markets, increasing manufacturing costs by $50 or even $25 is going to be difficult. The best way to sell such a product would be to make sure all the other features are up to snuff; give us great build quality, a high contrast LCD panel, and a keyboard and touchpad that don't suck and we'd at least be willing to pay more than $300.

Another area where Optimus may not be necessary is video decoding. Intel's IGPs have been the whipping boy of graphics hardware for some time, but while they certainly aren't a great gaming solution the GMA 4500MHD generally works well for video decoding and the new HD Graphics improves the situation quite a bit. If Intel can get Flash 10.1 decode acceleration to work, the only people that will need discrete GPUs are gamers and anyone using CUDA apps (and DirectCompute when it starts to gain traction). Do you spend $600 for a decent CULV laptop or $800 for an Optimus laptop where the major advantage is gaming performance? There are going to be plenty of users that don't want to spend the extra money.

Update: There were a few questions raised by readers, and I have asked NVIDIA for comment. I'm putting this here as both are technically negatives, although the likelihood of either being a major concern is quite small—they will only affect a small subset of users. First, if you want to drive an external LCD at 120Hz you will need a laptop with dual-link DVI (or equivalent), and that's not likely to be present on GS45-based implementations like the UL50Vf. 3D Vision isn't likely to get support initially either, though you would want something faster than the G210M for that regardless. Some of you also asked about Linux support for Optimus, and if your usage plans fall into that category we have bad news: at present, NVIDIA is focusing on Windows 7 for Optimus support. We could see Linux support (and perhaps OS X as well) down the road, but right now it doesn't exist. Since Optimus involves a complex software element, the dGPU will not work under Linux at present; what we don't know is whether an Optimus laptop will simply run off the IGP exclusively or fail outright, so we're going to check and post the results.

Update #2: Sorry this follow-up took so long, but I did test what happens with Optimus laptops and Linux. Even using an older 9.04 installation of Ubuntu, everything works as expected... which is to say, you can run Linux but you won't get the Optimus GPU. Since no display is directly connected to the discrete GPU, Optimus is required to move rendered content from the G210M to the GMA 4500MHD display buffer. As noted above, NVIDIA is currently focusing their efforts on making Optimus work on Windows 7; it's possible they will move to other OSes down the road, but they are not committed to doing so. If you want to run Linux and you want to use a discrete GPU, Optimus Technology won't work. Perhaps some skilled Linux community members can figure out a way to do Optimus-like technology on their own, but given the level of detail required to interface with the GPU and IGP we see that as unlikely.

As long as you're in the market for a laptop with a discrete GPU, however, Optimus is definitely where it's at. When we first tried switchable graphics a couple years ago, we said it was an awesome idea and recommended all future laptops try to include the technology. Apparently, manufacturers weren't interested. We liked the improved implementations even more last year, and yet plenty of Core i3/i5 laptops are shipping without switchable graphics. Ugh. With Optimus, there's no excuse left for skipping out on switchable graphics. It's seamless, it doesn't add to the cost or R&D efforts (other than a small fee to NVIDIA that can easily be absorbed into the GPU cost), and even technophobes can benefit—whether they know their laptop is switching between GPU modes or not. So we say again: no NVIDIA-based laptop going forward should ship without Optimus. In fact, if a manufacturer has a design in the works that's almost ready for market and it uses an NVIDIA discrete GPU without Optimus, please do us a favor and rework the product.

If you're interested in the full NVIDIA presentation, here's a complete gallery of the slides.

What about the ASUS UL50Vf?

As far as laptops go, the UL50Vf is a decent design but it's definitely not our favorite product ever. ASUS deserves some praise for being first, and the best part of the laptop is the Optimus technology we just finished praising. However, Optimus is set to launch in numerous other laptops in the near future and we expect some of the upcoming models will surpass the UL50Vf in several areas. Arrandale models will offer improved performance, and we should see 13.3" laptops with the same components as the UL50Vf. We should also see 15.6" laptops with a faster GPU, and Alienware's M11x has a GT335M in an 11.6" chassis—though it appears the M11x uses the older gen2 switchable graphics.

As for the UL50Vf, the keyboard is easy to type on and the construction is decent as well. We just wish ASUS would stop with the glossy black plastic shells, and we also wish the LCD was better. Battery life appears to have dropped somewhat compared to the 14" UL80Vt, but if you want Optimus right now and you're okay with a 15.6" chassis, the ASUS UL50Vf is a good laptop that should serve you well. We're not sure of the MSRP, but we expect it to be the same as the previous-generation UL50Vt—after all, Optimus is supposed to make the design and component costs lower, right? If you're looking for something else, here are a few more ASUS Optimus laptops that might suit you better.

Gallery: ASUS UL30Jc

Right now, the ASUS UL30Jc is the laptop we're really looking forward to testing. It's not a ULV design, but it does use Core i3-350M (potentially i5/i7 in some models), it ditches the glossy black plastic, and it has a 13.3" chassis and LCD. The GPU is the G310M, but performance is about the same as G210M (1530MHz SP clock instead of 1500MHz, and still 16 SPs). Some models will even have Blu-ray support—though the price is likely to be $1000+ in that case. Anyway, feast your eyes on the above pictures and tell me that doesn't look a ton better than the UL50Vf. Brushed aluminum on the palm rest as well as the LCD? I think I've reached Nerdvana. It's too bad they still appear to have a glossy black bezel around the display, but it might not be too late to provide a second model with a matte bezel. How about it, ASUS?
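As a quick sanity check on that "about the same" performance claim, the G310M's only advantage over the G210M is a 2% shader clock bump (a rough sketch using the clocks and SP counts quoted above):

```python
# Back-of-the-envelope comparison of G210M vs. G310M shader throughput.
# Both parts have 16 SPs, so relative performance scales with SP clock alone.
def relative_speedup(old_clock_mhz, new_clock_mhz, old_sps=16, new_sps=16):
    """Return the fractional throughput gain of the new part over the old."""
    return (new_clock_mhz * new_sps) / (old_clock_mhz * old_sps) - 1.0

gain = relative_speedup(1500, 1530)  # G210M -> G310M
print(f"G310M over G210M: {gain:.1%}")  # 2.0%
```

In other words, don't expect the rebadge to change gaming performance in any measurable way.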

Besides the UL30Jc and the UL50Vf, we know ASUS has at least two other Optimus laptops in the works. The N61JV is a 16" chassis available in brown or white, with an Arrandale (Core i3/i5) processor. We're still waiting for full details on which GPU it uses, but hopefully it's at least a G220M or faster (GT335M would be great!). The N61 should arrive shortly and we'll review it as soon as we can. The N82JV is the other Optimus laptop we know ASUS has coming, and it appears to be a 14" model that presumably builds off the UL80Vt design. It's brown instead of glossy black, but we don't know any details of the remaining hardware. We'd love for it to be an Arrandale CULV design, but we'll have to wait a bit longer to find out.

ASUS UL50Vf LCD Quality

49 Comments


  • MasterTactician - Wednesday, February 10, 2010 - link

    But doesn't this solve the problem of an external graphics solution not being able to use the laptop's display? If the external GPU can pass the rendered frames back to the IGP's buffer via PCI-E, then it's problem solved, isn't it? So the real question is: will NVIDIA capitalize on this?
  • Pessimism - Tuesday, February 09, 2010 - link

    Did Nvidia dip into the clearance bin again for chip packaging materials? Will the laptop survive its warranty period unscathed? What of the day after that?
  • HighTech4US - Tuesday, February 09, 2010 - link

    You char-lie droid ditto-heads are really something.

    You live in the past and swallow everything char-lie spews.

    Now go back to your church's web site, semi-inaccurate, and wait for your next gospel from char-lie.
  • Pessimism - Tuesday, February 09, 2010 - link

    So I suppose the $300+ million in charges Nvidia took were merely a good faith gesture for the tech community, with no basis in fact regarding an actual defect.
  • HighTech4US - Tuesday, February 09, 2010 - link

    The $300+ million was in fact a good faith measure to the vendors who bought the products that had the defect. nVidia backed extended warranties and picked up the total cost of repairs.

    The defect has been fixed long ago.

    So your char-lie comment as if it still exists deserves to be called what it is. A char-lie.
  • Pessimism - Tuesday, February 09, 2010 - link

    So you admit that a defect existed. That's more than can be said for several large OEM manufacturers.
  • Visual - Tuesday, February 09, 2010 - link

    Don't get me wrong - I really like the ability to have a long battery life when not doing anything and also have great performance when desired. And if switchable graphics is the way to achieve this, I'm all for it.

    But it seems counter-productive in some ways. If the external GPU was properly designed in the first place, able to shut down power to the unused parts of the processor, supporting low-power profiles, then we'd never have needed switching between two distinct GPUs. Why did that never happen?

    Now that Intel, and eventually AMD too, are integrating a low-power GPU inside the CPU itself, I guess there is no escaping from switchable graphics any more. But I just fail to see why NVidia or ATI couldn't have done it the proper way before.
  • AmdInside - Tuesday, February 09, 2010 - link

    Because it's hard to compete with a company that is giving away integrated graphics for free (Intel) in order to move higher priced CPUs. In a way, AMD is doing the same with ATI: giving us great motherboards with ATI graphics at cheap prices (which in many ways are much better than Intel's much higher priced offerings) in order to get you to buy AMD CPUs.
  • JarredWalton - Tuesday, February 09, 2010 - link

    Don't forget that no matter how many pieces of a GPU go into a deep sleep state (i.e. via power gate transistors), you would still have some extra stuff receiving power: VRAM, for example, plus assorted transistors/resistors. At idle the CULV laptops are down around 8 to 10W; even the WiFi card can suck up 0.5 to 1W and make a pretty substantial difference in battery life. I'd say the best you're likely to see from a discrete GPU is idle power draw that's around 3W over and above what an IGP might need, so a savings of 3W could be a 30% power use reduction.
  • maler23 - Tuesday, February 09, 2010 - link

    I've been waiting for this article ever since it was hinted at in the last CULV roundup. The ASUS laptop is a little disappointing, especially the graphics card situation (the Alienware M11x kind of sucked up a lot of excitement there). Frankly, I'd just take a discounted UL30Vt and deal with manual graphics switching.

    Couple of questions:

    -Any chance for a review of the aforementioned Alienware M11X soon?

    -I've seen a couple of reviews with display quality comparisons, including this one. How do the MacBook and MacBook Pro fit into the rankings?

    cheers!

    -J
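As a footnote to JarredWalton's comment above, the idle-power arithmetic works out as follows (a rough sketch; the wattages are the comment's own estimates, and the 63Wh battery capacity is an assumed figure for a typical 8-cell CULV laptop, not a measurement):

```python
# Quick sanity check of the idle-power argument: ~3W of discrete GPU overhead
# on a ~10W idle load is roughly a 30% difference in power draw, which
# translates directly into battery runtime.
def battery_hours(capacity_wh, draw_watts):
    """Runtime in hours for a given battery capacity and average draw."""
    return capacity_wh / draw_watts

idle_with_dgpu = 10.0  # W, idle draw with the dGPU powered up (estimate)
dgpu_overhead = 3.0    # W, extra idle draw attributed to the dGPU
idle_igp_only = idle_with_dgpu - dgpu_overhead  # 7.0W with the dGPU off

print(f"Power saved by switching off the dGPU: {dgpu_overhead / idle_with_dgpu:.0%}")  # 30%
print(f"Runtime, dGPU idling: {battery_hours(63.0, idle_with_dgpu):.1f} h")  # 6.3 h
print(f"Runtime, IGP only:    {battery_hours(63.0, idle_igp_only):.1f} h")   # 9.0 h
```

Nearly three extra hours of runtime from powering down an idle GPU is exactly why Optimus-style switching matters on CULV machines.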
