NVIDIA Optimus - Truly Seamless Switchable Graphics and ASUS UL50Vf
by Jarred Walton on February 9, 2010 9:00 AM EST

ASUS UL50Vf Overview
The ASUS UL50Vf is essentially the Optimus version of the UL50Vt, and the UL50Vt is the 15.6" version of the UL80Vt we liked so much. To be honest, we are a lot more interested in the ASUS UL30Jc—a 13.3" Optimus CULV laptop with an optical drive (some models will even ship with Blu-ray support in the near future). Here are the specifications for the UL50Vf.
| ASUS UL50Vf Specifications | |
| --- | --- |
| Processor | Intel Core 2 Duo SU7300 (2x1.3GHz, 45nm, 3MB L2, 800MHz FSB, 10W); overclockable to 1.73GHz/1066MHz FSB (Turbo33) |
| Chipset | Intel GS45 + ICH9M |
| Memory | 2x2GB DDR3-1066 (max 2x4GB) |
| Graphics | NVIDIA GeForce G210M 512MB (16 SPs, 606/1468/1580MHz core/shader/RAM); Intel GMA 4500MHD IGP; switchable NVIDIA Optimus Graphics |
| Display | 15.6" glossy LED-backlit, 16:9, 1366x768 |
| Hard Drive(s) | 320GB 5400RPM HDD |
| Optical Drive | 8x DVDR SuperMulti |
| Networking | Gigabit Ethernet; Atheros AR9285 802.11n |
| Audio | HD Audio (two stereo speakers, two audio jacks) |
| Battery | 8-cell, 15V, 5600mAh, 84Wh; "Up to 12 Hours" |
| Front Side | None |
| Left Side | Headphone/microphone jacks; 2 x USB; HDMI; flash reader (MMC/MS/MS Pro/SD); cooling exhaust; AC power connection |
| Right Side | 1 x USB 2.0; optical drive (DVDRW); Gigabit Ethernet; VGA; Kensington lock |
| Back Side | None |
| Operating System | Windows 7 Home Premium 64-bit |
| Dimensions | 15.4" x 10.4" x 1.05" (WxDxH) |
| Weight | 5.2 lbs (with 8-cell battery) |
| Extras | Webcam; 103-key keyboard with 10-key; flash reader (MMC/MS/MS Pro/SD); multi-touch touchpad; brushed aluminum cover; ExpressGate OS (8-second boot) |
| Warranty | 2-year global warranty; 1-year battery pack warranty; 1-year accidental damage; 30-day zero bright dot LCD |
| Pricing | $800 MSRP |
Obviously, some changes to the motherboard were necessary to support Optimus. Specifically, ASUS was able to remove the multiplexers and extra signal routing from the UL50Vt design. Those changes are all internal, however, and you can't see any difference looking at the exterior. Specifications remain the same as the UL50Vt/UL80Vt, and performance is virtually the same as the UL80Vt we tested. (There will be some minor differences due to the change in LCD size and the use of different drivers, but that's about it.)
Pretty much everything we had to say about the UL80Vt applies to the UL50Vf. The features are great, and Optimus makes them even better. You can overclock the CPU by 33% to improve performance, or you can run the CULV processor at stock speed and improve battery life. Unlike Optimus, changing the CPU speed doesn't happen on-the-fly (unfortunately), but it is a little easier than what we experienced with the UL80Vt. This time, instead of requiring a full system reboot, enabling/disabling Turbo33 only requires the system to enter suspend mode. In that sense, Turbo33 is sort of like second-generation switchable graphics: it requires manual user intervention and takes 10 to 15 seconds to change modes. Ideally, we would like to be able to toggle the overclock without suspending, and better still would be the option to enable overclocking on AC power and disable it on DC power.
The UL50Vf carries over the aluminum cover on the LCD lid along with the glossy interior plastic and LCD. It also uses the same 1366x768 LCD resolution. Considering the larger chassis, we feel ASUS would have been better off increasing the LCD resolution slightly (1440x900 or 1600x900 would have been good), and we would have also appreciated a faster dGPU. With Optimus allowing the GPU to switch on/off as needed and a 15.6" chassis, we feel ASUS should have been able to get something like the GT 335/325M into the UL50Vf. After all, Alienware is managing to cram similar hardware into an 11.6" chassis with the M11x!
Before we get to the benchmarks, we should mention a few minor glitches we encountered during testing. First, we couldn't get x264 decode acceleration to work on the dGPU using Media Player Classic - Home Cinema. We could set the application to load on the discrete graphics, but MPC-HC apparently didn't know how to talk to the Optimus GPU and ended up running off the IGP. Since the GMA 4500MHD was more than capable of handling our 720p and 1080p x264 files, we're not too concerned with this issue. Another glitch is that CPU-Z refused to work; it would hang at the graphics detection stage. This isn't so much a problem with Optimus as a need for changes to CPU-Z—and very likely some other low-level tools that talk directly to the graphics hardware. (We didn't try any overclocking or tweaking of the GPU on the UL50Vf, but we suspect it might be a bit trickier than normal.)
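As an aside on the MPC-HC behavior: NVIDIA later documented an application-side hint that lets a program request the discrete GPU on Optimus systems even when the driver has no profile for it. The following is only a minimal sketch of that mechanism, assuming a Windows/MSVC build; it is not something MPC-HC did at the time of this review.

```cpp
// Hedged sketch (not from the article): exporting this global from an .exe is
// the hint NVIDIA later documented for Optimus systems. When the Optimus
// driver sees it, the process is routed to the discrete GeForce GPU rather
// than the Intel IGP. Windows/MSVC-specific; it has no effect elsewhere.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

int main() {
    // Create the D3D/OpenGL device as usual here; with the export above,
    // rendering lands on the GeForce GPU and the Copy Engine moves the
    // finished frames over to the IGP-driven display.
    return 0;
}
```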
Finally, when using the dGPU and playing games, we periodically noticed a slight glitch where the screen would flicker black for a frame. We couldn't come up with a repeatable test case, but it seems the problem may be related to the Copy Engine transferring incorrect data. It wasn't limited to any one title, though it occurred most frequently during our Empire: Total War testing (usually at least once every 60 seconds). It would hardly be surprising to find a few bugs in the NVIDIA drivers, and most likely this is one of them. We didn't find the occasional "flicker" to be a serious issue, and at present we don't have enough information to say what might be causing it. We'll do some additional testing to see whether the problem is limited to specific games or shows up everywhere.
We've run an abbreviated set of tests with the UL50Vf. As mentioned, performance is virtually identical to the UL80Vt, the primary difference being the ability to immediately switch between discrete and integrated graphics as necessary. We will highlight both the old UL80Vt and the UL50Vf in our charts for comparison; you can see additional performance results for the UL80Vt in our previous review. All tests were conducted with the default graphics settings, so the discrete GPU is used when Optimus deems it beneficial and the IGP is used in all other cases. The gaming and general performance tests are run with Turbo33 engaged (33% CPU overclock) while battery testing was conducted at stock CPU speed.
Comments
Hrel - Tuesday, February 9, 2010
Now that I've calmed down a little I should add that I'm not buying ANY GPU that doesn't support DX11 EVER again. We've moved past that; DX11 is necessary; no exceptions.

JarredWalton - Tuesday, February 9, 2010
I'm hoping NVIDIA calls me in for a sooper seekrit meeting some time in the next month or two, but right now they're not talking. They're definitely due for a new architecture, but the real question is going to be what they put together. Will the next gen be DX11? (It really has to be at this point.) Will it be a tweaked version of Fermi (i.e. cut Fermi down to a reasonable number of SPs), or will they tack DX11 functionality onto current designs?

On a different note, I still wish we could get upgradeable notebook graphics, but that's probably a pipe dream. Consider: NVIDIA makes a new mGPU that they can sell to an OEM for $150 or something. OEM can turn that into an MXM module, do some testing and validation on "old laptops", and sell it to a customer for $300 (maybe even more--I swear the markup on mobile GPUs is HUGE!). Or, the OEM could just tell the customer, "Time for an upgrade" and sell them a new $1500 gaming laptop. Do we even need to guess which route they choose? Grrr....
Hrel - Tuesday, February 9, 2010
It STILL doesn't have a screen with a resolution of AT LEAST 1600x900!!! Seriously!? What do I need to do? Get up on roof tops and scream from the top of my lungs? Cause I'm almost to that point. GIVE ME USEABLE SCREENS!!!!!!!

MrSpadge - Wednesday, February 10, 2010
Not everyone's eyes are as good as yours. When I asked some 40+ people if I got the location right and showed it to them via Google Maps on my HTC Touch Diamond, they refused to even think about it without their glasses.

strikeback03 - Thursday, February 11, 2010
I've never had people complain about using Google Maps on my Diamond. Reading text messages and such, yes, and for a lot of people forget about using the internet since they have to zoom the browser so far in, but the maps work fine.

GoodRevrnd - Tuesday, February 9, 2010
Any chance you could add the MacBook / Pro to the LCD quality graphs when you do these comparisons?

JarredWalton - Tuesday, February 9, 2010
Tell Anand to send me a MacBook for testing. :-) (I think he may have the necessary tools now to run the tests, but so far I haven't seen any results from his end.)

MrSpadge - Tuesday, February 9, 2010
Consider this: Fermi and the following high-end chips are going to be beasts, but they might accelerate scientific / engineering apps tremendously. But if I put one into my workstation it's going to suck power even when not in use. It's generating noise, it's heating the room and making the air stuffy. This could easily be avoided with Optimus! It's just that someone has to ditch the old concept of "desktops don't need power saving" even more. 20 W for an idle GPU is not OK.

And there's more: if I run GP-GPU the screen refresh often becomes sluggish (see BOINC etc.) or the app doesn't run at full potential. With Optimus I could have a high performance card crunch along, either at work or BOINC or whatever, and still get a responsive desktop from an IGP!
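A minimal sketch of the compute/display split described above, assuming the CUDA runtime API: enumerate the available devices and prefer one that isn't subject to the OS display watchdog, so long-running kernels don't stall the desktop. The helper function below is hypothetical, not code from BOINC or NVIDIA.

```cpp
// Hypothetical sketch: pick a CUDA device for background compute that is not
// bound by a display watchdog, leaving the display GPU (or IGP) responsive.
#include <cuda_runtime.h>
#include <cstdio>

int pick_compute_device() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0)
        return -1;  // no CUDA-capable GPU present

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // kernelExecTimeoutEnabled is set when the OS watchdog applies,
        // which usually means this GPU is also driving a display.
        if (!prop.kernelExecTimeoutEnabled) {
            cudaSetDevice(i);
            std::printf("Crunching on %s\n", prop.name);
            return i;
        }
    }
    cudaSetDevice(0);  // fall back to the first device
    return 0;
}
```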
Drizzt321 - Tuesday, February 9, 2010
Is there a way to set this to specifically only use the IGP? So turn off the discrete graphics entirely? Like if I'm willing to suffer lower performance but need the extra battery life. I imagine if I could, the UL50Vf could equal the UL80Vt pretty easily in terms of battery life. I'm definitely all for the default being Optimus turned on... but let's say the IGP is more efficient at decoding that 720p or 1080p, yet NVIDIA's profile says gotta fire up the discrete GPU. There goes quite a bit of battery life!

kpxgq - Wednesday, February 10, 2010
Depending on the scenario, the discrete GPU may use less power than the IGP... i.e. say a discrete GPU working at 10% vs. an IGP working at 90%. Kind of like how using a lower gear on inclines uses less fuel than a higher gear going the same speed, since the engine works less hard... the software should automatically do the math.