NVIDIA Optimus - Truly Seamless Switchable Graphics and ASUS UL50Vf
by Jarred Walton on February 9, 2010 9:00 AM EST

ASUS UL50Vf Overview
The ASUS UL50Vf is essentially the Optimus version of the UL50Vt, and the UL50Vt is the 15.6" version of the UL80Vt we liked so much. To be honest, we are a lot more interested in the ASUS UL30Jc—a 13.3" Optimus CULV laptop with an optical drive (some models will even ship with Blu-ray support in the near future). Here are the specifications for the UL50Vf.
ASUS UL50Vf Specifications
Processor | Intel Core 2 Duo SU7300 (2x1.3GHz, 45nm, 3MB L2, 800FSB, 10W); overclockable to 1.73GHz/1066FSB (Turbo33)
Chipset | Intel GS45 + ICH9M
Memory | 2x2GB DDR3-1066 (max 2x4GB)
Graphics | NVIDIA GeForce G210M 512MB (16 SPs, 606/1468/1580MHz core/shader/RAM) + Intel GMA 4500MHD IGP; switchable via NVIDIA Optimus
Display | 15.6" glossy LED-backlit 16:9, 1366x768
Hard Drive(s) | 320GB 5400RPM HDD
Optical Drive | 8x DVDR SuperMulti
Networking | Gigabit Ethernet; Atheros AR9285 802.11n
Audio | HD Audio (2 stereo speakers with two audio jacks)
Battery | 8-cell, 15V, 5600mAh, 84Wh ("Up to 12 Hours")
Front Side | None
Left Side | Headphone/microphone jacks, 2x USB 2.0, HDMI, flash reader (MMC/MS/MS Pro/SD), cooling exhaust, AC power connection
Right Side | 1x USB 2.0, optical drive (DVDRW), Gigabit Ethernet, VGA, Kensington lock
Back Side | None
Operating System | Windows 7 Home Premium 64-bit
Dimensions | 15.4" x 10.4" x 1.05" (WxDxH)
Weight | 5.2 lbs (with 8-cell battery)
Extras | Webcam, 103-key keyboard with 10-key, flash reader (MMC/MS/MS Pro/SD), multi-touch touchpad, brushed aluminum cover, ExpressGate OS (8-second boot)
Warranty | 2-year global warranty, 1-year battery pack warranty, 1-year accidental damage, 30-day zero bright dot LCD
Pricing | $800 MSRP
Obviously, there were some changes to the motherboard in order to work with Optimus. Specifically, ASUS was able to remove the multiplexers and extra signal routing from the UL50Vt design. Those changes are all internal, however; you can't see any difference looking at the exterior. Specifications remain the same as the UL50Vt/UL80Vt, and performance is virtually the same as the UL80Vt we tested. (There will be some minor differences due to the change in LCD size and the use of different drivers, but that's about it.)
Pretty much everything we had to say about the UL80Vt applies to the UL50Vf. The features are great, and Optimus makes them even better. You can overclock the CPU by 33% to improve performance, or you can run the CULV processor at stock speed and improve battery life. Unlike Optimus, changing the CPU speed doesn't happen on-the-fly (unfortunately), but it is a little easier than what we experienced with the UL80Vt. This time, instead of requiring a full system reboot, enabling/disabling Turbo33 only requires the system to enter suspend mode. In that sense, Turbo33 is sort of like second-generation switchable graphics: it requires manual user intervention and takes 10 to 15 seconds to shift modes. Ideally, we would like to be able to switch the overclock without suspending, and even better would be the option to automatically enable overclocking on AC power and disable it on DC power.
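As an aside, the plumbing for that sort of automatic AC/DC switching already exists in Windows. Here's a minimal, hypothetical sketch (not something ASUS' Turbo33 utility actually exposes) showing how software can detect the power source with the Win32 GetSystemPowerStatus call:

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    SYSTEM_POWER_STATUS sps;
    if (!GetSystemPowerStatus(&sps)) {
        fprintf(stderr, "GetSystemPowerStatus failed\n");
        return 1;
    }
    // ACLineStatus: 0 = battery (DC), 1 = plugged in (AC), 255 = unknown
    if (sps.ACLineStatus == 1)
        printf("On AC power: an overclock like Turbo33 could be enabled here.\n");
    else
        printf("On battery: run the CPU at stock speed to preserve battery life.\n");
    return 0;
}
```

A utility could poll this status (or listen for the WM_POWERBROADCAST window message) and toggle the overclock accordingly, which is exactly the behavior we'd like to see.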
The UL50Vf carries over the aluminum cover on the LCD lid along with the glossy interior plastic and LCD. It also uses the same 1366x768 LCD resolution. Considering the larger chassis, we feel ASUS would have been better off increasing the LCD resolution slightly (1440x900 or 1600x900 would have been good), and we would have also appreciated a faster dGPU. With Optimus allowing the GPU to switch on/off as needed and a 15.6" chassis, we feel ASUS should have been able to get something like the GT 335/325M into the UL50Vf. After all, Alienware is managing to cram similar hardware into an 11.6" chassis with the M11x!
Before we get to the tests, we should mention a few minor glitches we encountered during testing. First, we couldn't get x264 decode acceleration to work with the dGPU using Media Player Classic - Home Cinema. We could set the application to load on the discrete graphics, but MPC-HC apparently didn't know how to talk to the Optimus GPU and ended up running off the IGP. Since the GMA 4500MHD is more than capable of handling our 720p and 1080p x264 files, we're not too concerned with this issue. Another glitch is that CPU-Z refused to work; it would hang at the graphics detection stage. This isn't so much a problem with Optimus as a need for updates to CPU-Z, and very likely to other low-level tools that talk directly to the graphics hardware. (We didn't try any overclocking or tweaking of the GPU on the UL50Vf, but we suspect it might be a bit trickier than normal.)
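On the subject of applications ending up on the wrong GPU: NVIDIA's developer guidance for Optimus describes an export an application can add to request the discrete GPU, though whether a given driver honors it is version-dependent. Treat the following as a hedged sketch of that mechanism rather than a guaranteed fix for apps like MPC-HC:

```cpp
// Hedged sketch: asking the Optimus driver for the discrete GPU.
// Assumes driver support for the NvOptimusEnablement export from
// NVIDIA's Optimus developer guidance; without it, GPU selection
// falls back to the driver's application profile database.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

int main()
{
    // With the export set to 0x00000001, the NVIDIA driver should route
    // this process's rendering to the discrete GPU rather than the IGP.
    return 0;
}
```

Applications without such an export, MPC-HC among them at the time of testing, depend entirely on the driver's profile database or manual user overrides.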
Finally, when using the dGPU and playing games, we periodically noticed a slight glitch where the screen would flicker black for a frame. We couldn't come up with a repeatable test, but the problem may be related to the Copy Engine transferring incorrect data. It was not limited to any one title, though it occurred most frequently during our Empire: Total War testing (usually at least once every 60 seconds). It would hardly be surprising to find a few bugs in the NVIDIA drivers, and this is most likely one of them. We didn't find the occasional flicker to be a serious issue, and at present we don't have enough information to say what causes it; we'll do additional testing to determine whether it's limited to specific games or affects all of them.
We've run an abbreviated set of tests with the UL50Vf. As mentioned, performance is virtually identical to the UL80Vt, the primary difference being the ability to immediately switch between discrete and integrated graphics as necessary. We will highlight both the old UL80Vt and the UL50Vf in our charts for comparison; you can see additional performance results for the UL80Vt in our previous review. All tests were conducted with the default graphics settings, so the discrete GPU is used when Optimus deems it beneficial and the IGP is used in all other cases. The gaming and general performance tests are run with Turbo33 engaged (33% CPU overclock) while battery testing was conducted at stock CPU speed.
Comments
jfmeister - Tuesday, February 9, 2010
I was anxious to get an M11x, but 2 things were bothering me:
1- No DirectX 11 compatibility
2- No Core i5/i7 platform.
Now there is another reason to wait for the refresh. But with Arrandale prices dropping, DX11 cards available, and Optimus, I would expect Alienware to get on the bandwagon fast with a new M11x platform and not wait 6 to 8 months for a refresh. This ultraportable is intended for gamers, and we all know that gamers are on top of these things. Optimus in the M11x's case should be a must.
BTW, what I find funny is that Optimus looks like a revolution, but what about 3dfx 10 years ago with their 3D card add-on (Monster 3D 8MB ftw)? Switching was used back then... This looks like the same thing except with HD video support! It took that long to come up with that?
JarredWalton - Tuesday, February 9, 2010
Remember that the switching back in the days of 3dfx was just in software, and the 3D GPU was always powered. There was the dongle cable situation as well. So the big deal here isn't just switching to a different GPU, but doing it on-the-fly and powering the GPU on/off virtually instantly. We think this will eventually make its way into desktops, but obviously it's a lot more important for laptops.

StriderGT - Tuesday, February 9, 2010
My take on Optimus: Optimus' roots lie with Hybrid SLI.
Back then it was advertised as an NVIDIA-only chipset feature (NVIDIA IGP + NVIDIA GPU) for both desktops and notebooks.
Currently NVIDIA is being rapidly phased out of x86 PC chipsets, so Optimus is the only way to at least put an NVIDIA GPU in an Intel IGP-based system, but:
1. The only real benefit is gaming performance without sacrificing battery life in notebooks.
2. Higher cost (in the form of the discrete GPU); Intel has 60%+ of the GPU market (i.e., IGPs) because the vast majority don't care, or are uninformed, about gaming performance.
3. CUDA/PhysX are, now and for the foreseeable future, irrelevant for mobile applications (gaming is much more relevant by comparison).
4. Video decoding capabilities are already present in most current IGPs (except Pine Trail netbooks, which can add them with a cheaper dedicated chip).
5. Netbooks will not benefit from Optimus because they lack the CPU horsepower to feed a discrete GPU and are very cost-sensitive... (the same reason ION 1/2 is not the primary choice for netbook builders)
6. In the desktop space, only some niche small form factor PC applications could benefit from such a technology, e.g. an SFF PC would need less cooling/noise during normal (IGP) operation and could become louder and more powerful while gaming (GPU).
7. Idle/2D power consumption of most modern desktop GPUs is already so low that the added complexity of a simultaneously active onboard IGP and the associated software is a no-benefit approach.
8. Driver/application problems might arise from the complexity of profiles and the vastly different application workload scenarios.
So in the end it boils down to how NVIDIA can convince the world that a discrete GPU and its added cost are necessary in every portable device (netbook-sized and up) out there. On the desktop side it will be even more difficult to push such a thing, with only noise reduction in small form factor PCs being of interest.
BTW, at least now manufacturers won't have any more excuses for the lack of a decent GPU in some of the cheaper notebook models ($500-1000) on battery life grounds.
Oh well, I'll keep my hopes low after so much time as a niche market, since they might find some other excuse along the lines of the weight and space required for cooling the GPU during AC operation... :-(
PS Initially posted on yahoo finance forum
Zoomer - Tuesday, February 9, 2010
Not like it was really necessary; the Voodoo 2 used maybe 25W (probably less) and was meant for desktop use.

jfmeister - Tuesday, February 9, 2010
Good point! I guess I did not take the time to think about it. I was more into the concept than the whole technical side that you brought up. Thanks!
JF
cknobman - Tuesday, February 9, 2010
Man, the M11x was the biggest disappointment out there. A weak-sauce last-gen processor on a so-called premium high-end gaming brand? I'll consider it once they get an Arrandale CULV and Optimus, because right now, according to the notebookreview.com forums, it uses manually switched graphics, not Optimus.

crimson117 - Tuesday, February 9, 2010
Which processor should they have used, in your opinion?

cknobman - Tuesday, February 9, 2010
Should have waited another month to market and used the Core i7 ULV processors. There are already a few vendors using this proc (Panasonic is one).

Wolfpup - Tuesday, April 20, 2010
Optimus is impressive software, but personally I don't want it, ever. I don't want Intel graphics on my CPU. I don't want Intel graphics in my memory controller. I don't want Intel graphics. I want my real GPU to be my real GPU, not a helper device that renders something that gets copied over to Intel's graphics. I just do not want this. I don't like having to rely on profiles either; thankfully you can manually add programs, but still.