Closing Thoughts

This is the first laptop with AMD switchable graphics that I’ve had a chance to actually review, and it hasn’t impressed me as much as I would have liked. I can’t say I’m really surprised; I’ve been trying to get my hands on such a laptop since the HP Envy 14 launched, and if the technology worked perfectly and could match NVIDIA’s Optimus, I imagine AMD and/or their partners would have been pushing it into reviewers’ hands a lot more. Regardless of the delays in getting a test sample, we’ve finally had a chance to test AMD’s Dynamic Switchable Graphics, and we can tell you where it stands…sort of. Let’s recap.

AMD’s dynamic switching is fine when it works, but in our testing it failed far too often. Mostly, it feels like the technology needs more development and testing; given sufficient resources and time, all of the issues I experienced on the VAIO C could be resolved. Long-term, AMD needs many more games to get explicit support: out of the sixteen titles we tested (not even counting OpenGL games), four had some sort of problem with dynamic switching. On an NVIDIA Optimus laptop, every single game worked without any tweaking necessary. That’s what AMD needs to achieve at this point, preferably without any performance compromises.

The bigger issue of course is that AMD needs to get their laptop partners—Sony in this case—to release regular driver updates, and to use up-to-date driver builds when laptops launch. For all we know, many of the issues have been addressed in the months since the February build; instead, Sony ships a driver posted in June 2011 whose version number suggests it was already over four months old when the VAIO VPCCA290X launched. We found that the Llano A8-3500M with the HD 6630M outperformed the VAIO CA, which simply shouldn’t happen (unless AMD has special optimizations on Llano that allow their GPUs to run faster). It looks like Sony has given up at least 10% of the HD 6630M’s performance potential on average, and in some games the outdated drivers may be costing a third of that potential.
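To put the driver deficit in concrete terms, here’s a minimal sketch of the math; the frame rates below are hypothetical placeholders, not our actual benchmark numbers:

```python
# Hypothetical frame rates for a single game; placeholder values,
# not measured results from this review.
llano_fps = 30.0  # A8-3500M + HD 6630M, newer driver build
vaio_fps = 20.0   # VAIO CA + HD 6630M, February driver build

deficit = 1 - vaio_fps / llano_fps
print(f"Performance lost to outdated drivers: {deficit:.0%}")  # 33%
```

With numbers like these, the VAIO CA would be leaving a third of the 6630M’s potential on the table, which is roughly what we saw in the worst cases.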

That’s really my main concern, if you haven’t noticed: drivers. If you have an AMD IGP and an AMD GPU (e.g. Llano), some of this discussion becomes unnecessary; since AMD provides both graphics drivers in that case, updates should be a lot easier, although OEMs would still need to sanction the reference drivers. If an OEM were willing to commit the resources necessary to do at least bi-monthly driver updates for switchable graphics, that would also suffice, but they’d need a proven track record of doing so—something no laptop manufacturer has ever established. Another alternative is for AMD to get the OEMs on board with letting AMD release reference drivers, including for switchable graphics platforms on Intel chipsets, but no one has managed that either and I don’t see things changing. As noted earlier, AMD already has plans in place to move to fully independent graphics drivers, hopefully some time in 2012, but best-case we’re four months away and worst-case it might not happen in 2012 at all.

That’s another part of the problem with AMD’s drivers, unfortunately: they currently have people working on Brazos, Llano, the upcoming Trinity, existing desktop and notebook graphics, the HD 7000 series, and switchable graphics (plus other projects I’m sure I’ve missed). I doubt that fixing their Dynamic Switchable Graphics drivers will take priority over getting HD 7000 and Trinity drivers ready, and AMD could probably use more people working on improved compiler support for Bulldozer while we’re at it. In other words, there are a lot of areas in AMD software development that need people, and it’s unknown how many are dedicated to Dynamic Switchable Graphics. NVIDIA’s Optimus Technology currently enjoys a healthy lead in dynamic switchable graphics, AMD is playing catch-up, and I’m not sure they’re ready to commit the manpower required to close the gap. It’s hardly a surprise, then, that while more than 100 Optimus-enabled laptops have launched in the past 18 months, only a few laptops ship with AMD’s Dynamic Switchable Graphics—and only a dozen or so use any form of AMD switchable graphics, to my knowledge.

To be fair, let me also point out that NVIDIA’s Optimus Technology didn’t launch and immediately work with everything. Taken from that perspective, AMD’s Dynamic Switchable Graphics is about 18 months behind NVIDIA, and hopefully we’ll see the technology mature and improve over the next year. We look forward to the day when the compatibility problems we experienced are largely addressed, and we can all get back to using our computers rather than wrestling with them. (Wake me when that happens!) AMD has the groundwork laid at least, so whether it takes six months or 18 months, at some point we should be able to get updated drivers for our AMD, Intel, and NVIDIA graphics solutions without worrying about breaking certain features. If that sounds like a pipe dream, just ask some of us old timers about the joys of DOS drivers, loading high, and EMM386.SYS.

As far as the VAIO CA goes, Sony makes a decent laptop—we’ll give it a separate review shortly. The VAIO CA isn’t at quite the same level in terms of build quality and materials as the VAIO SB Dustin reviewed, but it provides reasonable performance and some of the best battery life results we’ve seen for the specs. If nothing else, Sony knows how to tune their laptops for long battery life. Pricing is where things get dicey: there are some $800 VAIO C models with the same specs as our review unit, but it’s not clear whether they include the HD 6630M—we’re guessing not. Going straight to Sony, you can configure the VPCCA290X with all the options of our review unit at a not-too-onerous $930. Where Sony completely fails, however, is driver support; shipping a laptop with discrete graphics using drivers that were at least four months old at launch is tantamount to telling your customers they don’t need the graphics card at all! Then again, we’ve said the same thing about the HD 6470M, and several companies (including Sony) are still using it.

Ironically, I’d rather have something like the HP Envy 14 or the Sony VAIO C without AMD’s switchable graphics at all: just give me a discrete GPU instead. That would make getting updated video drivers easier, and battery life wouldn’t even suffer all that much. AMD is actually a bit closer than NVIDIA to my ideal of not needing switchable graphics, as their mainstream GPUs tend to use less power. The HD 6630M still draws about 2.0~2.3W more than Intel’s HD 3000 IGP under low loads and 2.8W more during H.264 decoding, but that compares to around 4.3W more for a GT 540M doing H.264 decoding. (We can’t test idle power draw of the GT 540M since Optimus simply shuts the dGPU off—not that that’s a bad thing.) Of course, the 6630M at ~8.3W idle is still using nearly 40% more total power than the IGP alone (6.0W), but you can still get over six hours of battery life.
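For the curious, the battery math is straightforward. Here’s a minimal sketch using the idle figures above; the 49Wh battery capacity is an assumed round number for illustration, not Sony’s official spec:

```python
# Idle power draw from our measurements; the battery capacity below is
# an assumption for illustration, not the VAIO C's rated capacity.
igp_idle_w = 6.0    # Intel HD 3000 IGP only
dgpu_idle_w = 8.3   # HD 6630M active
battery_wh = 49.0   # assumed pack capacity

overhead = dgpu_idle_w / igp_idle_w - 1
print(f"dGPU idle power overhead: {overhead:.0%}")                      # ~38%
print(f"Idle runtime, IGP only: {battery_wh / igp_idle_w:.1f} hours")   # ~8.2
print(f"Idle runtime, dGPU on:  {battery_wh / dgpu_idle_w:.1f} hours")  # ~5.9
```

Even with the dGPU permanently active, the assumed pack still yields roughly six hours at idle, which lines up with the battery life we measured.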

At $930 for the Sony VAIO C, you’re within $150 of better-built laptops with nice 1080p displays—e.g. the Dell XPS 15z—or you could wait for the XPS 14z to show up and see what it offers. If you don’t mind the CPU throttling (or at least running games while using ThrottleStop), you can also grab the Acer TimelineX 3830TG-6431 we used as a comparison point in this article for just $700. The MSI X460DX-008US for $727 (with an i3-2310M and a 14” LCD), the 15.6” Acer Aspire AS5750G-6496 for $680, or the 15.6” Gigabyte Q2532N-CF1 for $885 are also available for less than the Sony—and that’s just naming a few of the Optimus laptops with Sandy Bridge CPUs and GeForce GT 540M GPUs. In short, you can pick any laptop equipped with NVIDIA’s Optimus Technology and get the improved battery life that running off the IGP affords, along with readily available driver updates, frequently at a lower price. Unless you absolutely don’t care about driver updates—or about a UI that lacks “expert” features like game/application profiles and global settings—NVIDIA is definitely the way to go for dynamic switchable graphics right now.

Just to wrap things up: obviously I’m just one person testing these things, and we have a lot of readers. It’s also been frustrating trying to get a laptop with AMD Switchable Graphics in for testing, and while NVIDIA shipped us a Sony VAIO C, it’s possible that other laptops out there (HP dv6/dv7 and Envy 15/17) have better driver support. Hopefully we’ll be able to get one (or more) of those in for testing, at which point we can revisit this subject; until then, I’d love to hear your thoughts and input on switchable graphics and compatibility. What problems/glitches have you run into with Optimus that I might have missed? Outside of the lack of Linux support, are there any major issues with Optimus that you’d like addressed? The same goes for the AMD side: what other titles are having issues on any of the dynamic switchable graphics laptops? How have laptops with AMD switchable graphics fared in terms of driver updates over time? Does AMD’s switchable technology work any better under Linux than NVIDIA’s Optimus? If you can provide specific complaints/concerns with details on how to reproduce the problem(s) on either platform, please sound off in the comments section, or shoot me an email.

Comments

  • JarredWalton - Tuesday, September 20, 2011

    I can't see any drivers for that laptop other than the original Sept. 2010 driver and the new Sept. 2011 driver. Got a link to the other previous drivers to confirm? Anyway, AMD says they make a driver build available for switchable graphics on a monthly basis, so it sounds like Acer is actually using that. Kudos to them if that's the case.
  • overseer - Tuesday, September 20, 2011

    Perhaps Acer just omitted older versions of drivers they deemed unnecessary. My 4745G was manufactured in Jan. 2010 and the initial CCC was dated late 2009. I can recall taking two updates: one in summer 2010 (Catalyst 10.3?) and one last week (the latest one). So it's safe to say there have been at least four traceable versions of AMD GPU drivers for my model.

    While I can't really convince you that it's a bi-monthly or quarterly update cycle from Acer with the limited evidence, this OEM nonetheless has been keeping an eye on new graphics drivers - something I'd never expected in the first place as an early adopter of AMD switchable.
  • MJEvans - Tuesday, September 20, 2011

    You toyed with several ideas for power-throttling graphics chips. The obvious ones, like turning off cores and working at a different point of the voltage/frequency curve to slash power use, are solid.

    Where things turn silly is suggesting the use of only 64 of 256 bits of memory interface. This simply won't work, for a few reasons. However, let's assume that for price and performance reasons four 64-bit chips had been selected. Probably the application assets would be littered across the four slabs of memory, either for wider page accesses to the same content (faster transfer) or for parallel access to unrelated content (say, for two isolated tasks). In any event, the cost of consolidating them, both in time and energy expended for the moves, would only be worthwhile if it were for a long duration.

    Instead a better approach would be to follow a similar voltage/frequency curve for medium power modes. For low power modes the obvious solution is to have core assets on one bank of memory and use any other enabled banks as disposable framebuffers. This would allow you to operate at lower clock speeds without impacting performance. Further, if the display output were disabled you would then be able to de-activate all but the asset bank of memory.

    Probably a patent thicket of some kind exists for this stuff, but really I consider these to be things that must be obvious to someone skilled in the art, or even to anyone with basic logic and knowledge of the physics driving current transistors; my college degree is getting stale and I've never been employed in this field.
  • Old_Fogie_Late_Bloomer - Tuesday, September 20, 2011

    This might not be feasible, but perhaps AMD and/or Nvidia could do something like what Nvidia is doing with Kal-El: have low-power circuitry that can do the bare minimum of what's needed for Windows Aero, etc. (basically what the integrated graphics chip does in Optimus) and switch it out for the more powerful circuitry when needed. As with Kal-El, this switch could be more or less invisible to Windows, or it could be handled at the driver level.

    Of course, that basically wastes the cost of the unused integrated graphics. Perhaps AMD's APUs could take better advantage of this idea: basically, put a CPU and two GPUs on one die, flipping between the slow, power-sipping graphics and the fast, powerful graphics.
  • MJEvans - Tuesday, September 20, 2011

    Actually the Kal-El article explained the key point rather well. The two common options are high speed but a high cost of simply being 'on', versus lower speed but more efficiency per operation at the same clock speed. Given the highly parallel nature of a graphics solution, it makes much more sense to keep the parts that are operating running at the faster speed and switch off more of the ones that aren't needed at all. The main barrier to doing that effectively would be the granularity of the blocks of units involved; however, if development is occurring with this in mind it might be economically viable. That's a question that would require actual industry experience and knowledge of current design trade-offs to answer.
  • Old_Fogie_Late_Bloomer - Wednesday, September 21, 2011

    Well, the thing I got from the Kal-El article that was really interesting to me, and which I think COULD be relevant to mobile GPU applications, is that this other form of transistor—low-leakage, but unable to run faster than a certain speed (half a GHz or so)—is sufficiently more efficient in terms of power usage that Nvidia engineers felt the additional cost was worth it, both in silicon area and in the increased manufacturing cost per part, which of course trickles down to the price of the tablet or whatever it's used in. That sounds to me like they feel pretty confident about the idea.

    That being said, I have not the slightest clue what kind of transistors are used in current mobile chips. It might be that GPU manufacturers are already reaping the benefits of low-leakage transistors, in which case there might not be anywhere left to go. If they are not, however, why not have just enough low-power cores to run Windows 8's flashy graphical effects, and then switch over to the more powerful, higher-clocked circuitry for gaming or GPGPU applications? I don't know how much it would cost the consumer, but I'm betting someone would pay $25-$50 more for something that "just works."
  • MJEvans - Friday, September 23, 2011

    It seems that either your comment missed my primary point or I didn't state it clearly enough.

    Likely the engineering trade-off favors powering just a single graphics core (out of the hundreds even mainstream systems now have, relative to merely 4 vastly more complex CPU cores on 'mid-high' end systems) rather than increasing hardware and software complexity by adding an entirely different manufacturing technology and tying up valuable die area, which could be used for other things, with a custom low-power version of a unit.

    I find it much more likely that normal use states favor these scenarios:
    a) Display is off,
    a.a) entire gpu is off (Ram might be trickle refreshing).
    b) Display is frozen,
    b.a) entire gpu is off (Ram might be trickle refreshing).
    c) Display is on,
    c.a) gpu has one core active at medium or higher speed
    (This would not be /as/ efficient as an ultra-low power core or two, but I seriously have to wonder if they would even be sufficient; or what fraction of a single core is required for 'basic' features these days).
    c.b) mid-power; some cores are active, possibly dynamically scaled based on load (similar to CPU frequency governing but a bit more basic here)
    c.c) full power; everything is on and cycles are not even wasted on profiling (this would be a special state requested by intensive games/applications).
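    A minimal sketch of the state selection outlined above; the state names, load threshold, and inputs are hypothetical, not from any real driver:

    ```python
    # Illustrative only: a toy governor for the power states listed above.
    def select_gpu_state(display_on, display_frozen, load, wants_full_power):
        if not display_on or display_frozen:
            return "gpu_off"          # (a)/(b): GPU off, RAM may trickle-refresh
        if wants_full_power:
            return "full_power"       # (c.c): everything on, no profiling overhead
        if load < 0.10:               # hypothetical threshold
            return "one_core_medium"  # (c.a): one core at medium or higher clocks
        return "scaled_cores"         # (c.b): active cores scaled with load

    print(select_gpu_state(True, False, 0.05, False))  # -> one_core_medium
    ```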
  • danjw - Tuesday, September 20, 2011

    When I worked for a small game developer, getting ATI to give you the time of day was pretty much impossible, whereas Nvidia was more than happy to help us out with some free graphics cards and support. If AMD continues on this path, they will never be real competition for Nvidia.
  • fynamo - Tuesday, September 20, 2011

    The WHOLE POINT of having switchable graphics is to reduce power consumption and thereby extend battery life, and at the same time provide any necessary 2D acceleration capabilities for the OS and Web browsing.

    I'm disappointed that this review totally misses the mark.

    I've been testing numerous Optimus configurations myself lately and have discovered some SERIOUS issues with [at least the Optimus version of] switchable graphics technology: Web browsing.

    Web browsers today increasingly accelerate CSS3, SVG and Flash; however, GPUs have yet to catch up to this trend. As a result, rendering performance is abysmal on a Dell XPS 15 with an Intel i7-2720QM CPU + NVIDIA GeForce GT 540M, and not just with web acceleration: changing window sizes and basic desktop/Aero UI stuff is like a slideshow. I upgraded from a Dell XPS 16 with the Radeon 3670, and the overall experience has been reduced from a liquid-smooth Windows environment to a slideshow.

    Granted, gaming performance is not bad but that's not the issue.

    Running the latest drivers for everything.

    I was hoping to see this topic researched better in this article.
  • JarredWalton - Tuesday, September 20, 2011

    There's really not much to say in regards to power and battery life, assuming the switchable graphics works right. When the dGPU isn't needed, both AMD and NVIDIA let the IGP do all the work, so then we're looking at Sony vs. Acer using Intel i5-2410M. BIOS and power optimizations come into play, but the two are close enough that it doesn't make much of a difference. (I posted the battery life results above if you're interested, and I still plan on doing a full review of the individual laptops.)

    I'm curious what sites/tests/content you're using that create problems with CSS3/SVG/Flash. My experience running on just the Intel IGP has been fine for everything outside of games, though admittedly I might not be pushing things too hard in my regular use. Even so, the latest NVIDIA drivers should allow you to run your browser on the dGPU if you need to—have you tried that? Maybe your global setting defaults the laptop to the IGP everywhere, which might cause some issues. But like I said, specific sites and interactions that cause the sluggishness would be useful, since otherwise I can't rule out other software interactions as the culprit.
