91 Comments

  • tipoo - Tuesday, September 20, 2011 - link

    "The bigger issue of course is that AMD needs to get their laptop partners—Sony in this case—to release regular driver updates, and to use up-to-date driver builds when laptops launch."

    AMD now lets you get laptop drivers from their own site, and they are always as up to date as the desktop ones. Unless Sony opted out of that for whatever reason?
    Reply
  • OCedHrt - Tuesday, September 20, 2011 - link

    These drivers do not work for switchable graphics. nVidia had the same issue before Optimus. Reply
  • orangpelupa - Wednesday, September 21, 2011 - link

    the driver does work for switchable graphics.

    my acer laptop with switchable intel + HD Mobility 5650 is updateable.

    use the 11-8_mobility_vista_win7_64_dd_ccc.exe,
    not the few-KB .exe auto-detector from ATi. That app is useless.

    but if it fails to install using the main (almost 100MB) .exe, it can usually still be installed using a modded .inf.

    just make sure to switch to dGPU mode before running any driver installation.
    Reply
  • orangpelupa - Wednesday, September 21, 2011 - link

    modded inf and the download link for mobility 11.8 generic ATi
    http://wp.me/pyhfN-m1
    Reply
  • The0ne - Wednesday, September 21, 2011 - link

    I wouldn't say useless, as the full package download refuses to install properly on my M17xR2, and the only way to get the driver working is to use their auto-detector and downloader. For some reason this downloads a slightly different package (the size is smaller, I believe), but it works. Reply
  • mfenn - Tuesday, September 20, 2011 - link

    Did you even read the article? Jarred mentioned that they did on several occasions. Hell, he even devoted an entire page to the issue! Reply
  • mczak - Tuesday, September 20, 2011 - link

    I'm wondering if you can still "make your own driver". This is exactly what I did for a Thinkpad T400 with switchable graphics, since the provided driver was so old and buggy. The "monolithic" driver isn't really all that monolithic; it basically consists of a standard AMD mobility driver (which you can just download if you have the real download link) plus a standard Intel driver in the same package. The .inf file needs to be hacked up, though.
    (So I used an old switchable driver to see what the .inf looked like, plus a new Intel and AMD mobility driver to make up the new version. It worked quite OK except for some driver-signing warnings, and some bogus mux switching upon suspend/resume with multiple monitors, though I don't think that worked before either.)
    I'd venture a guess that this would still work with the muxless solutions, but it's obviously a huge pain in the ass, and AMD really needs to fix this and just ship drivers that work on all mobile GPUs. The OEMs will NEVER get it right otherwise; whether or not AMD gives them new drivers monthly, they simply won't bother to supply updates.
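    The .inf hack described above mostly comes down to copying your GPU's hardware-ID line from the old switchable .inf into the models section of the newer generic driver's .inf. A minimal sketch of that splice in Python (the section name, device strings, and IDs below are made up for illustration, not taken from a real driver package):

```python
import re

def add_model_line(inf_text: str, section: str, model_line: str) -> str:
    """Insert a device model line into the named [section] of a driver
    .inf, unless an identical line is already present."""
    if model_line in inf_text:
        return inf_text  # already listed, nothing to do
    pattern = re.compile(r"(\[" + re.escape(section) + r"\][ \t]*\r?\n)")
    if not pattern.search(inf_text):
        raise ValueError(f"section [{section}] not found")
    # splice the new line in right after the section header
    return pattern.sub(lambda m: m.group(1) + model_line + "\n", inf_text, count=1)

# hypothetical example: graft a mobility hardware ID into a generic .inf
generic_inf = (
    "[ATI.Mfg.NTamd64.6.1]\n"
    '"AMD Radeon HD 5700 Series" = ati2mtag_Evergreen, PCI\\VEN_1002&DEV_68B8\n'
)
patched = add_model_line(
    generic_inf,
    "ATI.Mfg.NTamd64.6.1",
    '"Mobility Radeon (hypothetical)" = ati2mtag_M9x, PCI\\VEN_1002&DEV_95C7&SUBSYS_20E417AA',
)
```

    This only automates the text edit; you would still hit the driver-signing warnings mentioned above, since the modified .inf no longer matches the package's catalog file.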
    Reply
  • bjacobson - Tuesday, September 20, 2011 - link

    even if they are up to date, that's no guarantee that they're going to work for the first 6 or so months. I've never had much luck getting everything on AMD to work the first time: CrossFire with dual monitors doesn't work on 2x 5770 in Enemy Territory: Quake Wars (had to disable the 2nd display); alt-tab still doesn't work in Unreal Tournament 3 without crashing the game (not in CrossFire, just a single 4890); and it took them several months after switching to the new user interface to bring back the under/overscan ability on the embedded graphics that came on the motherboard we used for our HTPC... just lots of stuff that's always 95% complete with 5% broken, which ends up being really annoying.

    I.e., I wouldn't be surprised if, somewhere in between suspend, hibernate, plugging in an external monitor, and this dynamic GPU solution, something won't work quite right for about 6 months... but that's just my gut speaking, judging by what I've seen before. I'm a big fan of underdogs and still cheer for AMD, but I do have to say Nvidia's drivers (since about the 8800 GT, which is as early as my experience goes) have simply worked with all those quirky setups we needed, and didn't end up breaking later when installing an updated driver.
    Reply
  • OCedHrt - Wednesday, September 21, 2011 - link

    I've always had overscan/underscan available as an option for integrated graphics for the last 2+ years. Yet I've always had problems with nVidia, especially on stability. I don't think stability is always tied directly to the GPU and drivers. Reply
  • Aloonatic - Tuesday, September 20, 2011 - link

    Is that just for more recent mobility radeon systems?

    Just the other day, I was looking to update a laptop with a 4570, but it wouldn't update anything other than the good old CCC.

    (It was a Dell Studio 17, by the way :o) )
    Reply
  • Wolfpup - Wednesday, September 21, 2011 - link

    The article mentions it, but probably doesn't make the statement strongly enough: you CAN'T use AMD's drivers with Sandy Bridge + AMD systems, which is why I don't think they should even be on the market, let alone that anyone should actually buy them.

    I'd been looking at HP's Sandy Bridge systems until learning that. AMD + AMD systems SHOULD work, although the A-series stuff isn't listed yet. But the C-series chips work, so I'd be surprised if the A-series doesn't get officially supported.

    Unfortunately there's unsupported stuff using Nvidia too... apparently Sager's systems aren't supported (along with some Sonys, and I've heard the new high-end Dell ones with Floptimus have issues too...)

    Unfortunately it's hard to find out which notebooks can use normal drivers, even though that's a HUGE selling point!
    Reply
  • inplainview - Tuesday, September 20, 2011 - link

    My 2011 MBP 15 inch supports:

    AMD Radeon HD 6750M:

    Chipset Model: AMD Radeon HD 6750M
    Type: GPU
    Bus: PCIe
    PCIe Lane Width: x8
    VRAM (Total): 1024 MB
    Vendor: ATI (0x1002)
    Device ID: 0x6741
    Revision ID: 0x0000
    ROM Revision: 113-C0170L-573
    gMux Version: 1.9.23
    EFI Driver Version: 01.00.573
    Displays:
    Color LCD:
    Resolution: 1440 x 900
    Pixel Depth: 32-Bit Color (ARGB8888)
    Main Display: Yes
    Mirror: Off
    Online: Yes
    Built-In: Yes

    and:

    Intel HD Graphics 3000:

    Chipset Model: Intel HD Graphics 3000
    Type: GPU
    Bus: Built-In
    VRAM (Total): 512 MB
    Vendor: Intel (0x8086)
    Device ID: 0x0126
    Revision ID: 0x0009
    gMux Version: 1.9.23
    Reply
  • tipoo - Tuesday, September 20, 2011 - link

    We all care deeply. Reply
  • retrospooty - Tuesday, September 20, 2011 - link

    "My 2011 MBP 15 inch supports:"

    Pretty much all laptops these days support both... The question is whether it properly switches between the two, using the low-power internal Intel GPU for day-to-day Windows use and then automatically switching to the Radeon for 3D gaming... Wait, it's a Mac, why would you even bother 3D gaming?

    This makes me ask: why does Apple even bother putting an expensive gaming card in a Mac? The few games that run, run like crap... unless it's there for the Windows/Boot Camp portion. I guess that makes sense. But then, why not get a PC? It's cheaper.
    Reply
  • inighthawki - Tuesday, September 20, 2011 - link

    As someone who doesn't even like Macs, I think you are looking way past the obvious. Many Macs are used for things like Photoshop, which can use hardware accelerated rendering, and Maya, 3DS Max, etc which are pretty demanding 3D modeling and CAD programs. Not all higher end GPUs are JUST for gaming. Reply
  • retrospooty - Tuesday, September 20, 2011 - link

    ???

    That kinda proves my point. It's not for games; it IS good for those things you listed, which are much better suited to pro-series video cards (Nvidia Quadro and AMD FirePro). It doesn't make sense to put gaming cards in.
    Reply
  • inplainview - Tuesday, September 20, 2011 - link

    Well, I'm not 11, so I actually use my Mac for real work and don't waste SSD space putting games on it. Also, as a semi-pro photographer, I tend to use it to the max. Did I explain it enough for you? Reply
  • sigmatau - Tuesday, September 20, 2011 - link

    You forgot the price tag. How much money does it take to support Apple? Reply
  • inplainview - Tuesday, September 20, 2011 - link

    I make A LOT of money so I buy whatever I want. Reply
  • seapeople - Tuesday, September 20, 2011 - link

    Then why don't you buy a bigger SSD so your games don't load slowly? Reply
  • just4U - Tuesday, September 20, 2011 - link

    It's been a looong day.. I need a good chuckle.. thanks! Reply
  • inplainview - Wednesday, September 21, 2011 - link

    Are you serial stupid? I said above that I do not play games. Are you reading challenged? Reply
  • ggathagan - Wednesday, September 21, 2011 - link

    Thanks,
    I was wondering what you ended up buying.
    What're you having for dinner tonight?
    Mom was worried, too. You might want to call her.
    Reply
  • sonofsanta - Tuesday, September 20, 2011 - link

    I, er, I think the phrase you're looking for is "cry foul", not "fowl". Wouldn't make much sense if people started running round shouting "Pheasant!" because nVidia sent you both laptops... Reply
  • MysteriousAndy - Tuesday, September 20, 2011 - link

    ++

    http://dictionary.cambridge.org/dictionary/british...
    Reply
  • swx2 - Tuesday, September 20, 2011 - link

    relevant:
    http://www.vgcats.com/comics/?strip_id=119
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    It's simply a typo (or misspelling if you prefer); I'm fully aware that we "cry foul" and not "fowl". Reply
  • beginner99 - Tuesday, September 20, 2011 - link

    ... makes absolutely no sense.

    "we thought it would be interesting to see just how much performance you give up by gaming with the slower Llano CPU. Ready for some fun? You actually gain 5% performance with the A8-3500M + 6630M compared to the VAIO CA. ...Can you feel my frustration (with Sony) yet?"

    The charts show the i5 being better in every single benchmark, especially StarCraft 2 (29 vs. 62 fps), and you claim the i5 has a 9% lead? Sorry, but by my math it's over 100% faster.

    Again that last paragraph makes 0 sense and contradicts the data.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    Crap... I had the wrong numbers in my spreadsheet. Will update the text, because I was totally flabbergasted by the percentages. Turns out I had numbers from a different laptop listed (but the charts are correct). Thanks for pointing this out! Reply
  • Althernai - Tuesday, September 20, 2011 - link

    "Meanwhile, the 6700M, 6800M, and 6900M could all benefit from dynamic switching (assuming the bugs and other issues are worked out), but no one is doing it."


    Actually, HP's dv6t and dv7t have the 6770M with switchable graphics. It was badly broken upon release (dynamic only = OpenGL only on Intel), but HP has since released a BIOS update which allows you to select between Dynamic and Fixed in the BIOS (not in Windows like the Sony, but at least it's possible now).

    "How have laptops with AMD switchable graphics fared in terms of driver updates over time?"

    Updating drivers on the HP dv6t is not so bad: you need to install AMD's drivers on top of what HP provides, but it seems to work OK as long as you leave out the control center. AMD releases two sets of drivers: the normal type which comes with all kinds of restrictions and the "hotfix" type which can be installed on practically anything (including even FirePro GPUs). You can always get the latter for the current month if there is a problem with downloading the driver.

    "Does AMD’s switchable technology work any better under Linux than NVIDIA’s Optimus?"
    Linux support is terrible: unlike Optimus, which usually at least has the decency to use the Intel GPU with the Nvidia one turned off, on the dv6t the discrete GPU is running (I can tell by the thermal behavior), but no drivers for it can be installed and there appears to be no way to use it. It's partially my own fault for using Scientific Linux (it's the latest 6.1 release, but the kernel is still rather old). I will try with Fedora 16 once it comes out, but my only goal is to get the Radeon card to turn off and to be able to control the brightness; actually switching between GPUs is highly unlikely.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    Thanks for the input -- I've corrected the 6700M mistakes now. Somehow I got it stuck in my brain that 6700M was rebadged 5700M, but that's only the 6300M and 6800M. Thanks also for the updates on Linux--good to know how it does/doesn't work. Reply
  • rflynn88 - Tuesday, September 20, 2011 - link

    Correction required:

    The 6700M series chips do support dynamic switching. I'm not sure if the 6300M series does. The HP dv6t can be optioned out with a Core i5/i7 Sandy Bridge chip and the 6770M with dynamic switching. There was actually a big issue with the dynamic switching not working on the dv6t for OpenGL applications, which was only recently remedied with a BIOS update.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    Corrected -- see above note. I mistakenly confused the 6700M situation with the 6300M/6800M, which are rebadged 5000M parts and would not have the decoupled PCI-E interface to allow for dynamic switching. Reply
  • BernardP - Tuesday, September 20, 2011 - link

    What I would like to do with Optimus is disable it completely and have my notebook always use the Nvidia graphics, even on the desktop. I don't care about battery life, as my notebook is almost never running on battery.

    After searching on the web, I have found no way to disable Optimus. Anybody here have a solution?
    Reply
  • bhima - Tuesday, September 20, 2011 - link

    Yes. Open up your NVIDIA settings. Change your Global Settings preferred graphics processor to the NVIDIA card. Voila! Reply
  • BernardP - Tuesday, September 20, 2011 - link

    As simple as that? I'm hopeful...

    Please allow me a bit of residual doubt, considering, for example, the following discussion where there is a mention of this suggested setting.

    http://superuser.com/questions/282734/how-to-disab...

    However, I'll try your suggestion on a new Alienware laptop with Optimus one of my friends just bought.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    The above change won't disable Optimus so much as always try to run programs on the dGPU. To my knowledge, there is no way to disable it completely, since with Optimus there is no direct link between the dGPU and the display outputs. All data has to get copied over the PCI-E bus to the IGP framebuffer. Reply
  • BernardP - Wednesday, September 21, 2011 - link

    That's also my understanding, hence my doubts... Reply
  • seapeople - Tuesday, September 20, 2011 - link

    I can see not caring about battery life, but you don't care about heat and/or noise either? More power use = more heat = constant fan turning on and off vs silent operation using only the integrated GPU.

    Less heat also equates to longer hardware lifetime.

    I just don't understand why you would want a fat GPU idling while you browse AnandTech instead of the low-power Intel GPU built into your computer.
    Reply
  • BernardP - Wednesday, September 21, 2011 - link

    Because I want a not-so-fat NVidia GPU, such as a GT 520M or GT 525M, to pair with a high-quality 1920x1080 IPS screen... and then use NVidia scaling to set up a custom resolution that will allow my old presbyopic eyes to see what's on the screen. For a 15.6 in. screen, 1440x900 is about right, and with the NVidia custom resolution tools there is no significant loss of quality.

    AMD graphics drivers don't allow setting up custom resolutions. Can Intel graphics drivers do it?

    And I know that one can make things bigger onscreen with Windows settings, but it doesn't work all the time for everything. There is no substitute for setting up custom resolutions.
    Reply
  • Filiprino - Tuesday, September 20, 2011 - link

    Huh... Sony sucks big time on the graphics driver front. Better to forget their Radeon card and OpenCL support? Reply
  • 86waterpumper - Tuesday, September 20, 2011 - link

    The AMD Llano solution is totally worthless as far as I'm concerned until they release some laptops in smaller sizes. If I am going to buy a honking monster laptop, I'm not going to power it with a crippled and slow Llano; I'm going to just get an i5 or i7. When they get some 12- and 13.3-inch stuff to market I'll get interested... Reply
  • duploxxx - Tuesday, September 20, 2011 - link

    That is up to the OEMs to design; there are lots of people these days who want a large screen and a low price, and they seem to target Llano for that. I also want a 13-14" Llano but can't seem to find one...

    Llano slow? Yeah, right; for common tasks it is more than powerful enough, as if anyone would care whether a laptop boots 2 sec faster and Firefox starts 0.5 sec faster. The crapload of SW that OEMs deploy on those machines already makes it horrible even on a so-called super-fast i7 CPU... for the ultimate benchmark experience :D

    Storage is the slow factor in a laptop these days, unless of course you work with an Atom...
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    Filiprino, I don't know about all programs, but it is definitely possible to use OpenCL with the VAIO CA. You just have to switch to manual mode to get it to work, and you'll be running 11.1 drivers. I did get the Bitcoin GUIminer to work with the Sony as a test, for example, but it totally failed to detect the GPU when in dynamic mode. (Note: 68Mhash/sec isn't fast enough to be worthwhile of course, and given the pricing on BTC these days I wouldn't bother trying to get involved with the scheme. It's still a useful benchmark at times, though.) Reply
  • Filiprino - Wednesday, September 21, 2011 - link

    Thank you, although I'm more worried about the driver upgrading viability and the performance improvements they can bring :-/ Reply
  • BryanC - Tuesday, September 20, 2011 - link

    Did I miss the battery life comparison? Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    I'm going to post that in the full review; this was intended to focus solely on switching technology, as it was already plenty long. In case you're wondering, though, here are the numbers. The Acer has a 66Wh battery and the Sony a 59Wh battery.

    Acer 3830TG Optimus:
    Idle: 580 minutes
    Internet: 461 minutes
    H.264: 344 minutes

    Acer 3830TG GT 540M:
    H.264: 248 minutes

    Sony VAIO C IGP:
    Idle: 574 minutes
    Internet: 417 minutes
    H.264: 358 minutes

    Sony VAIO C 6630M:
    Idle: 415 minutes
    Internet: 336 minutes
    H.264: 276 minutes
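    Normalizing the "Internet" runtimes by battery capacity gives a quick minutes-per-watt-hour comparison (a back-of-the-envelope Python check on the numbers above, not part of the official test results):

```python
# capacities (Wh) and "Internet" runtimes (minutes) from the numbers above
batteries_wh = {"Acer 3830TG": 66, "Sony VAIO C": 59}
internet_min = {"Acer 3830TG": 461, "Sony VAIO C": 417}

def minutes_per_wh(model: str) -> float:
    """Runtime per watt-hour: higher means the platform sips less power."""
    return internet_min[model] / batteries_wh[model]

for model in batteries_wh:
    print(f"{model}: {minutes_per_wh(model):.2f} min/Wh")
```

    The Sony comes out slightly ahead per watt-hour (about 7.07 vs. 6.98 min/Wh) despite its smaller battery.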
    Reply
  • fabarati - Tuesday, September 20, 2011 - link

    I'm pretty sure the AMD HD 6700-series are just the HD6600-series with higher clocks or GDDR5. They're all Turks-based.

    It's the HD 6500-series that are rebranded HD 5600/5700s (Redwood). Something that's also pretty obvious if you look at the core configurations.
    Reply
  • chinedooo - Wednesday, September 21, 2011 - link

    Yeah, they are all the same 480-stream-processor chips, but the 6770M has higher clocks and GDDR5. The GDDR5 makes a world of difference. Reply
  • duploxxx - Tuesday, September 20, 2011 - link

    I have a latitude e6520 i7 with NV Quadro

    The GPU switch works well as long as the NV configuration has the right profile for the right game; if the profile isn't available you sometimes face issues (just a black screen; why it doesn't fall back to the Intel GPU at that point, I don't know...).

    The only way to avoid this is to add the exe to the profile, or to right-click the application and select the GPU to run with. Works fine, but I would expect this to be the same with ATI, no?
    Reply
  • powchie - Tuesday, September 20, 2011 - link

    No battery life comparison? Reply
  • DanNeely - Tuesday, September 20, 2011 - link

    This is just a GPU switching comparison, not a pair of full laptop reviews. Reply
  • khimera2000 - Tuesday, September 20, 2011 - link

    It's an Intel GPU switching comparison. They didn't compare AMD switching with an AMD CPU. I think that's an important detail.

    This article's scope is a lot narrower than the title implies.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    This article is over 8000 words and I thought that was long enough. Battery life numbers are posted above if you're interested. Really, the battery life comparison is more of a look at how well Acer and Sony are optimizing their BIOS and drivers for mobile use, and both do quite well (though Sony leads in terms of efficiency, even with a larger LCD). Reply
  • Stuka87 - Tuesday, September 20, 2011 - link

    I have not yet used AMD's switchable setup. Being that it's so new, I am not surprised that it is not yet perfect. nVidia took a while to work out their kinks as well.

    Although, with that said, I still have an issue with my Precision (Optimus with an i7-2620M and Quadro 1000). I use a Matrox TripleHead2Go to drive three displays with a Dell docking station. When I am undocked running off battery, it runs on the Intel graphics. When I dock, it tries to switch to the Quadro. However, I end up with a black screen because it's trying to use the resolution of the built-in screen that Intel was driving before docking. The workaround is to always hibernate before docking; this makes it re-initialize the hardware when it comes out of hibernation.

    But typically, the switch works fine. And I get amazing battery life out of this machine.
    Reply
  • medi01 - Sunday, September 25, 2011 - link

    "I have not yet used AMD's switchable setup. Being that it's so new, I am not surprised that it is not yet perfect"

    Especially on nVidia supplied notebook. ;)
    Reply
  • JarredWalton - Tuesday, September 27, 2011 - link

    Which you can buy from Sony. It was unopened, with absolutely no tampering by anyone other than myself -- and the same goes for the Acer. If anyone can point me to a better AMD + Intel laptop with dynamic switchable graphics (Vostro 3450 with HD 6470M? I don't think so...), let me know. I've also talked several times with AMD and asked them to provide me with new drivers and/or a different laptop. We'll see if they can do so, because honestly I'd love for AMD to have a more compelling offering in this area. Reply
  • mercblue281 - Tuesday, September 20, 2011 - link

    Can you guys please have someone proofread before posting?
    "...and they haven’t been able to get use one yet..."
    In the FIRST paragraph? Really?
    Come on! I know the internet has dumbed down the general population's grammar and spelling but you guys are better than this.
    Reply
  • jeremyshaw - Tuesday, September 20, 2011 - link

    on the first page, you mention "HD6700m line is rebadged HD5600m part"

    which isn't true, since the HD6700m line has 480 shaders, and GDDR5, both of which are lacking from the HD5600m.
    Reply
  • overseer - Tuesday, September 20, 2011 - link

    "If an OEM were willing to commit the resources necessary to at least do bi-monthly driver updates for switchable graphics, that would also be sufficient, but they’d need a proven track record of doing so—something no laptop manufacturer has ever achieved."

    Can't say I agree with you here.

    I have an Acer Aspire 4745G (i3-330M + HD 5650M, manually switchable) that I bought in Apr. 2010. For over a year and a half, Acer has been offering GPU driver updates for it (once a quarter or every 2 months, AFAICR).

    Check the 4745G downloads on Acer's support page and you'll find the latest AMD VGA driver update which came out on 2011/09/07.
    http://us.acer.com/ac/en/US/content/drivers
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    I can't see any other drivers for that laptop other than the original Sept. 2010 driver and a new Sept. 2011 driver. Got a link for the other previous drivers to confirm? Anyway, AMD says they make a driver build available for switchable graphics on a monthly basis, so it sounds like Acer is actually using that. Kudos to them if that's the case. Reply
  • overseer - Tuesday, September 20, 2011 - link

    Perhaps Acer just omitted older versions of drivers they deemed unnecessary. My 4745G was manufactured in Jan. 2010 and the initial CCC was dated late 2009. I recall taking 2 updates: one in summer 2010 (Catalyst 10.3?) and one last week (the latest one). So it's safe to say there have been at least 4 traceable versions of AMD GPU drivers for my model.

    While I can't really convince you that it's a bi-monthly or quarterly update cycle from Acer with the limited evidence, this OEM nonetheless has been keeping an eye on new graphics drivers - something I'd never expected in the first place as an early adopter of AMD switchable.
    Reply
  • MJEvans - Tuesday, September 20, 2011 - link

    You toyed with several ideas for power throttling graphics chips. The obvious ones like turning off cores and working at a different point of the voltage/frequency curve to slash used power are solid.

    Where things turn silly is suggesting the use of only 64 of 256 bits of memory interface. This simply won't work, for a few reasons. Let's assume that for price and performance four 64-bit chips had been selected. The application assets would probably be littered across the 4 slabs of memory, either for wider page accesses to the same content (faster transfer) or for parallel access to unrelated content (say, for two isolated tasks). In any event, the cost of consolidating them, both in time and in energy expended for the moves, would only be worthwhile if it were for a long duration.

    Instead, a better approach would be to follow a similar voltage/frequency curve for medium-power modes. For low-power modes the obvious solution is to keep core assets on one bank of memory and use any other enabled banks as disposable framebuffers. This would allow you to operate at lower clock speeds without impacting performance. Further, if the display output were disabled, you could then deactivate all but the asset bank of memory.

    A patent thicket of some kind probably exists for this stuff, but really I consider these to be things that must be obvious to someone skilled in the art, or even just to someone with logic and basic knowledge of the physics driving current transistors; my college degree is getting stale and I've never been employed in this field.
    Reply
  • Old_Fogie_Late_Bloomer - Tuesday, September 20, 2011 - link

    This might not be feasible, but perhaps AMD and/or Nvidia could do something like what Nvidia is doing with Kal-El: have low-power circuitry that can do the bare minimum of what's needed for Windows Aero, etc. (basically what the integrated graphics chip does in Optimus) and switch it out for the more powerful circuitry when needed. As with Kal-El, this switch could be more or less invisible to Windows, or it could be handled at the driver level.

    Of course, that basically wastes the cost of the unused integrated graphics. Perhaps AMD's APUs could take better advantage of this idea: basically, put a CPU and two GPUs on one die, flipping between the slow, power-sipping graphics and the fast and powerful graphics.
    Reply
  • MJEvans - Tuesday, September 20, 2011 - link

    Actually the Kal-El article explained the key point rather well. The two common options are high speed but a high cost of being 'on' and lower speed but more efficiency per operation at the same speed. Given the highly parallel nature of a graphics solution it makes much more sense to keep the parts that are operating running at faster speed and switch off more of the ones that then aren't needed at all. The main barrier to doing that effectively enough would be the blocks of units used; however if development is occurring with this in mind it might be economically viable. That's a question that would require actual industry experience and knowledge of current design trade-offs to answer. Reply
  • Old_Fogie_Late_Bloomer - Wednesday, September 21, 2011 - link

    Well, the thing that I got from the Kal-El article that was really interesting to me, which I think COULD be relevant to mobile GPU applications, is that, apparently, using this other form of transistor--which is low-leakage but cannot run faster than a certain speed (half a GHz or so)--is sufficiently more efficient in terms of power usage that Nvidia engineers felt that the additional cost is worth it, both in terms of silicon area and the increased cost per part of manufacturing, which, of course, trickles down to the price of the tablet or whatever it's used in. That sounds to me like they feel pretty confident about the idea.

    That being said, I have not the slightest clue what kind of transistors are used in current mobile chips. It might be that GPU manufacturers are already reaping the benefits of low-leakage transistors, in which case there might not be anywhere to go. If they are not, however, why not have just enough low-power cores to run Windows 8's flashy graphical effects, and then switch over to the more powerful, higher-clocked circuitry for gaming or GPGPU applications. I don't know how much it would cost to the consumer, but I'm betting someone would pay $25-$50 more for something that "just works."
    Reply
  • MJEvans - Friday, September 23, 2011 - link

    It seems that either your comment missed my primary point or I didn't state it clearly enough.

    Likely the engineering trade-off favors powering just a single graphics core (out of the hundreds even mainstream systems now have, relative to merely 4 (vastly more complex) CPU cores on 'mid-high' end systems) rather than increasing hardware and software complexity by adding in an entirely different manufacturing technology and tying up valuable die area with a custom low-power version of the thing.

    I find it much more likely that normal use states favor these scenarios:
    a) Display is off,
    a.a) entire gpu is off (Ram might be trickle refreshing).
    b) Display is frozen,
    b.a) entire gpu is off (Ram might be trickle refreshing).
    c) Display is on,
    c.a) gpu has one core active at medium or higher speed
    (This would not be /as/ efficient as an ultra-low power core or two, but I seriously have to wonder if they would even be sufficient; or what fraction of a single core is required for 'basic' features these days).
    c.b) mid-power; some cores are active, possibly dynamically scaled based on load (similar to CPU frequency governing but a bit more basic here)
    c.c) full power; everything is on and cycles are not even wasted on profiling (this would be a special state requested by intensive games/applications).
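    Those scenarios boil down to a simple state selection. A toy sketch (the state names and load thresholds here are invented for illustration, not anything a real driver uses):

```python
def gpu_power_state(display_on: bool, display_frozen: bool, load: float) -> str:
    """Pick a coarse GPU power state per the scenarios above.
    load is a rough 0.0-1.0 estimate of rendering demand."""
    if not display_on or display_frozen:
        return "gpu_off"          # (a)/(b): RAM may trickle-refresh
    if load < 0.1:
        return "one_core_medium"  # (c.a): basic desktop composition
    if load < 0.9:
        return "scaled_cores"     # (c.b): dynamically scale active cores
    return "full_power"           # (c.c): requested by intensive games/apps
```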
    Reply
  • danjw - Tuesday, September 20, 2011 - link

    When I worked for a small game developer, getting ATI to give you the time of day was pretty much impossible, whereas Nvidia was more than happy to help us out with some free graphics cards and support. If AMD continues on this path, they will never be real competition for Nvidia. Reply
  • fynamo - Tuesday, September 20, 2011 - link

    The WHOLE POINT of having switchable graphics is to reduce power consumption and thereby extend battery life, and at the same time provide any necessary 2D acceleration capabilities for the OS and Web browsing.

    I'm disappointed that this review totally misses the mark.

    I've been testing numerous Optimus configurations myself lately and have discovered some SERIOUS issues with [at least the Optimus version of] switchable graphics technology: Web browsing.

    Web browsers today increasingly accelerate CSS3, SVG, and Flash; however, GPUs have yet to catch up to this trend. As a result, rendering performance is abysmal on a Dell XPS 15 with an Intel 2720QM CPU + NVIDIA GeForce 540, and not just with web acceleration: changing window sizes and basic desktop/Aero UI work is like a slideshow. I upgraded from a Dell XPS 16 with the Radeon 3670, and the overall experience has been reduced from a liquid-smooth Windows environment to a slideshow.

    Granted, gaming performance is not bad but that's not the issue.

    Running the latest drivers for everything.

    I was hoping to see this topic researched better in this article.
    Reply
  • JarredWalton - Tuesday, September 20, 2011 - link

    There's really not much to say in regards to power and battery life, assuming the switchable graphics works right. When the dGPU isn't needed, both AMD and NVIDIA let the IGP do all the work, so then we're looking at Sony vs. Acer using Intel i5-2410M. BIOS and power optimizations come into play, but the two are close enough that it doesn't make much of a difference. (I posted the battery life results above if you're interested, and I still plan on doing a full review of the individual laptops.)

    I'm curious what sites/tests/content you're using that create problems with CSS3/SVG/Flash. My experience on an Intel IGP only is fine for everything I've done outside of running games, but admittedly I might not be pushing things too hard in my regular use. Even so, the latest NVIDIA drivers should allow you to run your browser on dGPU if you need to -- have you tried that? Maybe your global setting has the laptop set to default to IGP everywhere, which might cause some issues. But like I said, specific sites and interactions that cause the sluggishness would be useful, since I can't rule out other software interactions from being the culprit otherwise.
    Reply
  • fynamo - Wednesday, September 21, 2011 - link

    Tried all of the driver tweaks and forced hardware acceleration in the browsers, all to no avail. Firefox and Chrome both use only the IGP despite being forced to the dGPU in the NVIDIA control panel.

    In reality, most people aren't going to notice CSS3 sluggishness because very few sites actually employ CSS3 currently. But as a developer of bleeding-edge apps that are indeed using CSS3, and which we are also developing for mobile, I am HIGHLY sensitive to performance.

    As stated - on Optimus, css3 performance sucks. On AMD, css3 performance is orders of magnitude better.

    The other issue is with resizing and dragging windows. I noticed that the "SYSTEM" process in Task Manager (Windows 7 64) spikes to use a single full CPU core while resizing or dragging a window, and the drag / move animation slows to ~10 FPS or less. I did NOT have this problem on my "old" Radeon 3670 machine.

    The same tests on a desktop, also with Windows 7 64 and with a Radeon 6850 (no IGP), show liquid-smooth and no CPU spike.

    I've tested multiple Optimus systems and all have this problem, but my tests with AMD systems have yielded good results each time.
    Reply
  • Spazweasel - Tuesday, September 20, 2011 - link

    When people ask why I stick with nVidia graphics cards, this article sums up all my reasons well:

    1. nVidia for many years has done a much better job of delivering timely driver updates, better driver stability, and multi-GPU scaling. SLI "just works". Crossfire is a crapshoot.
    2. I have never had a problem with a game that was related to an nVidia driver. I cannot say the same of AMD.
    3. AMD certainly has somewhat faster hardware at a given price point, but that doesn't matter if the games crash, if the driver UI sucks, or they can't get their partners to deliver what few driver updates there are.
    4. I have many friends and acquaintances in the gaming industry. Without exception, they have reported that nVidia is much, much easier to deal with and is more responsive to the concerns of game developers than AMD. nVidia will often give you some of their own engineer-time to help you work through a problem, while AMD's response is "RTFM, go away, stop bothering us". This is likely why games have fewer driver-related issues upon initial release with nVidia than AMD; nVidia will help you before your game is on the market (and include the necessary changes to their drivers in advance of the game's release), while AMD is unresponsive during development, and often well into retail.

    Secondarily, never buy a Sony computing product. You'd better be happy with the drivers it comes with, because you're not going to see new ones. Over the years I've had two laptops made by Sony, and both were orphaned within 18 months of purchase (driver updates for OSes that were current when the product was new stopped, and newer OSes never got drivers at all). Sony is terrible at ongoing driver support, regardless of hardware category (video, audio, input devices, peripheral connection hardware). I've come to the conclusion that there is nothing software-related that Sony can get right, on either a technical or an ethical basis, and that planned obsolescence through early termination of software support is explicitly part of their business strategy.

    My most recent AMD experience is a 4870, which was (and is) fast, loud, and unstable. I've thought about a 6570 for an HTPC, mostly for thermal reasons and packaging reasons (if you want a quiet, cool video card capable of moderate detail-level gaming to feed a 720p TV that is low-profile, you're pretty much limited to AMD), so it's about time for me to see if anything's changed. In the meantime, for my heavy-duty gaming machine, it's nVidia and will remain so until AMD's driver team gets its act together, regardless of how nice AMD's hardware is. Seriously, the hardware team at AMD needs to put the beat-down on the Catalyst guys; the driver team is making everyone look bad.
    Reply
  • tecknurd - Wednesday, September 21, 2011 - link

    I completely agree. ATI never wrote reliable and stable drivers. They also gave me the run-around by telling me to update to the latest drivers, which I did at the time, but the graphics drivers still crashed my setup. Now AMD owns ATI and has the same faults ATI did. People say that Radeon graphics are good, but this article shows they do not care about reliability and stability, which are required for a GUI.

    I switched to NVIDIA because of poor driver support from ATI. Driver support for Radeon graphics in Linux is also poor; IMHO, the open source community does a better job writing drivers for Radeon graphics than AMD does.

    I would buy AMD for their CPU but not for their graphics.
    Reply
  • chinedooo - Wednesday, September 21, 2011 - link

    Haha, the dv6t with a 6770M would kill all these other laptops, and it switches perfectly too. I get around 6-7 hours of web browsing on mine. Reply
  • chinedooo - Wednesday, September 21, 2011 - link

    Another difference between the two is the VRAM: the 6700 series uses GDDR5, which makes a world of difference. Reply
  • Hrel - Wednesday, September 21, 2011 - link

    "and the user can add their own custom apps". Does this mean we can pick and choose whether the dGPU is on or off on a per-app basis? I spoke to NVIDIA and they said you CAN do that in the NVIDIA control panel; I just don't know how. I have the Clevo P151HM laptop, so maybe the option isn't even there on mine. I'd still like you guys to tell us how to do this, assuming it's possible.

    Side note, I'm annoyed this laptop only accepts drivers from Clevo, and not from Nvidia.
    Reply
  • tanjo - Wednesday, September 21, 2011 - link

    Three years and it's still not working properly???

    The best solution is to add an ultra-low-power 2D power state to dGPUs.
    Reply
  • orangpelupa - Wednesday, September 21, 2011 - link

    Actually, you can install a GENERIC driver from ATI to update a laptop with switchable graphics.

    Just don't use the auto-detect app from ATI. It's useless; it always refuses to download the driver.

    I've been using an Acer with Intel + Radeon HD 5650 for a long time, and I can always update the ATI driver with the generic package from ATI's website. On the Acer I just install 11-8_mobility_vista_win7_64_dd_ccc.exe.

    If the installer refuses to install, you can update using a modded .inf:
    http://game.bramantya.org/modded-inf-ati-mobility-... (sorry, I have not uploaded the 11.8 modded .inf yet)

    If that still fails, you can update manually from Device Manager.

    Just make sure, before doing any update with a "generic" driver, that the graphics are switched to dGPU mode via the shortcut in the desktop right-click menu.

    This generic-update method works on older laptops with the "screen flickers when switching graphics" behavior, so I don't know whether it works with the new dynamic-switching ATI setups.
    Anyone with the new DYNAMIC switching want to try?
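    For anyone wondering what "modded .inf" means: it usually just adds your laptop's PCI hardware ID to the generic driver's device list so the installer accepts it. A rough, hypothetical example; the section name, device string, and all IDs below are placeholders, and you must take the real hardware ID from Device Manager (GPU > Details > Hardware Ids) instead:

```ini
; Hypothetical modded-.inf fragment -- IDs and section names are placeholders.
; Copy your real ID from Device Manager before editing anything.
[ATI.Mfg.NTamd64.6.1]
"AMD Mobility Radeon HD 5650" = ati2mtag_Mobility, PCI\VEN_1002&DEV_68C1&SUBSYS_12345678
```

    Adding the wrong ID just makes the install fail again, so double-check the VEN/DEV/SUBSYS values character by character.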
    Reply
  • my2cents - Wednesday, September 21, 2011 - link

    Just my 2 cents. I was searching around the web and found a site, some blog, where a guy is building ATI + Intel switchable graphics drivers. I own a Vaio VPC-SA2S9R myself. Just google "leshcat_dot_blogspot_dot_com". Works well so far. Reply
  • RenderB - Wednesday, September 21, 2011 - link

    Sadly, the NVIDIA tool isn't doing much better. I have the same Optimus config as tested, but from Asus. The auto-detect always tells me to go get drivers from clearcube. Reply
  • Death666Angel - Wednesday, September 21, 2011 - link

    "but then why even have dynamic switching in the first place?"
    That is a good question. I myself would prefer a solution that lets me decide what to do. I wouldn't want dynamic switching, because I want to be in control of my hardware as much as possible. Reboot switching is of course a pain in the butt, but I see nothing wrong with a manual shortcut for iGPU/dGPU switching. That would actually be preferable to me.
    But otherwise, thanks for the article.
    Reply
  • mars2k - Wednesday, September 21, 2011 - link

    I purchased a Lenovo W520 on the first day of release. I got the ThinkPad and could never get the graphics to perform correctly; integration with the Intel graphics was lousy. Lenovo's support team could do nothing, so I returned the machine. I'm thinking buggy drivers were the problem.
    My particular problem revolved around using multiple monitors: connecting, disconnecting, or configuring them would often require a reboot. Worse than worthless!
    Reply
  • Wolfpup - Wednesday, September 21, 2011 - link

    “With all my talk of switchable graphics, though, let’s make one thing clear: switchable graphics is not necessarily the Holy Grail of mobile GPUs. The true ideal in my opinion is mobile GPUs that can run fast when needed (i.e. playing games), while also being able to power off large portions of the chip and RAM and get down to IGP levels of power consumption.”

    THANK you for that! That's the first time I've heard that said since all this switchable graphics stuff started.

    As far as I'm concerned, the GPU situation on notebooks is a disaster, and right when I thought we were finally moving in the right direction, this switchable stuff came out.

    As it is now, a HUGE percent of notebooks can't use Nvidia or AMD's normal drivers. Sandy Bridge + AMD shouldn't even be on the market, and shouldn't be considered by anyone. As the article mentions-you can't use normal drivers for those, at least not today. (AMD + AMD should work sooner or later, although the A series CPUs aren't listed yet in the driver download box on AMD.com...my little c50 notebook is supported though, so hopefully the A stuff will be soon too.)

    I just bought Asus' G74, which I've really liked-and one of the reasons I bought it was that it DOESN'T use switchable graphics, and it DOES use normal drivers from Nvidia. Some more expensive systems with even better GPUs can't use the normal drivers, which IMO just isn't acceptable in a $1000 purchase, let alone a $2000+ purchase.

    At any rate, NVIDIA's current GPUs underclock themselves to a very impressive 50MHz, several times slower than even my GeForce 9650GT from three years back. As mentioned, get power gating in there and just get rid of Optimus and AMD's equivalent. Get rid of that complexity, the driver/compatibility weirdness, etc.

    Even once switchable graphics are gone, we're STILL not there yet...as the article mentions, most of Sony and Toshiba's notebooks can't be updated. Sager's stuff apparently can't, etc.

    My Asus N80 from 3 years ago and my G74 both accept Nvidia.com driver updates just like a desktop system, no hacks, nothing weird, it just works. And that's how it should be!
    Reply
  • Anato - Friday, September 23, 2011 - link

    It shouldn't be too difficult to make a GPU that can partition its resources so that parts of the unit can be shut down. Even CPUs can do that, and they execute arbitrary code, whereas a GPU can be more liberal with its architecture. Reply
  • JonnyDough - Friday, September 23, 2011 - link

    "If nothing else, Sony at least knows how to tune their laptops for long battery life."

    Now if they can just make PS3 networking that isn't so hackable, batteries that don't explode, and Blu-Ray discs that don't cost 4X as much as an up-scalable DVD. At least they are smart enough to partner up with AMD.
    Reply
  • pman6 - Friday, September 23, 2011 - link

    AMD's solution definitely seems half assed. Their drivers are always crap.

    ...just like how they rolled out mobile Llano Hybrid CrossFire with crap performance.

    seems like they can never get the drivers right.

    I have a Llano laptop with CrossFire, and you would think they would be able to do split-frame rendering to improve performance, but 99% of the time CrossFire game performance is worse than the discrete GPU alone.

    AMD needs to get their crap together.
    Reply
  • netkcid - Friday, September 23, 2011 - link

    good job trying to switch and run dx11 apps on a dx10 igp... and then complain about it...dur Reply
  • JarredWalton - Saturday, September 24, 2011 - link

    All of the games should run on the Intel IGP; none of them are DX11 only. Double dur. The point is that they're not switching over to the IGP when the laptop is set to do so. But that's not really the major issue -- if you're running a game, you'll want it on the dGPU 99% of the time. The real issue is that the dynamic switching failed to work properly. When the competition can make it work properly, and when your non-dynamic generally can work properly, then your drivers are not finished and certainly not worthy of WHQL certification (which has apparently become a joke these days). Reply
  • medi01 - Sunday, September 25, 2011 - link

    "Before we get to the actual meat of this review, we have a disclaimer to make: both laptops we’re comparing came to us via NVIDIA. Now, before anyone cries “foul!”, let me explain..."

    Let me explain: AMD not sending you whatever you want is not a good enough reason to do hidden ads for nVidia.
    Reply
  • JarredWalton - Tuesday, September 27, 2011 - link

    Thanks for speaking for AMD, but I talked to them over four times throughout the course of this review. I've had this hardware since July, if that helps you figure out how much I've been waiting for AMD to step up with something better. In effect, I told AMD when I first got the laptop, "Look guys, I've got a Sony VAIO C with Intel + AMD dynamic switchable graphics, and there are problems. There are also no recent drivers. This review won't be favorable toward your solution if you can't get me something better." If I had given them a week, or even two, sure, but two months is more than enough time to give me something better if such a thing exists.

    Sticking up for AMD's solution just because it's from AMD is not a reason enough to do so. The solution as it currently stands is broken. It can be fixed, but until it actually IS fixed, I cannot recommend buying it. Like I said, I'd rather have AMD discrete GPU without switchable graphics on a laptop, because then at least I can get driver updates. And if you don't need driver updates, you don't really need a discrete GPU in your laptop.
    Reply
