Other Technical Details and Performance Expectations

So far we’ve discussed the past and near future of AMD’s Enduro/Switchable Graphics, but we haven’t gone into the technical aspects much. We’ve covered most of this previously (and neither AMD nor NVIDIA provide a ton of detail as to how precisely they’re doing the work), but there are a couple other tidbits we wanted to briefly discuss before wrapping up.

At a high level, all of the display outputs on the laptop now connect to the Intel iGPU, and AMD routes the dGPU's rendered frames across the PCIe bus to the integrated graphics and out to the display. Nothing has really changed there: content is copied from the dGPU to the iGPU's framebuffer in some fashion, and you get the ability to switch seamlessly between the two GPUs. We also mentioned earlier that AMD has removed the need to keep the PCIe link active when the dGPU is powered down, which drops the dGPU's power draw from roughly 100mW or less down to 0W.
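
For the curious, here's a minimal sketch of how that topology looks to software; this is our own illustration using the public DXGI API, not anything AMD provided. It simply enumerates the graphics adapters and counts the display outputs each one exposes. On an Optimus laptop the discrete GPU typically shows up as a second adapter with zero attached outputs, since the panel and connectors are wired to the iGPU; under Enduro's linked mode the dGPU may not even be enumerated separately, which ties into the game detection quirk we mention later.

// Hedged sketch: list DXGI adapters and the number of display outputs each reports.
// On a switchable graphics laptop the display outputs belong to the Intel iGPU.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);

        // Count the outputs (i.e. physical display connections) owned by this adapter.
        UINT outputs = 0;
        IDXGIOutput* output = nullptr;
        while (adapter->EnumOutputs(outputs, &output) != DXGI_ERROR_NOT_FOUND) {
            output->Release();
            ++outputs;
        }

        wprintf(L"Adapter %u: %s, %u display output(s)\n", i, desc.Description, outputs);
        adapter->Release();
    }
    factory->Release();
    return 0;
}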

One thing that hasn't changed is AMD's use of linked display adapter (LDA) mode, where NVIDIA uses multi-adapter mode, but we now have an explanation of why the difference exists. As far as we can tell, neither mode is inherently superior for general use. The primary reason AMD uses LDA is that they also have a chipset business, whereas NVIDIA has bowed out of making chipsets. This matters because LDA is what facilitates AMD's Dual Graphics (formerly Hybrid CrossFire), where the dGPU and the iGPU work together to render a scene. That's less important on Intel platforms, as AMD isn't trying to do any cooperative rendering with Intel iGPUs; they potentially could in the future if desired, but that seems unlikely given the difficulty of getting even similar GPUs to work together. AMD also indicates that LDA provides full support for Windows 8 Metro applications. We would assume NVIDIA supports Metro apps as well, and unless that proves not to be the case (we should know soon enough), Dual Graphics aside, Enduro and Optimus appear to be essentially at parity in how they function, with software and drivers being the key differentiator.
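
Related to that software angle, and strictly as an aside from us rather than something AMD covered in the briefing: applications can hint the driver about which GPU they want. NVIDIA documents an exported NvOptimusEnablement symbol that asks Optimus to run a process on the discrete GPU, and AMD's later Enduro/PowerXpress drivers recognize an analogous AmdPowerXpressRequestHighPerformance export; the snippet below shows the idea.

// Hedged illustration: exporting these symbols from a game's executable asks the
// driver to run the process on the discrete GPU by default. NvOptimusEnablement is
// documented by NVIDIA for Optimus; AmdPowerXpressRequestHighPerformance is the
// analogous export recognized by later AMD Enduro/PowerXpress drivers.
#include <windows.h>

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main() {
    // A real game would create its rendering device here; with the exports above,
    // the driver should route rendering to the dGPU rather than the iGPU.
    return 0;
}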

Something else we're still waiting to see is the packaging of the new Mobility Catalyst drivers. AMD didn't provide us with the actual installation files; they installed the drivers for us, as the package was still in a rather early state. That being the case, we can't say for certain whether the Mobility Catalyst drivers for Enduro systems will be completely independent of the Intel iGPU drivers, but that appears to be the case. If all goes as planned, once the Enduro-enabled Catalyst drivers start coming out, you will be able to update your AMD dGPU drivers separately from your Intel iGPU drivers without any trouble.

Performance Expectations

AMD let us borrow a Sager notebook for a short time after the preview to test out the new "Enduro 5.5" drivers, and they also helped us install the drivers on a Clevo P170EM system from AVADirect. We'll be providing a full review with performance data for the P170EM in the near future, but in the meantime we wanted to show off the Sager notebook as well as discuss performance expectations. Here's a rundown of the system specs for the Sager NP9150.

Sager NP9150 / Clevo P150EM Specifications
Processor          Intel Core i7-3720QM (quad-core 2.60-3.60GHz, 6MB L3, 22nm, 45W)
Chipset            Intel HM77
Memory             8GB (2x4GB) DDR3-1600
Graphics           Intel HD 4000 (16 EUs, up to 1250MHz)
                   AMD Radeon HD 7970M 2GB GDDR5 (1280 cores @ 850MHz, 256-bit 4800MHz RAM)
Display            15.6" WLED glossy 16:9 1080p (1920x1080)
Storage            180GB Intel 520 SSD
Operating System   Windows 7 Home Premium 64-bit
Price              $1,919 as configured (9/05/2012)

The Sager unit is their rebranded Clevo P150EM, and it has many of the same design issues we've seen from Clevo in the past. The backlit keyboard with zoned lighting is a new addition, and they've tweaked the keyboard layout as well. Interestingly (and frustratingly), while they've mostly fixed my complaints with the 10-key layout, they went ahead and screwed up the main keyboard layout: the Windows key is now to the right of the spacebar, and there's a second backslash key just to the right of the spacebar as well. I'm also not a fan of the tactile feedback from the keys, though it's not terrible. Outside of the keyboard quirks and the abundant use of plastic for the chassis, the performance is certainly there.

We ran through our current suite of games at the native 1080p resolution on the P150EM with settings maxed out in most titles. Total War: Shogun 2 wouldn't allow us to select Very High settings (a problem we've encountered on other systems in the past, where the game detects the video memory and/or feature set of the iGPU rather than looking at the dGPU), but otherwise we get very respectable frame rates. Civilization V continues to be a bit sluggish at max settings (around 26FPS), but the brutal Battlefield 3 manages 36FPS and could easily reach 40+ FPS if you disable 4xMSAA and just use FXAA. Those are the three lowest performing games we tested; everything else runs smoothly in the 45+ FPS range. A quick look at the last GTX 580M system we tested shows the 7970M coming out ahead in over half of the games and slightly behind in the other three titles. We'll have a second look at the P170EM with a GTX 680M from AVADirect shortly after our full HD 7970M review, so stay tuned.

Finally, AMD did inform us that the current drivers aren't fully optimized for performance (particularly with the 7970M), so we should hopefully see some gains with the final driver release, or if not then with the release after that. Performance with GCN-based desktop cards was a little erratic from launch up until the latest Catalyst 12.7 drivers. I believe the current beta drivers I'm using also predate 12.7 in some respects (though they're version 9.0.0.0), so if that's the case the official release should clean things up quite a bit.

Comments

  • Wolfpup - Thursday, September 6, 2012 - link

    Whoops, a few spelling mistakes, and I meant to say: why is Anandtech supporting this junk, not AMD... I know why AMD is, as are Nvidia and apparently OEMs that would rather have all sorts of complaints about their systems not working but be able to post a longer battery life on the box.
  • JarredWalton - Thursday, September 6, 2012 - link

    Please give me a list of problems with Optimus that don't involve Linux. I've asked for this -- to you specifically -- numerous times in previous comments. And just for the record, you CAN'T disable Optimus with a BIOS tweak. The only way to get rid of Optimus on a laptop that has it is to buy a new laptop -- unless the laptop was designed to allow that (e.g. some Alienware models). An Optimus GPU has no direct connection to the display outputs, so if you want to turn off the iGPU you would have no display at all.

    As to whether consumers want working dynamic graphics support: *I* want it, and most people who want a laptop that can both play games and last more than three hours on battery want it. Colleges and universities are full of students who carry around laptops, and those who play games all want Optimus or a similar technology. In fact, it's so desirable that even Apple has gone the switchable graphics route on MacBook Pro laptops for the past three years (though granted they only have to support a very small subset of hardware and their own OS).

    In an ideal world, we'd have a discrete GPU that can do basic work like an IGP while only consuming <1W. The problem is that when you have higher end GPUs that have 2GB RAM and all the other stuff, idling at 1W isn't likely to happen any time soon. They need to be able to shut off all VRAM except for a small amount, power down nearly all of the GPU (maybe 48 CUDA cores or 80 Stream Processors could stay active), and drop clocks way down. Then they need to be able to power all the other components back up without any delays and gracefully handle scaling of power use with demands. As complex as switchable/dynamic graphics might be, doing all of the above is even more so. That's why AMD and NVIDIA are working on Enduro and Optimus (though I assume there's also work being done to bring idle power use way down as well).
  • Vozier - Thursday, September 6, 2012 - link

    I think MANY users are pleased with switchable graphics, just not GAMER users.
    But don't mix things up: this article is some of the best news we can get about ENDURO and its improvements.
    I know it's late for many, but don't trash it, or you might as well get the thing removed...

    LET'S BUILD
    not DESTROY....

    regards
    Voz
  • extide - Thursday, September 6, 2012 - link

    I am a more knowledgeable consumer than you; I own a P150EM with a 680M, and I am GLAD it has Optimus. That was a feature I WANTED.

    Optimus/Enduro, when working CORRECTLY, IS ACTUALLY what everyone WANTS. It is supposed to save battery and deliver the best gaming performance, which is what everyone wants. Right now it just doesn't work correctly in all cases, and that's what people don't want.

    As a side note, I don't think Optimus on the 680M is actually working correctly; I think it is running the dGPU all the time and not shutting it down all the way, but that is a whole different topic.
  • JarredWalton - Thursday, September 6, 2012 - link

    If it's running the dGPU all the time, your battery life should be around 60 to 90 minutes tops -- less if you're using the GPU to play a game. Also, at least on the P170EM, Clevo clearly hasn't invested a lot of effort into optimizing power states when the system is idle or under a light load. I'll have the full review shortly, but basically the P170EM (NVIDIA or AMD variant) draws around 20W while idle; it should be more like 10-11W, indicating there's a whole lot of extra power being used by the motherboard and other accessories. Big OEMs like Dell, Samsung, Sony, HP, etc. usually put a lot more effort into power optimizations and it shows.
  • Hrobertgar - Friday, September 7, 2012 - link

    About 2 years ago I bought a Dell XPS with a 420M and switchable graphics, not because I wanted switchable graphics; I just liked the system's specs. The first thing I did was tell the system to use the dGPU 100% of the time, since I use it plugged into the wall 99% of the time (it just means I can travel with a game-capable system).

    My experience was that when playing WoW unplugged it lasted maybe 20-30 minutes. Fortunately that is a rare situation, but I can relate to the suggestion in the article that switchable systems should probably default to the dGPU when plugged in and the iGPU when on battery power. I mean, the entire point of having the dGPU is to use it, and if it's plugged into the wall then I don't care about power usage; a laptop already uses less power than a desktop, so why throttle beyond that?

    For those that think NVIDIA can do no wrong: a couple months ago I updated the BIOS and drivers, and somewhere in the process my frame rates cratered. Not being as fancy as many of you, it took me a while to discover that one of the updates reverted to defaulting to the iGPU 100% of the time. The new software prevented me from going 100% dGPU as before, and I basically had to tell it that internet and games should use the dGPU; I have added a dGPU usage icon to verify that is the case, and my frame rates were restored. I do not know if the issue was Dell or Intel or NVIDIA, but I do have an NVIDIA system and there was a small issue with the switchable graphics.
  • JarredWalton - Friday, September 7, 2012 - link

    Normally, with Optimus you can just set the global profile to use "High-performance NVIDIA processor" and you're done. Of course, some of the newer driver profiles have things like iexplorer.exe, firefox.exe, chrome.exe, and probably a bunch of other "light" processes set to the Integrated graphics (which will override the global profile). In practice of course, I prefer to simply set Optimus laptops to "Auto" and things work properly 99% of the time for me.

    YMMV, naturally -- personally, I don't want my laptop running the dGPU all the time if I don't need it, as it's simply more heat being generated. More heat means the fans work harder, creating more noise and also potentially wearing out sooner rather than later. But then, I have a desktop for regular non-travel use, so it's not quite that critical that my laptops perform optimally all the time.
  • seapeople - Saturday, September 8, 2012 - link

    I cannot fathom how any knowledgeable person could think switchable graphics are a universally bad thing on laptops. If you're a 100% gamer, that's what desktops are for. If you're a heavy gamer, same thing. Desktops are cheaper, more reliable, easier to upgrade, and give much better performance. If you're so freaked out about it that switchable graphics seem like the devil to you, THEN GET A DESKTOP.

    For the rest of the population, switchable graphics allow you to get at least 2x and maybe more battery life while your laptop runs cooler and quieter for almost every non-gaming application out there. It's a big deal.
  • arcticjoe - Saturday, September 8, 2012 - link

    What about people who travel a lot, are in college, or need a mobile platform? Should I waste money on buying a gaming PC and a laptop for mobile work, or just get a machine that does both instead?
    Current-gen laptops are very close to high-end PCs in terms of performance; most can overclock their CPUs to 4GHz+, and the GPUs are quicker than last gen's flagship cards (GTX 580 and Radeon 6970).
  • johnxfire - Thursday, September 6, 2012 - link

    I've got a 7970M on my P150HM since my HD6990M died and I was too damned to send in my whole laptop to get it repaired.

    The 7970M without Enduro is a dream. Really fast, and temps stay below 80C. Hopefully P150/170EM owners will get to experience the full potential of their 7970Ms.
