The AMD Llano Notebook Review: Competing in the Mobile Market
by Jarred Walton & Anand Lal Shimpi on June 14, 2011 12:01 AM EST

Introducing Mobile Llano
Anand has provided our coverage of Llano’s architecture and he’ll have a preview of desktop performance, but he’s leaving the mobile coverage to me (Jarred). At a high level, the breakdown of Llano is really quite simple: take a K10.5 series CPU core (dual- or quad-core), pair it up with a DX11-capable GPU core similar to AMD’s Redwood line (5600/5600M or 6500M), and then mix in power gating and Turbo Core; bake everything on a 32nm process and you’ve got Llano. Easier said than done, of course, as K10.5 parts previously used a 45nm process while Redwood used 40nm, so AMD had plenty of work to do before they could realize the simplistic overview I just described. The result is what matters, though, so let’s break out our spoons and see how the pudding tastes. Here’s the overview of the mobile A-series APUs launching today.
AMD A-Series Fusion APUs for Notebooks

| APU Model | A8-3530MX | A8-3510MX | A8-3500M | A6-3410MX | A6-3400M | A4-3310MX | A4-3300M |
|---|---|---|---|---|---|---|---|
| CPU Cores | 4 | 4 | 4 | 4 | 4 | 2 | 2 |
| CPU Clock (Base/Max) | 1.9/2.6GHz | 1.8/2.5GHz | 1.5/2.4GHz | 1.6/2.3GHz | 1.4/2.3GHz | 2.1/2.5GHz | 1.9/2.5GHz |
| L2 Cache (MB) | 4 | 4 | 4 | 4 | 4 | 2 | 2 |
| Radeon Model | HD 6620G | HD 6620G | HD 6620G | HD 6520G | HD 6520G | HD 6480G | HD 6480G |
| Radeon Cores | 400 | 400 | 400 | 320 | 320 | 240 | 240 |
| GPU Clock (MHz) | 444 | 444 | 444 | 400 | 400 | 444 | 444 |
| TDP | 45W | 45W | 35W | 45W | 35W | 45W | 35W |
| Max DDR3 Speed | DDR3-1600 / DDR3L-1333 | DDR3-1600 / DDR3L-1333 | DDR3-1333 / DDR3L-1333 | DDR3-1600 / DDR3L-1333 | DDR3-1333 / DDR3L-1333 | DDR3-1333 / DDR3L-1333 | DDR3-1333 / DDR3L-1333 |
There are two different power envelopes for Llano right now: 35W and 45W. The former models end with an M while the latter end in MX. Don’t let the relatively high TDPs fool you: as with Intel, these are maximum TDP figures, and idle and low-load power draw will be far lower. Based on battery life, it appears that the entire test notebook consumes around 7.42W at idle. By comparison, a slightly larger dual-core SNB notebook consumes around 7.68W when idle, so we’re very close to parity there. As noted earlier, all APU models come with 1MB of L2 cache per core, and Turbo Core allows the cores to clock higher under the right circumstances. That could prove important, as clock-for-clock K10.5 cores can’t hope to keep up with Sandy Bridge, and Sandy Bridge parts are already clocking significantly higher.
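As a rough sanity check on that idle figure, the math is just battery capacity divided by idle runtime. The sketch below uses placeholder capacity and runtime values (they aren't restated in this section) purely to illustrate the calculation:

```python
# Back-of-the-envelope idle power estimate: average draw implied by draining
# a full battery over the measured idle runtime. The 58Wh / 7.82h values
# below are illustrative placeholders, not the review's published test data.
def idle_power_watts(battery_wh: float, idle_hours: float) -> float:
    return battery_wh / idle_hours

print(f"{idle_power_watts(58.0, 7.82):.2f} W")  # ~7.42 W average idle draw
```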
On the CPU side of the equation, there are currently only dual-core and quad-core parts, so tri-core appears dead (or at least MIA for now). The other half of the APU is the GPU cores, and here there are three options. The A6 and A8 APUs are both quad-core, but the A6 has 320 Radeon cores clocked at 400MHz compared to the A8’s 400 cores at 444MHz—so the 6620G is potentially 40% faster. A4 APUs trim the GPU further, with 240 cores clocked at 444MHz, and they’re the dual-core parts; the 6620G could thus be up to 67% faster than the 6480G under the right circumstances. As Anand mentioned, right now all of the A-series APUs are coming from the “big Llano” die, but in the future we’ll see A4 production shift to “little Llano” instead of using harvested die.
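For the curious, those percentages fall straight out of a cores-times-clock comparison; the quick sketch below reproduces the arithmetic. Treat it as a theoretical upper bound only, since memory bandwidth and CPU limits will pull real-world gaps closer together.

```python
# Theoretical shader throughput scaling from Radeon core count x clock alone.
gpus = {
    "HD 6620G (A8)": (400, 444),  # (Radeon cores, clock in MHz)
    "HD 6520G (A6)": (320, 400),
    "HD 6480G (A4)": (240, 444),
}

def ratio(a: str, b: str) -> float:
    return (gpus[a][0] * gpus[a][1]) / (gpus[b][0] * gpus[b][1])

print(f"6620G vs 6520G: {ratio('HD 6620G (A8)', 'HD 6520G (A6)'):.2f}x")  # ~1.39x (~40% faster)
print(f"6620G vs 6480G: {ratio('HD 6620G (A8)', 'HD 6480G (A4)'):.2f}x")  # ~1.67x (~67% faster)
```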
Vision and Radeon Branding
For 2011, AMD is simplifying their Vision branding with Llano, skipping the Premium, Ultimate, and Black modifiers and instead referring to the APU. Vision E2 refers to the dual-core E-series APUs, while the A4, A6, and A8 lines correlate directly with the A-series APUs. The Radeon brand continues as an important asset, so there will be sticker options to promote quad-core and dual-core CPUs with Radeon graphics. What about Dual Graphics, though?
With the integrated GPU finally able to approach the performance of midrange mobile GPUs, AMD is making a return to hybrid CrossFire (IGP and a dGPU working together), though the official name is now apparently “Radeon Dual Graphics” or just "Dual Graphics"; we’ve also heard it referred to as “Asymmetrical CrossFire”, and we’ll use any of these terms throughout this article.
We first saw an attempt at hybrid CrossFire with the HD 2400 and the 790 chipset, and later that extended to HD 3400 cards, but it never really impressed: it was limited to desktops, and you could still get far better performance by spending an extra $10 to upgrade from a 3400 to a 3600 dGPU. The 6620G fGPU is several times more powerful than the old HD 4250 IGP, making CrossFire potentially useful, especially on laptops, where the power savings from shutting off the dGPU are very significant.
With Radeon Dual Graphics, AMD introduces more brands. The various Fusion GPUs (fGPUs) only work in CrossFire with specific discrete GPUs (dGPUs)—nearly all of the 6400M, 6600M, and 6700M lines are eligible—giving rise to several new Radeon names. If you start with a base of a Radeon HD 6620G and add a Radeon HD 6770M to it, the resulting combination is now called a Radeon HD 6775G2; pair it with a 6750M and you get a 6755G2. The full list is shown in AMD’s slide above. For now these names are just going to be listed on the notebook spec sheet; the drivers themselves will report the actual GPU driving the panel you're connected to. AMD is still working out the right way to expose these names through software to avoid confusion.
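To make the naming scheme concrete, here is a hypothetical lookup along the lines an OEM spec-sheet generator might use. Only the two pairings named above are included; AMD's slide covers more combinations that aren't reproduced here.

```python
# Hypothetical mapping of fGPU + dGPU pairs to AMD's Dual Graphics brands.
# Only the combinations mentioned in the article are listed.
DUAL_GRAPHICS_NAMES = {
    ("Radeon HD 6620G", "Radeon HD 6770M"): "Radeon HD 6775G2",
    ("Radeon HD 6620G", "Radeon HD 6750M"): "Radeon HD 6755G2",
}

def combined_name(fgpu: str, dgpu: str) -> str:
    # Fall back to plainly naming both chips when no G2 brand is known.
    return DUAL_GRAPHICS_NAMES.get((fgpu, dgpu), f"{fgpu} + {dgpu}")

print(combined_name("Radeon HD 6620G", "Radeon HD 6770M"))  # Radeon HD 6775G2
```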
177 Comments
ET - Wednesday, June 15, 2011 - link
It may be impossible to know the exact speed the cores run at, but it would be interesting to run a test to get some relative numbers. You can run a single-threaded CPU-bound program such as SuperPI, then run it again with the other three cores at 100% (for example by having another three instances of SuperPI running). Do this on AC and battery, and it might generate some interesting numbers. At the very least we'll be able to tell whether the 1.5GHz -> 2.4GHz ratio looks right.
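A minimal sketch of the relative-clock test ET describes, substituting a plain Python loop for SuperPI; the workload and iteration count are arbitrary, and this is only an illustration, not the review's methodology.

```python
import multiprocessing as mp
import time

def busy(stop_evt):
    # Spin until told to stop, keeping one core at 100%.
    while not stop_evt.is_set():
        pass

def timed_workload(iterations=20_000_000):
    # Stand-in for SuperPI: a single-threaded arithmetic loop.
    start = time.perf_counter()
    x = 0
    for i in range(iterations):
        x += i * i
    return time.perf_counter() - start

if __name__ == "__main__":
    alone = timed_workload()

    stop = mp.Event()
    loaders = [mp.Process(target=busy, args=(stop,)) for _ in range(3)]
    for p in loaders:
        p.start()
    loaded = timed_workload()
    stop.set()
    for p in loaders:
        p.join()

    # With Turbo Core working, the "alone" run should finish noticeably
    # faster, roughly tracking the 1.5GHz base vs 2.4GHz turbo ratio.
    print(f"alone: {alone:.2f}s, other cores loaded: {loaded:.2f}s")
```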
ET - Wednesday, June 15, 2011 - link
By the way, I just read Tom's Hardware review, which was unique in that it compared against a Phenom II X4 running at 1.5GHz and 2.4GHz. It looked from these benchmarks like the A8-3500M is always performing around the 1.5GHz level of the Phenom II X4 (sometimes it's a little faster, sometimes a little slower), which suggests that Turbo Core doesn't really kick in.

i_am_the_avenger - Wednesday, June 15, 2011 - link
Maybe this will cheer the AMD fans a bit. This article did not mention some nifty features the APUs have (or maybe it did and I did not read it line by line).
Watch the video below from Engadget:
http://www.engadget.com/2011/06/13/amds-fusion-a-s...
It shows how these APUs can smooth out shaky videos in real time, even while streaming from YouTube, and it does a very good job.
Another feature is how it enhances videos (colour, etc.).
This improves the general user PC experience, something very desirable.
The video also shows how AMD wants to target general users and not work enthusiasts.
Another video shows a comparison between the i7-2630QM and the A8-3500M while multitasking video-related applications.
http://www.engadget.com/2011/03/01/amd-compares-up...
Interesting to note that the APU gradually increased its power consumption while the i7 was bursting back and forth, maybe something to do with the way Turbo Core acts.
I think it is work vs. general performance:
Intel's great for work, when you need to finish tasks and they need to be done quickly,
while AMD APUs give you good overall PC and multimedia performance - you watch videos, play games, and so what if the zip file extracts a minute late when the fGPU performance is great...
You may buy an i7 SNB with a discrete GPU, but that comes with a battery life hit (for the same battery capacity), extra heat generation which requires more fans, and extra weight.
Please don't start judging me or something...
I am getting confused myself: Intel looks great in every way except stock gaming and battery life (not that bad)... I think I don't need that much power. Even when I work, my work isn't so CPU-oriented that an i7 would matter - a 30-second task finishing in 20 is fine, but it does not matter to me... improved video and battery life seem more useful.
I don't think all of us have to tax our CPUs to their full potential - a few have to, not counting them - so even if Intel has faster processors, for many it does not matter as much.
psychobriggsy - Wednesday, June 15, 2011 - link
For all your moaning about not getting Asymmetric CrossFire to work, you didn't read the reviewers' guide that says it only works in DX10 and DX11 mode, not DX9. Your DiRT 2 benches, for example, clearly state DX9 for this test. I don't know about the other titles on that page - you say five of the others are DX9 titles. Do these titles have DX10 modes of operation? If so, USE THEM. Otherwise it just looks like you are trying to get the best results for the Intel integrated graphics.
Just put "0 - Unsupported" for DX11 tests on the HD 3000, like other sites have done.
ET - Wednesday, June 15, 2011 - link
The article said: "AMD told us in an email on Monday (after all of our testing was already complete) that the current ACF implementation on our test notebook and with the test drivers only works on DX10/11 games. It's not clear if this will be the intention for future ACF enabled laptops or if this is specific to our review sample. Even at our "High" settings, five of our ten titles are DX9 games (DiRT 2, L4D2, Mafia II, Mass Effect 2, and StarCraft II--lots of twos in there, I know!), so they shouldn't show any improvement...and they don't. Actually, the five DX9 games even show reduced performance relative to the dGPU, so not only does ACF not help but it hinders. That's the bad news. The only good news is that the other half of the games show moderate performance increases over the dGPU."
I agree that at least in the case of DiRT 2 that's blatantly false, since that game was one of the first to use DX11 and was bundled with many Radeon 58x0 cards for that reason.
JarredWalton - Friday, June 17, 2011 - link
DiRT 2 supports DX11, but it's only DX9 or DX11. We chose to standardize on DX9 for our Low/Med/High settings -- and actually, DX11 runs slower at the High settings than DX9 does (though perhaps it looks slightly better). Anyway, we do test DiRT 2 with DX11 for our "Ultra" settings, but Llano isn't fast enough to handle 1080p with 4xAA and DX11. So to be clear, I'm not saying DiRT 2 isn't DX11; I'm saying that the settings we standardized on over a year ago are not DX11.

jitttaaa - Wednesday, June 15, 2011 - link
How is the notebook Llano performing as well as, if not better than, the desktop Llano?

ET - Wednesday, June 15, 2011 - link
At least as far as CPU power is concerned, the desktop part is obviously faster. The benchmarks are mostly not comparable so it's hard to judge, but in Cinebench R10 the mobile Llano gets 2037 while the desktop gets 3390. I agree that for graphics it looks like the desktop part is performing worse in games, which is strange considering the GPU runs at a faster speed. The only explanation I can think of is that the faster CPU is taking too much memory bandwidth, but it doesn't make much sense since it's been said that the GPU gets priority. It's definitely something worth checking out with AMD.
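For reference, the gap implied by those two Cinebench scores is easy to work out (taking the quoted numbers at face value):

```python
# Ratio of the Cinebench R10 scores quoted above (mobile A8-3500M vs. desktop A8-3850).
mobile, desktop = 2037, 3390
print(f"Desktop Llano scores about {desktop / mobile - 1:.0%} higher")  # ~66% higher
```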
ionave - Thursday, June 16, 2011 - link
http://www.anandtech.com/show/4448/amd-llano-deskt...
On average the A8-3850 is 58% faster than the Core i5 2500K.
Boom. Delivered. You think it's slow? It really isn't. The A8-3850 has about the performance of a DESKTOP i3. If you think that is bad performance, then you don't know what you are talking about. The battery life is amazing for having that kind of performance in a laptop. I'm sorry, but it totally destroys i7 and i5 platforms with the sheer performance it delivers within that amazing battery life.
JarredWalton - Friday, June 17, 2011 - link
Let me correct that for you:
On average, the A8-3850 fGPU (6550D) is 58% faster than the Core i5-2500K's HD 3000 IGP, in games running at low quality settings. It is also 29% faster than the i5-2500K with a discrete HD 5450, which is a $25 graphics card. On the other hand, the i5-2500K with an HD 5570 (a $50 GPU) is on average 66% faster than the A8-3850.
Boom. Delivered. You think that's fast? It really isn't. The 6550D has about the performance of a $35 desktop GPU. If you think that is good performance, then you don't know what you are talking about.
At least Llano is decent for laptops, but for $650 you can already get an i3-2310M with a GT 520M and Optimus. Let me spell it out for you: better performance on the CPU, similar or better performance on the GPU, and a price online that's already $50 below the suggested target of the A8-3500M. Realistically, the A8-3500M will need to sell for $600 to be viable, the A6 for $500, and the A4 for $450 or less.