Power Consumption

Both the A8-3850 and Intel's Core i3-2105 are built on a 32nm process, and both feature extensive power and clock gating. By virtue of having lower power cores, the A8 manages to beat the Core i3 in idle power consumption. Under CPU load, however, the A8-3850 does consume more power, as it simply has more cores that can be loaded up. We also see higher power consumption in 3D gaming, but we get much higher performance and, as a result, much better performance per watt.

Power Consumption Comparison

                                     AMD A8-3850    Intel Core i3-2105
Idle                                 43.6W          51.7W
GPU Accelerated Video Transcoding    126W           85W
3D Gaming (Metro 2033)               126W           101W
CPU Load (x264 Encode)               123W           87.6W
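
As a rough illustration of the performance-per-watt point, the sketch below divides frame rate by measured wall power. Only the power figures come from the table above; the Metro 2033 frame rates are hypothetical placeholders reflecting the roughly 2x IGP advantage described in this review, not measured results.

```python
# Performance-per-watt sketch. Wall power figures are the measured
# 3D gaming (Metro 2033) numbers from the table above; the frame
# rates are ASSUMED placeholders, not benchmark data from this article.
systems = {
    "AMD A8-3850":        {"fps": 40.0, "watts": 126.0},  # fps assumed
    "Intel Core i3-2105": {"fps": 20.0, "watts": 101.0},  # fps assumed
}

for name, stats in systems.items():
    print(f"{name}: {stats['fps'] / stats['watts']:.3f} fps per watt")
```

With those assumed frame rates the A8-3850 works out to about 0.32 fps/W versus roughly 0.20 fps/W for the Core i3-2105: around 60% better performance per watt despite the higher absolute draw.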

Final Words

If you're building an entry-level gaming PC and have to rely solely on integrated graphics, it's clear that Llano is the only solution on the market today. You easily get 2x the frame rates of Intel's Core i3-2105, and you can use that extra headroom to increase resolution, quality, or sometimes both. The performance advantage is just one aspect of what Llano offers in this department: you also get better overall game compatibility, plus DX11 and GPU compute support, although the latter is still missing its killer app.

AMD's dual-graphics (asymmetric CrossFire) is an interesting answer to the argument that you could just buy a cheaper AMD CPU and a low-end discrete GPU and get better performance. For example, a Radeon HD 6570 paired with an Athlon II X4 640 costs around $175 and outperforms a $135 A8-3850. With dual-graphics in play, you could add a discrete GPU to the A8-3850 and, in theory, get better overall performance than the discrete card by itself. In practice, limiting dual-graphics to DX10/11 titles hurts some of its potential. In my opinion, the better solution here would be more aggressive pricing on the Llano APUs. The Athlon II X4 + Radeon HD 6570 is the better buy (unless you want the power savings of the A8), and the only way to truly combat that is for the A8-3850 to drop in price.
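
To make that value argument concrete, here is a back-of-the-envelope sketch computing dollars per average frame for the two builds. The prices come from the paragraph above; the frame rates are hypothetical placeholders chosen only to illustrate the comparison, not benchmark results from this review.

```python
# Cost-per-frame sketch for the two budget builds discussed above.
# Prices come from the review text; frame rates are ASSUMED placeholders.
builds = {
    "Athlon II X4 640 + Radeon HD 6570":    {"price": 175.0, "fps": 32.0},  # fps assumed
    "A8-3850 (integrated Radeon HD 6550D)": {"price": 135.0, "fps": 22.0},  # fps assumed
}

for name, build in builds.items():
    print(f"{name}: ${build['price'] / build['fps']:.2f} per average frame")
```

With those placeholder frame rates, the discrete combo comes out to about $5.47 per frame versus roughly $6.14 for the APU, which is exactly why the conclusion argues that a price cut is the A8-3850's most direct counter.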

If gaming isn't something you're going to be doing, then you're better off with Sandy Bridge. And at that point there's no need to spring for the Core i3-2105; the standard Core i3-2100 will do just fine.

Comments

  • Roland00Address - Thursday, June 30, 2011

    1366x768 would be preferable to 1024x768, since very few things use 1024x768 anymore, and 1366x768 is the default resolution for 720p TVs, 18.5-inch monitors, and older cheap monitors bought in the last 3 to 4 years.
  • BigDDesign - Friday, July 01, 2011

    I like to play games. 1024x768 is usable in a pinch. So bragging is out. Who cares?
  • CSMR - Sunday, July 03, 2011

    What do you mean, image quality comparison? If two graphics cards differ in image quality, one of them does not work and needs bug fixes. So your question is really: are there any bugs in the graphics output on this "APU"?

    Unfortunately, sites often do "image quality comparisons", but it is nonsense, actually marketing nonsense.
  • Musafir_86 - Tuesday, July 05, 2011

    -Image Quality (IQ) here means the rendered 3D images of 3D games, the latest of which include Crysis 2, DiRT 3, etc.

    -In most games, different levels of quality (quality settings like low, medium, high) are provided for scalability reasons (so they can cater to a wider range of customers).

    -Different GPUs use different algorithms/techniques or 'tweaks' to squeeze maximum performance at a given quality setting. So, in applying those tweaks, the rendered output sometimes differs from one GPU to another even though the quality level is the same (e.g. High vs. High). Driver maturity is another contributing factor.

    -FYI, AnandTech HAS provided IQ comparisons before, especially when comparing new Radeon and GeForce generations.

    -BTW, I hope the IQ comparison promised by Ganesh will be available soon.

    Regards.
  • ckryan - Thursday, June 30, 2011

    Like the mobile version, Llano on the desktop is actually kinda impressive. Not necessarily for its current performance, but rather as an indication of what to expect from Trinity. With a significant performance boost to both parts of the CPU, I can easily envision my next laptop using Llano's successor.

    While it's impossible not to compare these parts to Sandy Bridge, it's not like a Phenom II X4 is hopelessly obsolete. Llano's desktop power figures are pretty good, but it seems like the 2105 draws way more power than it needs at idle -- my i5-2500K/P67 and a GTX 460 pull 49W from the wall according to my P3 Kill A Watt.

    Still, I'm glad to see AMD's Fusion initiative paying some dividends. It's good to have options, but I think it's time they stopped playing hard to get with Bulldozer.
  • L. - Thursday, June 30, 2011

    Hey... I was just thinking: considering the scaling, would you run those tests again with a decent memory kit?

    I mean, most people buying Llano will probably get 1866 at the price of 1600, considering usual RAM market trends (1333 is dead and rising in price; 1600 is the bottom price, and many kits actually do much more than their SPD rating; 1866 will be the bottom within 3 months, I'd guess; etc.).

    So, what about some really neat kits at 2.2GHz+, or even relatively cheap ones around 2GHz?

    I'm pretty sure that's where Llano will start making sense (or... with quad-DDR5... 8 times the bandwidth should do the trick, right?)
  • L. - Thursday, June 30, 2011

    I forgot one detail: there is no benchmark showing how that 6550D fares when you add a bit of shiny to your gfx settings. Is it really so pitiful that no one would ever consider pressing the button?
  • mino - Sunday, July 03, 2011

    i* cannot handle that, so writing such is a big no-no.
  • jjj - Thursday, June 30, 2011

    No overclocking section :(
    I know the ASRock A75 Extreme6 review covers it a bit, but not nearly enough (it doesn't even tell us idle voltages).
    I was curious to see how it overclocks in a few scenarios:
    - with a discrete GPU (IGP fully sleeping)
    - with 2 CPU cores disabled (if that's even possible)
    - with the GPU starved by limiting the amount of RAM it gets
    - how far it can be underclocked
  • beginner99 - Thursday, June 30, 2011

    Marketed for years as being something special (APU), but seriously, it's nothing special at all. Intel was 1.5 years earlier (Arrandale). Agreed, the power of the IGP was pretty bad, but still...

    This is basically only usable in the mobile version. And there it ain't too bad, especially in terms of power consumption. Considering AMD's mobile track record over the last couple of years, I'd say it's a pretty good comeback. And that was obviously its main target.
