Asymmetric CrossFire

Asymmetric CrossFire is supported by desktop Llano APUs. You can combine your A6 or A8 with a Radeon HD 6450, 6570 or 6670 and have both GPUs work in tandem. There are some limitations, as we found: asymmetric CF only provides a benefit in DX10 or DX11 games. DirectX 9 titles will still function, but performance will be suboptimal, as you'll soon see.

AMD Radeon Dual Graphics Branding

Discrete GPU    With HD 6550D (A8)    With HD 6530D (A6)
HD 6670         HD 6690D2             HD 6690D2
HD 6570         HD 6630D2             HD 6610D2
HD 6450         HD 6550D2             HD 6550D2
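If it helps to see the branding scheme in code form, it boils down to a simple lookup from (discrete card, APU graphics) to the combined name. A minimal sketch; the dictionary is purely illustrative and simply mirrors AMD's table above:

```python
# Dual Graphics branding as a lookup: (discrete GPU, APU IGP) -> brand name.
# Purely illustrative; the entries mirror AMD's table above.
DUAL_GRAPHICS_BRANDS = {
    ("HD 6670", "HD 6550D"): "HD 6690D2",
    ("HD 6670", "HD 6530D"): "HD 6690D2",
    ("HD 6570", "HD 6550D"): "HD 6630D2",
    ("HD 6570", "HD 6530D"): "HD 6610D2",
    ("HD 6450", "HD 6550D"): "HD 6550D2",
    ("HD 6450", "HD 6530D"): "HD 6550D2",
}

# Example: the pairing used later in this article (HD 6550D APU graphics + HD 6570).
print(DUAL_GRAPHICS_BRANDS[("HD 6570", "HD 6550D")])  # -> HD 6630D2
```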

Getting asymmetric CrossFire working was a breeze. ASRock's BIOS was set up by default to allow a secondary discrete GPU to function in dual-graphics mode; all I had to do was install a Radeon HD 6570. My monitor remained plugged into the motherboard and the driver handled the rest:

If you're running a DX10/DX11 game you can get positive scaling; however, the gains vary. In Crysis we saw a 68% increase in performance over the APU alone, but only a 12% increase over the discrete GPU by itself:

Crysis: Warhead

HAWX showed us bigger gains on both sides. We saw a more than 2x improvement over the APU alone and a 32% increase over the Radeon HD 6570 by itself:

HAWX
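Those paired comparisons can be combined to put all three configurations on one scale. Here's a quick sketch using the gains quoted above, normalized to the APU alone; the helper function is just illustrative, and the flat 2x figure for HAWX stands in for "more than 2x" rather than a measured number:

```python
# Normalize everything to the APU's IGP alone = 1.00, then back out where the
# discrete card alone sits from the two reported gains.
def relative_standings(gain_over_apu, gain_over_discrete):
    dual = 1.0 + gain_over_apu                    # dual graphics vs. APU alone
    discrete = dual / (1.0 + gain_over_discrete)  # discrete card alone vs. APU alone
    return dual, discrete

# Crysis: Warhead -- +68% over the APU, +12% over the HD 6570
dual, discrete = relative_standings(0.68, 0.12)
print(f"Crysis:  APU 1.00 | HD 6570 {discrete:.2f} | Dual Graphics {dual:.2f}")

# HAWX -- roughly 2x the APU, +32% over the HD 6570
dual, discrete = relative_standings(1.00, 0.32)
print(f"HAWX:    APU 1.00 | HD 6570 {discrete:.2f} | Dual Graphics {dual:.2f}")
```

In both cases the HD 6570 alone works out to roughly 1.5x the APU, which puts the dual-graphics gains in context.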

Fire up a DX9 game (or a DX10/11 game in DX9 mode) however, and the results are disappointing. You actually get lower performance than if you had stuck with the discrete GPU alone:

Metro 2033

DiRT 2

Mass Effect 2

StarCraft II

Comments

  • Roland00Address - Thursday, June 30, 2011 - link

    1366x768 would be preferable to 1024x768, since very few things use 1024x768 anymore, and 1366x768 is the default resolution for 720p TVs, 18.5-inch monitors, and older cheap monitors bought in the last 3 to 4 years.
  • BigDDesign - Friday, July 1, 2011 - link

    I like to play games. 1024x768 is usable in a pinch. So bragging is out. Who cares?
  • CSMR - Sunday, July 3, 2011 - link

    What do you mean, image quality comparison? If two graphics cards differ in image quality, one of them does not work and needs bug-fixes. So your question is really, are there any bugs in graphics output on this "APU".

    Unfortunately sites often do "image quality comparisons" but it is nonsense, actually marketing nonsense.
  • Musafir_86 - Tuesday, July 5, 2011 - link

    -Image Quality (IQ) here means the rendered 3D images of 3D games, the latest of which include Crysis 2, DiRT 3, etc.

    -In most games, different levels of quality (settings like low, medium, high) are provided for scalability reasons (so they can cater to a wider range of customers).

    -Different GPUs use different algorithms/techniques or 'tweaks' to squeeze maximum performance at a given quality setting. So, in applying those tweaks, the rendered output sometimes differs from one GPU to another even though the quality level is the same (e.g. High vs. High). Driver maturity is also another contributing factor.

    -FYI, AnandTech HAS provided IQ comparisons before, especially when comparing new Radeon and GeForce generations.

    -BTW, I hope the IQ comparison as promised by Ganesh would be available soon.

    Regards.
  • ckryan - Thursday, June 30, 2011 - link

    Like the mobile version, Llano on the desktop is actually kinda impressive. Not necessarily for its current performance, but rather as an indication of what to expect from Trinity. With a significant performance boost to both parts of the chip, I can easily envision my next laptop using Llano's successor.

    While it's impossible not to compare these parts to Sandy Bridge, it's not like a Phenom II X4 is hopelessly obsolete. Llano's desktop power figures are pretty good, but it seems like the 2105 draws way more power than it needs at idle -- my i5 2500K/P67 and a GTX 460 pull 49W from the wall according to my P3 Kill A Watt.

    Still, I'm glad to see AMD's Fusion initiative paying some dividends. It's good to have options, but I think it's time they stopped playing hard to get with Bulldozer.
  • L. - Thursday, June 30, 2011 - link

    Hey.. I was just thinking, considering the scaling, would you run those tests again with a decent memory kit?

    I mean, most people buying Llano will probably get 1866 at the price of 1600, considering usual RAM market trends (1333 is dead/ price rising - 1600 is bottom price, many kits actually do much more than their SPD - 1866 is bottom within 3 months I guess, etc.).

    So, what about some real neat kits @ 2.2GHz+ or even relatively cheap ones around 2GHz?

    I'm pretty sure that's where Llano will start making sense (or... with quad-ddr5... 8 times the bandwidth should do the trick, right?)
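For reference on the bandwidth side of that question, here is a rough back-of-the-envelope of theoretical peak figures for a dual-channel DDR3 setup like Llano's; the helper function is just illustrative, and peak numbers say nothing about the real-world scaling being asked for:

```python
# Theoretical peak bandwidth for dual-channel DDR3 (2 x 64-bit channels,
# 8 bytes transferred per channel per transfer).
def ddr3_peak_gbs(transfer_rate_mt_s, channels=2):
    return transfer_rate_mt_s * 8 * channels / 1000  # GB/s (decimal)

for speed in (1333, 1600, 1866, 2133):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s")
# DDR3-1333: 21.3 GB/s, 1600: 25.6, 1866: 29.9, 2133: 34.1
```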
  • L. - Thursday, June 30, 2011 - link

    I forgot one detail: there is no benchmark showing how that 6550D fares when you add a bit of shiny to your gfx settings. Is it really so pitiful that no one would ever consider pressing the button?
  • mino - Sunday, July 3, 2011 - link

    i* cannot handle that, so writing such is a big no-no.
  • jjj - Thursday, June 30, 2011 - link

    No overclocking section :(
    I know the ASRock A75 Extreme6 review covers it a bit, but not nearly enough (it doesn't even tell us idle voltages).
    Was curious to see how it overclocks in a few scenarios:
    - with a discrete GPU (IGP fully sleeping)
    - with 2 CPU cores disabled (if it is even possible)
    - with the GPU starved by limiting the amount of RAM it gets
    - how far can it be underclocked
  • beginner99 - Thursday, June 30, 2011 - link

    Marketed for years as being something special (APU), but seriously, it's nothing special at all. Intel was 1.5 years earlier (Arrandale). Agreed, the power of the IGP was pretty bad, but still...

    This is basically only usable in the mobile version. And there it ain't too bad, especially in terms of power consumption. Considering AMD's mobile track record in the last couple of years, I would say it's a pretty good comeback. And this was obviously its main target.
