High Detail Gaming and Asymmetrical CrossFire Misfire

Update, 8/10/2011: Just to let you know, AMD managed to get me a new BIOS to address some of the rendering issues I experienced with CrossFire. As you'll read below, I had problems in several titles, and I still take exception to the "DX10/11 only" approach; I can name dozens of good DX9-only games that have been released in the past year. Anyway, the updated BIOS has at least addressed the rendering errors I noticed, so retail Asymmetrical CrossFire laptops should do better. With that disclaimer out of the way, here's my initial experience from two months back.

So far, the story for Llano and gaming has been quite good. The notebook we received comes with the 6620G fGPU along with a 6630M dGPU, though, and AMD has enabled Asymmetrical CrossFire...sort of. The ACF results in 3DMark were interesting, if mostly academic, so now we're going to look at how Llano performs with ACF enabled at our High detail settings (using an external LCD).

Just a warning before we get to the charts: this is preproduction hardware, and AMD informed us (post-review) that they stopped worrying about fixing BIOS issues on this particular laptop because it isn't going to see production. AMD sent us an updated driver late last week that was supposed to address some of the CrossFire issues, but in our experience it didn't help, and it actually hurt performance in a few titles. Given that the heart of the problem lies in the current BIOS, that might also explain why Turbo Core doesn't seem to be working as well as we would expect.

AMD also notes that the current ACF implementation only works in DX10/11 games, and at present that's their plan going forward, as the majority of software vendors say they will be moving to DX10/11. While the future might be a DX10/11 world, the fact is that many recent titles are still DX9 only. Even at our "High" settings, five of our ten titles are tested in DX9 mode (DiRT 2, L4D2, Mafia II, Mass Effect 2, and StarCraft II—lots of twos in there, I know!), so they shouldn't show any improvement...and they don't. Of those five titles, four don't have any support for DX10/11 (DiRT 2 being the exception), and even very recent, high-profile games are still shipping in DX9 form (e.g. Crysis 2, though a DX11 patch is still in the works). Not showing an improvement is one thing, but as we'll see in a moment, enabling CrossFire actually reduces performance in these titles by 10-15% relative to the dGPU. That's the bad news. The good news is that the other half of the games show moderate performance increases over the dGPU.

In case that doesn't make the situation patently clear: CrossFire on our test unit is largely not in what we consider a working state. With that out of the way, here are the results we did manage to cobble together:

[Benchmark charts: Battlefield: Bad Company 2; Civilization V; DiRT 2; Left 4 Dead 2; Mafia II; Mass Effect 2; Metro 2033; STALKER: Call of Pripyat; StarCraft II: Wings of Liberty; Total War: Shogun 2]

Given that this is preproduction hardware that won't see a store shelf, the above results are almost meaningless. If ACF can provide at least a 30% average increase, like the one we see in TWS2, it could be useful. If it can't manage at least 30%, switchable graphics with an HD 6730M would be less problematic and provide better performance. The only takeaway we have right now is that ACF is largely not working on this particular unit. Shipping hardware and drivers should be better (they could hardly be worse), but let's quickly run through the results we do have.

If we just look at games with DX10/11 enabled, the story isn't too bad. Not accounting for the rendering issues noted below, ACF is able to boost performance by an average of 24% over the dGPU at our High settings. We didn’t include the Low and Medium results for ACF on the previous page for what should be obvious reasons, but if the results at our High settings are less than stellar, Low and Medium settings are even less impressive. Trimming our list of titles to three games (we tested TWS2 and STALKER in DX9 mode at our Low and Medium settings), ACF manages to average a 1% performance increase over the dGPU at Low and a 14% increase at Medium, but Civ5 still had to contend with rendering errors and Metro 2033 showed reduced performance.
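
For reference, the scaling numbers we quote are simple arithmetic means of the per-title percent gains of ACF over the dGPU alone. Here's a minimal Python sketch of that calculation; the frame rates below are hypothetical placeholders for illustration, not our measured results:

    # Average ACF scaling: mean of per-title percent gains over the dGPU.
    # All frame rates here are hypothetical placeholders, not test data.
    dgpu_fps = {"Civ5": 30.0, "Metro 2033": 20.0, "TWS2": 25.0}
    acf_fps = {"Civ5": 36.0, "Metro 2033": 18.0, "TWS2": 33.0}

    def pct_gain(acf, dgpu):
        # Percent change of ACF relative to the dGPU-only result
        return (acf / dgpu - 1.0) * 100.0

    gains = [pct_gain(acf_fps[t], dgpu_fps[t]) for t in dgpu_fps]
    print("Average ACF scaling: {:+.1f}%".format(sum(gains) / len(gains)))

Note how a single regression (Metro 2033 in this made-up example) drags the average down even when the other titles scale reasonably well.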

In terms of rendering quality, ACF is very buggy on the test system. The default BIOS settings initially resulted in corrupted output in most games and 3D apps, and even with the correct settings we still encountered plenty of rendering errors. In Civilization V, only one GPU rendered everything properly; units were missing from the frames produced by the other GPU, so you'd get a flicker every other frame as units appeared and disappeared. At higher detail settings, the corruption was even more severe. STALKER: Call of Pripyat and Total War: Shogun 2 also showed rendering errors and flickering at higher quality settings. And since we don't enable DX10/11 until our High defaults, the rendering issues hit right where ACF is supposed to start helping.
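
The every-other-frame flicker is consistent with alternate-frame rendering (AFR), the load-balancing scheme CrossFire typically uses: the GPUs take turns rendering whole frames, so a defect on one GPU shows up in every other frame. A trivial, purely illustrative sketch of the frame assignment:

    # Alternate-frame rendering (AFR): GPUs take turns, frame by frame.
    # If one GPU drops geometry (as with the missing units in Civ5),
    # every other frame lacks those objects, producing visible flicker.
    def afr_gpu_for_frame(frame_index, num_gpus=2):
        return frame_index % num_gpus

    for frame in range(6):
        print("frame {} -> GPU {}".format(frame, afr_gpu_for_frame(frame)))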

Just to be clear: none of this means that Asymmetrical CrossFire is a bad idea; it just needs a lot more work on the drivers and BIOS. If/when we get a retail notebook that includes Asymmetrical CrossFire support, we'll be sure to revisit the topic. Why ACF isn't supported in DX9 remains an open question, and AMD's drivers need a much better interface for managing switchable graphics profiles. A list of all supported games with a central location to change all the settings would be a huge step up from the current UI, and users need the ability to enable/disable CrossFire on a per-game basis if AMD wants anyone to actually use ACF. We also hope AMD rethinks their "only for DX10/DX11 modes" stance; CrossFire has worked with numerous DX9 games in the past, and what we'd like to see is ACF supporting the same list of games as regular CrossFire. If nothing else, having ACF enabled shouldn't reduce performance in DX9 titles.

In summary: we don't know if ACF will really help that much. We tested Asymmetrical CrossFire on what is, at best, beta hardware and drivers, and it didn't work very well. We want it to work, and the potential is certainly there, but we'll need to wait for a better test platform. To be continued....

Comments

  • phantom505 - Tuesday, June 14, 2011

    I went with a K325 in a Toshiba with a Radeon IGP. Nobody I have lent it out to has ever complained about it being slow or incapable of doing what they wanted/needed. I get about 5 hours of battery life consistently. I don't do much that is CPU intensive, but I hear people moan and groan about both the E-350 and Atom when they try to open 50MB+ PPT files. I have no such problems.

    I for one am quite happy to see that AMD is still leading this segment, since most users will be quite happy with AMD. I'm finding more and more that Intel may own the top end, but nobody I know cares in the slightest.
  • mino - Tuesday, June 14, 2011

    The E-350 is generally faster than the K325 + IGP. Other than that, I fully agree.
  • ash9 - Tuesday, June 14, 2011

    In this price range, I think not. Besides, Open(X) applications will reveal the potential - it's up to the application developers now.
  • GaMEChld - Tuesday, June 14, 2011

    My netbook is a pain to use precisely because of its graphics. It can't play YouTube or movie files smoothly. Aside from its multimedia problems, I don't try to do ridiculous things on a netbook, so the other components are not much of a factor for me. But if I can't even watch videos properly, then it's trash.

    Luckily, I got that netbook for free, so I'm not that sad about it. I'll probably sell it on eBay and get a Brazos netbook at some point.
  • hvakrg - Tuesday, June 14, 2011

    Yes, they're becoming primary machines, but what exactly do you need the CPU part for in a primary machine today? Let's face it: most people use their computer to browse the web, listen to music, and watch videos, all of which either rely on the GPU today or are clearly moving in that direction.

    Intel will probably have an advantage in the hardcore CPU market forever, due to them being years ahead of the competition in manufacturing processes, but what advantage does that give them when it comes to selling computers to the end user? Things like battery life and GPU performance are what will be weighed in the future.
  • Broheim - Wednesday, June 15, 2011

    personally I need it to compile thousands of lines of code, sometimes several times a day; if I were to settle for an E-350 I'd die of old age long before I got my master's in computer science.... some of us actually give our 2600K @ 4.5GHz a run for its money.

    the G in GPU doesn't stand for General... the GPU can only do a few highly specialized tasks; it's never going to replace the CPU and will always rely on it. Unless you're a gamer you benefit much more from a fast CPU than a fast GPU, and even as a gamer you still need a good CPU.

    don't believe me? take an E-350 and do all the things you listed, then strap an HD 6990 onto it and see if you can tell the difference...
    trust me, you can't.
  • ET - Wednesday, June 15, 2011

    Compiling code is a minority application, although I did that in a pinch on a 1.2GHz Pentium M, so the E-350 would do as well. I certainly won't use it for my main development machine, I agree.

    Still, as hvakrg said, most users do web browsing, listen to music, watch video. The E-350 would work well enough for that.
  • sinigami - Wednesday, June 15, 2011

    >most users do web browsing, listen to music, watch video. The E-350 would work well enough for that.

    The Atom also works well enough for that, for less money.

    You might be pleasantly surprised to find that current Atom netbooks can play 720p MKVs. For netbook level video, that's "well enough".

    As you said, for anything tougher than that, I wouldn't use it for my "main machine" either.
  • ionave - Thursday, June 16, 2011

    Why would you spend $2000 on an Intel-powered laptop when you can build a desktop to do computations at a quarter of the price and 20x the speed, and get a $400 laptop to run code on the desktop remotely and use for lighter tasks? I'm surprised that you are a master's student in computer science, because your lack of logic doesn't reflect it. Correct me if I'm wrong, but why would you compute on the go when you can let the code run on a desktop or cluster while the laptop is safely powered down in your backpack?

    Also, I can run Super Mario Galaxy in the Dolphin emulator (CPU intensive) at full frame rate on my AMD Phenom II X2 BE, and the cores in the A8 are improved versions of the Phenom II X4's. You really need to get your facts straight, since the CPU is actually VERY good. Go look at the benchmarks and do your research.
  • Broheim - Thursday, June 16, 2011

    he clearly said primary machine, so before you go around insulting me I'd suggest you learn how to read.
    the 2600K is a desktop CPU, you douchebucket; I never said my main machine was a laptop, quite the contrary.

    what you can and can't do is of no interest to me, but first off, I never mentioned the A8, I said E-350; again with the failure to read.
    nevertheless...
    K10 is not even a match for Nehalem, and it's so far behind Sandy Bridge it's ridiculous.
    I've seen the benchmarks, I've done my research and concluded that the A8 CPU is far from "VERY" good, have you done yours?
