High Detail Gaming and Asymmetrical CrossFire Misfire

Update, 8/10/2011: Just to let you know, AMD managed to get me a new BIOS that addresses some of the rendering issues I experienced with CrossFire. As you'll read below, I had problems in several titles, and I still take exception to the "DX10/11 only" approach; I can name dozens of good DX9-only games released in the past year. Anyway, the updated BIOS has at least addressed the rendering errors I noticed, so retail Asymmetrical CrossFire laptops should fare better. With that disclaimer out of the way, here's my initial experience from two months back.

So far, the story for Llano and gaming has been quite good. The notebook we received pairs the 6620G fGPU with a 6630M dGPU, however, and AMD has enabled Asymmetrical CrossFire...sort of. The ACF results in the 3DMarks were interesting, if largely academic, so now we're going to look at how Llano performs with ACF enabled at our High detail settings (using an external LCD).

Just a warning before we get to the charts: this is preproduction hardware, and AMD informed us (post-review) that they stopped worrying about fixing BIOS issues on this particular laptop because it isn't going to see production. AMD sent us an updated driver late last week that was supposed to address some of the CrossFire issues, but in our experience it didn’t help and actually hurt in a few titles. Given that the heart of the problem is in the current BIOS, that might also explain why Turbo Core doesn't seem to be working as well as we would expect.

AMD also notes that the current ACF implementation only works with DX10/11 games, and at present that's their plan going forward, as the majority of software vendors say they're moving to DX10/11. While the future might be a DX10/11 world, the fact is that many recent titles are still DX9 only. Even at our "High" settings, five of our ten titles are tested in DX9 mode (DiRT 2, L4D2, Mafia II, Mass Effect 2, and StarCraft II—lots of twos in there, I know!), so they shouldn't show any improvement...and they don't. Of those five titles, four don't support DX10/11 at all (DiRT 2 being the exception), and even very recent, high-profile games are still shipping in DX9 form (e.g. Crysis 2, though a DX11 patch is in the works). Not showing an improvement is one thing, but as we'll see in a moment, enabling CrossFire actually reduces performance by 10-15% relative to the dGPU in these titles. That's the bad news. The good news is that the other half of the games show moderate performance increases over the dGPU.

In case that doesn't make the situation patently clear: CrossFire on our test unit is largely not in what we'd consider a working state. With that out of the way, here are the results we did manage to cobble together:

[Gaming benchmark charts: Battlefield: Bad Company 2; Civilization V; DiRT 2; Left 4 Dead 2; Mafia II; Mass Effect 2; Metro 2033; STALKER: Call of Pripyat; StarCraft II: Wings of Liberty; Total War: Shogun 2]

Given that this is preproduction hardware that won't see a store shelf, the above results are almost meaningless. If ACF can provide at least a 30% average increase, like what we see in TWS2, it could be useful; if it can't, switchable graphics with an HD 6730M would likely be less problematic and provide better performance. The only takeaway we have right now is that ACF is largely not working on this particular unit. Shipping hardware and drivers should be better (they could hardly be worse), but let's quickly run through the results anyway.

If we just look at games with DX10/11 enabled, the story isn't too bad. Not accounting for the rendering issues noted below, ACF is able to boost performance by an average of 24% over the dGPU at our High settings. We didn’t include the Low and Medium results for ACF on the previous page for what should be obvious reasons, but if the results at our High settings are less than stellar, Low and Medium settings are even less impressive. Trimming our list of titles to three games (we tested TWS2 and STALKER in DX9 mode at our Low and Medium settings), ACF manages to average a 1% performance increase over the dGPU at Low and a 14% increase at Medium, but Civ5 still had to contend with rendering errors and Metro 2033 showed reduced performance.
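As a point of reference, here's a minimal sketch in Python of how we summarize that scaling; the frame rates below are hypothetical placeholders rather than our measured results, but the math is the same: each game's ACF result is compared against its dGPU-only result, and the per-game percentage gains are averaged.

    # Minimal sketch of the ACF scaling math; the FPS values are hypothetical
    # placeholders, not our measured results.
    dgpu_fps = {"Civ5": 30.0, "Metro 2033": 20.0, "TWS2": 25.0}  # dGPU alone
    acf_fps = {"Civ5": 36.0, "Metro 2033": 18.0, "TWS2": 33.0}   # dGPU + fGPU (ACF)

    # Per-game percentage change of ACF relative to the dGPU alone.
    scaling = {game: (acf_fps[game] / dgpu_fps[game] - 1.0) * 100.0
               for game in dgpu_fps}

    for game, pct in sorted(scaling.items()):
        print(f"{game}: {pct:+.1f}%")

    # Simple arithmetic mean of the per-game gains, the figure quoted above.
    average_gain = sum(scaling.values()) / len(scaling)
    print(f"Average ACF scaling: {average_gain:+.1f}%")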

In terms of rendering quality, ACF is very buggy on the test system; the default BIOS settings initially resulted in corrupted output in most games and 3D apps, and even with the correct settings we still encountered plenty of rendering errors. In Civilization V, only one of the two GPUs rendered everything properly; units were missing from the other GPU's frames, so units flickered in and out of existence every other frame. At higher detail settings, the corruption was even more severe. STALKER: Call of Pripyat and Total War: Shogun 2 also showed rendering errors and flickering at higher quality settings. Since we don't enable DX10/11 until our High defaults, the rendering issues cropped up right where ACF is supposed to start helping.

Just to be clear: none of this means that Asymmetrical CrossFire is a bad idea; it just needs a lot more work on the drivers and BIOS. If/when we get a retail notebook that includes Asymmetrical CrossFire support, we'll be sure to revisit the topic. Why ACF isn't supported in DX9 remains an open question, and AMD's drivers need a much better interface for managing switchable graphics profiles. A list of all supported games with a central location to change all the settings would be a huge step up from the current UI, and users need the ability to enable/disable CrossFire support on a per-game basis if AMD wants anyone to actually use ACF. We also hope AMD rethinks their "only for DX10/DX11 modes" stance; CrossFire has worked with numerous DX9 games in the past, and what we'd like to see is ACF supporting the same list of games as regular CrossFire. If nothing else, having ACF enabled shouldn't reduce performance in DX9 titles.

In summary: we don't know if ACF will really help that much. We tested Asymmetrical CrossFire on what is, at best, beta hardware and drivers, and it didn't work very well. We want it to work, and the potential is certainly there, but we'll need to wait for a better test platform. To be continued....


  • Brian23 - Tuesday, June 14, 2011 - link

    Jarred/Anand,

    Based on the benchmarks you've posted, it's not very clear to me how this CPU performs in "real world" CPU usage. Perhaps you have it covered with one of your synthetic benchmarks, but by looking at the names, it's not clear which ones stress the integer vs. floating point portions of the processor.

    IMO, a test I'd REALLY like to see is how this APU compares in a compile benchmark against a C2D 8400 and an i3 380M. Those are both common CPUs that can be used for comparisons against other benchmarks.

    Could you compile something like Chrome or Firefox on this system and a couple others and update the review?

    Thanks! I appreciate the work you guys do!
  • ET - Tuesday, June 14, 2011 - link

    PCMark tests common applications. You can read more details here: http://www.pcmark.com/wp-content/uploads/2011/05/P...

    While I would find a compilation benchmark interesting, are you suggesting that this will be more "real world"? How many people would do that compared to browsing, video, gaming? Probably not a lot.
  • Brian23 - Tuesday, June 14, 2011 - link

    Thanks for the link. I was looking for something that described what the synthetic benchmarks mean.

    As for "real world," it really depends from one user to the next. What I was really trying to say is that no-one buys a PC just to run benchmarks. Obviously the benchmark companies try to make their benchmarks simulate real world scenarios, but there's no way they can truly simulate a given person's exact workload because it's going to be different from someone else's workload.

    If we're going down the synthetic benchmark path, what I'd like to see is a set of benchmarks that specifically stresses one aspect of a system (e.g. the integer unit or FPU). That way you can compare processor differences directly without worrying about how other aspects of the system affect what you're looking at. In the case of this review, I was looking at the Computation benchmark listed. After reading the whitepaper, I found out that benchmark stresses both the CPU and the GPU, so it's not really telling me just about the CPU, which is the part I'm interested in.

    Switching gears to actual real world tests, seeing a compile will tell me what I'm interested in: CPU performance. Like you said, most people aren't going to be doing this, but it's interesting because it will truly test just the CPU.
  • JarredWalton - Tuesday, June 14, 2011 - link

    Hi Brian,

    I haven't looked into compiling code in a while, but can you give me a quick link to a recommended (free) Windows compiler for Chrome? I can then run that on all the laptops and add it to my benchmark list. I would venture to say that an SSD will prove more important than the CPU on compiling, though.
  • Brian23 - Tuesday, June 14, 2011 - link

    Jarred,

    This link is a user's quick how-to for compiling chrome:
    http://cotsog.wordpress.com/2009/11/08/how-to-comp...

    This is the official chrome build instructions:
    http://dev.chromium.org/developers/how-tos/build-i...

    Both use Visual Studio Express which is free.

    I really appreciate this extra work. :-)
  • krumme - Tuesday, June 14, 2011 - link

    The first links at the top are sponsored:
    3 times exactly the same i7 + 460! ROFL
    Then 1 i7 with a 540
    Damn - it looks funny, but at least it's not 1024x768 like the preview; it's the most relevant resolution for the market - thank you for that
  • Shadowmaster625 - Tuesday, June 14, 2011 - link

    Man, what is it with this dumb yuppie nonsense? No, I don't want to save $200 because I don't actually work for my money. Hell, if you're even reading this site then it is highly likely that the two places you want more performance from your notebook are games and internet battery life. All this preening about Intel's crippled CPU being 50% faster doesn't mean jack because... well, it's a crippled CPU. It can't play games, yet it has a stupid IGP. Why get all yuppity about such an obvious design failure, so much so that you'd be willing to sneeze at a $200 savings like it means nothing? It actually means something to people who work for a living. Most people just don't need the extra 50% CPU speed from a notebook. But having a game that actually runs does mean something tangible.
  • madseven7 - Tuesday, June 14, 2011 - link

    I'm not sure why people think this is such a crappy cpu. Am I missing something? Wasn't the Llano APU that was tested the lowest of the A8 series with DDR 1333? Doesn't it give up 500MHz-800MHz to the SB notebooks that were tested? Wouldn't the A8 3530mx perform much better? I for one would love to see a review of the A8 3530mx personally.
  • ET - Tuesday, June 14, 2011 - link

    Good comment. This is the highest end 35W CPU, but not the highest end Llano. So it gets commended for battery life but not performance. It will be interesting to see the A8-3530MX results for performance and battery life. It would still lose to Sandy Bridge quite soundly on many tests, I'm sure, but it's still a significant difference in clock speed over the A8-3500M.
  • Jasker - Tuesday, June 14, 2011 - link

    One really interesting thing that isn't brought up here is the amount of power used during gaming. Not only do you get much better gaming performance than Intel, but you also use much less power. Double whammy.
