Blu-ray & Flash Video Acceleration

Compatibility is obviously a strong point of Brazos. So long as what you're decoding can be hardware accelerated, you're pretty much in the clear. But what about CPU utilization while playing back these hardware accelerated formats? The CPU still has to feed data to the GPU, so how many cycles are used in the process?
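The utilization numbers below are simply min/avg/max figures sampled over a fixed playback window. As a rough illustration of that kind of measurement, here's a minimal sketch assuming Python and the psutil library (the review's actual logging tool isn't specified):

```python
# Hypothetical sketch of the measurement approach: sample total CPU
# utilization once per second over a playback window and report the
# min/avg/max, as in the tables below. Assumes the psutil library.
import psutil

def sample_cpu_utilization(duration_s: int = 30, interval_s: float = 1.0):
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent() blocks for interval_s and returns utilization
        # averaged across all logical processors.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return min(samples), sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    lo, avg, hi = sample_cpu_utilization(duration_s=30)
    print(f"Min {lo:.1f}%  Avg {avg:.1f}%  Max {hi:.1f}%")
```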

I fired up a few H.264/x264 tests to kick off the investigation. First we have a 1080p H.264 Blu-ray rip of Quantum of Solace, averaging around 15Mbps:

Quantum of Solace 1080p H.264 CPU Utilization (1:00 - 1:30)
Platform          Min     Avg     Max
AMD E-350         22.7%   27.8%   35.3%
Intel Atom D510   Fail    Fail    Fail
Zotac ION         14.6%   17.2%   20.1%

A standard Atom platform can't decode the video at all, but ION manages a 17% average CPU utilization with an Atom 330. Remember that the Atom 330 is a dual-core CPU with SMT (four threads total), so the reported 17.2% is 17.2% of four hardware threads, or roughly 34.4% of two cores. The E-350, by comparison, keeps 27.8% of its two cores busy during this test. Both systems have more than enough horsepower left over to do other things.
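To make the thread-versus-core arithmetic above explicit, here's a quick worked example (a sketch in Python; the figures are the ones from the table above):

```python
# Rescale a utilization figure reported across all logical processors
# into a per-physical-core equivalent.
def core_equivalent_utilization(reported_pct: float,
                                logical_threads: int,
                                physical_cores: int) -> float:
    return reported_pct * logical_threads / physical_cores

# Atom 330 (ION): 2 cores with SMT = 4 threads, 17.2% reported average.
print(core_equivalent_utilization(17.2, 4, 2))  # 34.4 -> ~34.4% of two cores

# E-350: 2 cores, no SMT, so the reported 27.8% already maps to two cores.
print(core_equivalent_utilization(27.8, 2, 2))  # 27.8
```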

Next up is an actual Blu-ray disc (Casino Royale) but stripped of its DRM using AnyDVD HD and played back from a folder on the SSD:

Casino Royale BD (no DRM) CPU Utilization (49:00 - 49:30)
Platform          Min     Avg     Max
AMD E-350         28.1%   33.0%   38.4%
Intel Atom D510   Fail    Fail    Fail
Zotac ION         17.7%   22.5%   27.5%

Average CPU utilization here for the E-350 was 33% of two cores.

Finally, I ran a full-blown Blu-ray disc (Star Trek) while bitstreaming TrueHD on the E-350 to give you an idea of worst-case CPU utilization on Brazos:

Star Trek BD CPU Utilization (2:30 - 3:30)
Platform          Min     Avg     Max
AMD E-350         29.0%   40.1%   57.1%

At 40% average CPU utilization there's enough headroom to do something else while watching a high-bitrate 1080p movie on Brazos. The GPU-based video decode acceleration does work; however, the limits here are clear. Brazos isn't going to fare well as a platform for heavy multitasking while decoding video, even if the video decode is hardware accelerated. For a value/entry-level platform, I doubt this needs much more explanation.

Now let’s talk about Flash.

I ran through a number of Flash video tests on both YouTube and Hulu, ranging in resolution from 480p all the way up to 1080p. I used Flash 10.1 and the public 10.2 beta, as well as an unreleased build of the 10.2 beta provided by AMD.

[Chart: Flash Video Playback CPU Utilization - YouTube 720p]

[Chart: Flash Video Playback CPU Utilization - Hulu 480p]

For the most part GPU accelerated Flash video does work well. Performance under both YouTube and Hulu was flawless, provided that I wasn’t watching 1080p content. Watching 1080p content in YouTube wasn’t entirely smooth on Brazos, despite posting very reasonable CPU utilization numbers.

[Chart: Flash Video Playback CPU Utilization - YouTube 1080p]

I took my concerns to AMD and was told that this was a known issue with Brazos and Flash 10.1, and that 10.2 should alleviate it. AMD then supplied me with an unreleased version of Flash 10.2 so I could verify the claim. While 1080p playback improved with AMD's 10.2 beta, it wasn't perfect (although it was very close). AMD wouldn't tell me the cause of the problem, but it's currently working on it with Adobe. At the end of the day I don't believe it's a dealbreaker, but early Brazos adopters should expect some stuttering when playing back 1080p YouTube videos. Note that 720p and lower resolution videos were perfectly smooth on Brazos.

Comments

  • Scootiep7 - Friday, January 28, 2011

    The major problem is that Intel's drivers are flat-out junk for anything gaming/HTPC-related. With ULV Sandy Bridge, you'll be paying three times as much for a complete system that performs less than half as well on most of the stuff you want to do with it in this market segment.
  • redraider89 - Thursday, May 4, 2017

    Yes, yes it is.
  • redraider89 - Thursday, May 4, 2017

    An unreleased processor can't beat an existing processor. So, yes, yes it is. It's absurd when people think that just repeating their "no"s gives what they're saying more legitimacy.
  • Marlin1975 - Thursday, January 27, 2011

    Talk about good for a first try.
    Think about it: AMD's first shot into this area is as good as or way better than the much-updated Atom from Intel.

    Anand, can you ask when AMD thinks it will be able to move this to 32nm? The design seems good; it just needs a shrink to increase the MHz.

    Also, since this will not be in a netbook/laptop, can you overclock?
  • nitrousoxide - Thursday, January 27, 2011

    There will be no 32nm Brazos parts; they will move directly to 28nm next year. Yes: four Bobcat cores, VLIW4 architecture GPUs, and 2x the performance at minimum, or even higher if AMD gives it a Turbo Boost-style feature.

    Overclocking doesn't make any sense on this chip: no matter how you overclock it, it still can't do the things it can't do at the default clock. It simply costs you more power.
  • Marlin1975 - Thursday, January 27, 2011

    Overclock for the desktop, not the netbook.

    It could make some things go from unusable to usable as a low-power desktop replacement.
    I'd like to see what would happen and what would help the most.
  • nitrousoxide - Thursday, January 27, 2011

    Try AMD OverDrive.
  • knedle - Saturday, January 29, 2011

    You can overclock everything (even a notebook), but since the board doesn't support it, it's much harder. ;)
    I'm sure someone will release overclockable boards; it's just a matter of time.
  • Gigantopithecus - Thursday, January 27, 2011

    ...ETA for Brazos mini-ITX availability in retail channels?

    I can't think of a reason to buy anything with an Atom in it, be it a netbook, nettop, or HTPC, given Brazos's performance and power consumption. Can't wait to start building Brazos systems!
  • codedivine - Thursday, January 27, 2011

    I am a little confused. If I put a graphics card in the x16 slot, will it run at x4 PCIe 2.0 speeds or x16 PCIe 2.0 speeds?
