Blu-ray & Flash Video Acceleration

Compatibility is clearly a strong point of Brazos. So long as what you're decoding can be hardware accelerated, you're pretty much in the clear. But what about CPU utilization while playing back these hardware accelerated formats? The CPU still needs to feed data to the GPU; how many cycles are consumed in the process?

I fired up a few H.264/x264 tests to kick off the investigation. First we have a 1080p H.264 Blu-ray rip of Quantum of Solace, averaging around 15Mbps:

Quantum of Solace 1080p H.264 CPU Utilization (1:00 - 1:30)
Platform          Min     Avg     Max
AMD E-350         22.7%   27.8%   35.3%
Intel Atom D510   Fail
Zotac ION         14.6%   17.2%   20.1%

A standard Atom platform can't decode the video at all, but ION manages a 17% average CPU utilization with an Atom 330. Remember that the Atom 330 is a dual-core CPU with SMT (four threads total), so you're actually seeing 17.2% of four hardware threads used, which works out to 34.4% of two cores. The E-350 by comparison keeps 27.8% of its two cores busy during this test. Both systems have more than enough horsepower left over to do other things.
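If you're curious how numbers like these can be gathered, below is a minimal sketch in Python, assuming the third-party psutil package. It isn't my exact test harness, just an illustration of sampling utilization over a playback window and converting a thread-level percentage into the per-core equivalent discussed above.

```python
# A minimal sketch (not the exact methodology used in this review):
# sample system-wide CPU utilization once per second over a playback
# window and report min/avg/max, then convert the thread-level figure
# into a per-core equivalent as done in the text above.
import psutil

SAMPLE_SECONDS = 30  # e.g. the 1:00 - 1:30 window

samples = []
for _ in range(SAMPLE_SECONDS):
    # cpu_percent(interval=1) blocks for one second and returns the
    # average utilization across all logical processors (HW threads).
    samples.append(psutil.cpu_percent(interval=1))

avg = sum(samples) / len(samples)
print(f"Min: {min(samples):.1f}%  Avg: {avg:.1f}%  Max: {max(samples):.1f}%")

# 17.2% of four hardware threads is the same amount of work as 34.4%
# of two physical cores: scale by threads / cores (4 / 2 = 2x here).
threads = psutil.cpu_count(logical=True)   # 4 on an Atom 330
cores = psutil.cpu_count(logical=False)    # 2
print(f"Per-core equivalent: {avg * threads / cores:.1f}%")
```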

Next up is an actual Blu-ray disc (Casino Royale) but stripped of its DRM using AnyDVD HD and played back from a folder on the SSD:

Casino Royale BD (no DRM) CPU Utilization (49:00 - 49:30)
Platform          Min     Avg     Max
AMD E-350         28.1%   33.0%   38.4%
Intel Atom D510   Fail
Zotac ION         17.7%   22.5%   27.5%

Average CPU utilization here for the E-350 was 33% of two cores.

Finally, I ran a full-blown Blu-ray disc (Star Trek) bitstreaming TrueHD on the E-350 to give you an idea of what worst-case CPU utilization looks like on Brazos:

Star Trek BD CPU Utilization (2:30 - 3:30)
Platform          Min     Avg     Max
AMD E-350         29.0%   40.1%   57.1%

At 40% CPU utilization on average, there's enough headroom left to do something else while watching a high bitrate 1080p movie on Brazos. The GPU-based video decode acceleration clearly works, but the limits are equally clear: Brazos isn't going to fare well as a platform for heavy multitasking while decoding video, even when the decode is hardware accelerated. For a value/entry-level platform, I doubt this needs much more explanation.

Now let’s talk about Flash.

I ran through a number of Flash video tests on both YouTube and Hulu, ranging in resolution from 480p all the way up to 1080p. I used Flash 10.1 and the public 10.2 beta, as well as an unreleased build of the 10.2 beta provided by AMD.

[Chart: Flash Video Playback CPU Utilization - YouTube 720p]

[Chart: Flash Video Playback CPU Utilization - Hulu 480p]

For the most part, GPU accelerated Flash video works well. Performance on both YouTube and Hulu was flawless, provided I wasn't watching 1080p content. Watching 1080p content on YouTube wasn't entirely smooth on Brazos, despite the platform posting very reasonable CPU utilization numbers.

[Chart: Flash Video Playback CPU Utilization - YouTube 1080p]

I took my concerns to AMD and was told that this is a known issue with Brazos and Flash 10.1, and that 10.2 should alleviate it. AMD then supplied me with an unreleased build of Flash 10.2 so I could verify its claims. While 1080p playback improved with AMD's 10.2 beta, it wasn't perfect (although it was very close). AMD wouldn't tell me the cause of the problem, but it's currently working on it with Adobe. At the end of the day I don't believe it's a dealbreaker, but early Brazos adopters should expect some stuttering when playing back 1080p YouTube videos. Note that 720p and lower resolution videos were perfectly smooth on Brazos.
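As an aside, smoothness and CPU utilization are separate questions, which is exactly what the 1080p YouTube results show. Below is a purely hypothetical sketch of one way to quantify stutter from frame presentation timestamps; the frame rate, threshold, and sample data are all assumptions for illustration, not measurements from this review.

```python
# A hypothetical sketch of quantifying stutter independently of CPU
# load: given per-frame presentation timestamps (in seconds), count
# frames whose interval exceeds 1.5x the nominal frame time. The
# timestamps below are illustrative only, not data from this review.
NOMINAL_FPS = 30.0
THRESHOLD = 1.5 / NOMINAL_FPS  # anything slower than 1.5 frame times

def count_stutters(timestamps):
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(1 for dt in intervals if dt > THRESHOLD)

# Example: a 30 fps stream where one frame arrives a full frame late.
frames = [0.000, 0.033, 0.067, 0.100, 0.167, 0.200]
print(count_stutters(frames))  # -> 1
```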
