Testing with AMD GPUs: Doesn't Work Yet

Update 4: AMD has released Catalyst 9.11 with Flash support for Radeon HD 5000 series and 4000 series GPUs. No word on integrated graphics platforms. We've begun testing, but the drivers don't seem to enable H.264 decode acceleration under Hulu at this point; we're waiting for a response from AMD.

Update 3: AMD tells us that Flash 10.1 support is coming later today; we should have a working driver soon.

Update 2: The latest beta drivers from ATI do not enable Flash 10.1 hardware acceleration support (neither the leaked drivers nor the supposed Catalyst 9.11 drivers from ATI's developer site). We're still waiting for ATI to get us a version of their drivers that does enable GPU acceleration under Flash 10.1.

NVIDIA's drivers, however, are publicly available:

Desktop

http://www.nvidia.com/object/winxp_195.55.html

http://www.nvidia.com/object/win7_winvista_32bit_195.55.html

http://www.nvidia.com/object/win7_winvista_64bit_195.55.html

Notebook

http://www.nvidia.com/object/notebook_winxp_195.55.html

http://www.nvidia.com/object/notebook_winvista_win7_195.55.html

http://www.nvidia.com/object/notebook_winvista_win7_x64_195.55.html

Update: The Release Notes now indicate Catalyst 9.11 drivers are required, which would explain our difficulties in testing. We're still waiting on a version of Catalyst 9.11 from AMD that works with Flash 10.1. We will post updated data as soon as we have the driver.

I’d say that my ION testing went pretty smoothly, but the same definitely doesn’t hold true for AMD.

I set up an AMD 785G system (integrated Radeon HD 3200) with an AMD Sempron LE-1150. This is a 2.0GHz, single-core, K8-based processor with a 512KB L2 cache. Definitely faster than an Atom.

The 785G chipset's integrated graphics fully support H.264 decode acceleration and shouldn't have a problem with Flash 10.1. AMD has it on the supported list, so things should be smooth. Unfortunately, the numbers don't agree:

Windowed Average CPU Utilization             Flash 10.0.32.18    Flash 10.1.51.45
Hulu Desktop - The Office - Murder                 97%                 100%
Hulu HD 720p - Legend of the Seeker Ep1            94%                 100%
Hulu 480p - The Office - Murder                    57%                  60%
Hulu 360p - The Office - Murder                    27%                  35%
YouTube HD 720p - Prince of Persia Trailer         90%                 100%
YouTube - Prince of Persia Trailer                  8%                   8%

Not only did CPU utilization not go down, in many cases it went up. I asked Jarred to help me with a sanity check. He had a notebook based on the mobile version of the same chipset with an Athlon 64 X2 QL-64 (dual-core, 2.0GHz) and ran his own numbers:

Windowed Average CPU Utilization             Flash 10.0.32.18    Flash 10.1.51.45
YouTube HD 720p - Prince of Persia Trailer         46%                 46.5%

There was no change in CPU utilization when moving from Flash 10.0 to 10.1.
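For anyone who wants to run a similar sanity check at home, the measurement itself is simple: start the clip, sample overall CPU utilization at a fixed interval for the length of the test window, and average the samples. Below is a minimal sketch of that idea in Python using psutil; it's only an illustration, not the exact tooling either of us used, and the window length is an arbitrary placeholder:

```python
# Minimal CPU-utilization sampler: start video playback, run this script for
# the duration of the test window, then read off the average at the end.
# The 120-second window is an arbitrary placeholder, not our actual test length.
import time
import psutil

SAMPLE_INTERVAL = 1.0   # seconds between samples
WINDOW = 120            # measurement window in seconds (placeholder)

samples = []
end = time.time() + WINDOW
while time.time() < end:
    # cpu_percent(interval=...) blocks for the interval and returns the
    # system-wide utilization measured over that span
    samples.append(psutil.cpu_percent(interval=SAMPLE_INTERVAL))

print(f"Average CPU utilization: {sum(samples) / len(samples):.1f}%")
```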

The two of us did notice something, however. Flash 10.1, although not perfect on AMD hardware, did seem to improve performance. Jarred measured the number of dropped frames under Flash 10.0 and 10.1 in our YouTube HD test:

Windowed # of Frames Dropped (lower is better)   Flash 10.0.32.18    Flash 10.1.51.45
YouTube HD 720p - Prince of Persia Trailer           289 frames          212 frames

There’s a definite improvement in 10.1, but not nearly as much as we saw from NVIDIA.
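To put those two frame counts in perspective, here's a quick back-of-the-envelope calculation. The total frame count is a hypothetical placeholder (we didn't log the trailer's exact length), but the relative improvement doesn't depend on it:

```python
# Compare dropped-frame counts between the two Flash builds.
# TOTAL_FRAMES is a hypothetical placeholder (clip length x frame rate);
# the relative reduction is independent of it.
dropped_flash_10_0 = 289
dropped_flash_10_1 = 212
TOTAL_FRAMES = 150 * 24  # assume ~2.5 minutes at 24 fps

reduction = (dropped_flash_10_0 - dropped_flash_10_1) / dropped_flash_10_0
print(f"Flash 10.0 drop rate: {dropped_flash_10_0 / TOTAL_FRAMES:.1%}")
print(f"Flash 10.1 drop rate: {dropped_flash_10_1 / TOTAL_FRAMES:.1%}")
print(f"Relative reduction in dropped frames: {reduction:.1%}")
```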

I tried a few more things before giving up on AMD. I tossed in a Radeon HD 5850 to see if the integrated GPU was at fault - still no change in CPU utilization. Finally, I upgraded processors and used an Athlon II X2 240 instead of the meager Sempron.

Full Screen (1920 x 1200) Average CPU Utilization        Flash 10.0.32.18    Flash 10.1.51.45
Hulu Desktop - The Office - Murder (Sempron LE-1150)           100%                100%
Hulu Desktop - The Office - Murder (Athlon II X2 240)           80%                 72%

CPU utilization finally went down, but not nearly as much as what we saw with NVIDIA. There’s something not quite right about how AMD’s hardware interacts with the Flash 10.1 preview; I guess that’s why they’re calling it a prerelease.


135 Comments


  • Autisticgramma - Tuesday, November 17, 2009 - link

    I saw all this happening long ago, when Adobe acquired Flash to begin with.

    Adobe used to just make Acrobat Reader; it sucked then and it sucks now. It's just so embedded in any corporate high-wire act it's stoopid. Not to mention all the memory it wants on startup and leaves in memory, etc. - sloppy from day one.

    Macromedia was the company that created Flash (at least to my memory). When Macromedia owned it, it wasn't bloated crapware. Then again, we weren't streaming whole shows, and 720p and 1080p were not the buzzwords of the day.

    I realize homestarrunner and illwillpress are not fully transmitted/encoded video; they are created in Flash, for Flash.
    But I don't see how this is enough to require GPU acceleration; isn't there a way to streamline this? Why doesn't other video kill everything else with such efficiency? Are we sure they're not just accelerating how fast my computer can be exploited? This is a net application.

    I'm not a coder or some software guru, just a dude that works on computers. Could someone explain, or link me to something that explains, how this isn't an encoding issue rather than a NEEDZ M0r3 PoWA issue? Adobe on my GPU - sounds like "Sure, I need some Nike x-trainers for my ears."
  • cosmotic - Tuesday, November 17, 2009 - link

    Flash originally came from FutureSplash.

    You really need to work on your spelling. =/

    Video decode is extremely CPU intensive. This is why most video decode now happens (at least partially) on the GPU.
  • PrinceGaz - Tuesday, November 17, 2009 - link

    Video decode is quite CPU intensive, but nowhere near as heavy as video encoding with decent quality settings. Also, the CPU alone will be able to handle all current HD video formats within a few years, once hexa- and octa-core or higher CPUs are mainstream.

    The situation we are in currently regarding HD video playback of MPEG4 AVC type video is rather like the mid-to-late 1990s with DVD MPEG2 video, where hardware assistance was required for the CPUs of the day (typically around 200-400MHz) and you could even buy dedicated MPEG2 decoder cards. Within a few years, the CPU was doing all of the important decoding work, with the only assistance coming from graphics cards for some later steps (and even that was not necessary, as the CPU could do it easily if required). The same will apply with HD video in due course, especially as the boundary between a CPU and GPU narrows.
  • bcronce - Tuesday, November 17, 2009 - link

    I can watch 1080p (1920x1080) HD videos from Apple's site with 10% CPU, silky smooth. Now, that is 80% of one of my logical CPUs, but that's also some crazy nice graphics.

    A dual-core Core i5 should handle full HD videos with sub-25% CPU usage.
  • Autisticgramma - Tuesday, November 17, 2009 - link

    Thanks for that.

    Misspellers Untie! Engrish is strictly a method of conveying information/ideas.
    If ya get the gist the rest is irrelevant, at least to me.
  • johnsonx - Tuesday, November 17, 2009 - link

    Flash has always had a Hardware Acceleration checkbox, at least in 9 & 10. What did it do?
  • KidneyBean - Wednesday, November 18, 2009 - link

    For video, I think it allowed the GPU to scale the video to the screen size. So now you can maximize or resize the video without it taking up extra CPU resources.
  • SanLouBlues - Tuesday, November 17, 2009 - link

    Adobe is kinda right about Linux, but we're getting closer:
    http://www.phoronix.com/scan.php?page=article&...
  • phaxmohdem - Tuesday, November 17, 2009 - link

    I'm still rocking my trusty 8800GTX card. My heart sank a little bit when I read that G80 cards are not supported. This is the first time since I bought the ol' girl years ago that she has not been able to perform.

    However, I also have an 8600GT that runs two extra monitors in my workstation, and I always do my Hulu watching on one of those monitors anyway, so things may still work out between us for a while longer.
  • CharonPDX - Tuesday, November 17, 2009 - link

    I have an original early 2006 MacBook Pro (2.0 GHz Core Duo; 2 GB RAM, Radeon X1600) running Snow Leopard 10.6.2.

    I not only don't see any difference, but I think something was wrong with your Mac Pro. Hulu 480P and YouTube 720P videos have been fully watchable on my system, in full screen on a 1080p monitor, all along.

    When playing your same Hulu video (The Office - Murder, 480P, full screen) with both versions of Flash, I get a nice stable full frame rate (I don't know how to measure frame rate on OS X, but it looks the same as when I watch it on broadcast TV), with 150% CPU usage. (That's an average; it varies from 130% to 160%, but seems to hover in the 148-152 range the vast majority of the time.)

    And Legend of the Seeker, episode 1 in HD skips a few frames, but is perfectly watchable.
