Unified Video Decoder and Playback Pathways

A typical consumer experience revolves heavily around video, and with Carrizo AMD identified significant potential to decrease power consumption and increase performance in a couple of different ways. First up is adjusting the path by which data is moved around the system, particularly as very little video matches the native resolution of the screen and is displayed 1:1.

When a video requires scaling, either because it is made full screen and scaled up or because it is a higher resolution video scaled down, that scaling is typically performed by the GPU. The data leaves the decoder (either hardware or software), enters system memory, moves into graphics memory, is processed by the GPU, moves back out to memory, and is then transferred to the display. This requires multiple read/write operations to memory, keeps the GPU active but underutilized, and happens for every frame. AMD's solution is to place some simple scaling IP in the display engine itself, allowing scaled video to go straight from the decoder to the display engine and leaving the GPU in a low power state.
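To make the difference concrete, below is a minimal sketch that tallies the per-frame DRAM traffic implied by each path. The step lists and the 1080p NV12 frame size are illustrative assumptions on my part, not AMD's own accounting.

```python
# Illustrative comparison of per-frame memory traffic on the two playback
# paths described above. The step lists and the 1080p NV12 frame size are
# assumptions for illustration only, not figures published by AMD.

FRAME_MB = 1920 * 1080 * 1.5 / 1e6  # ~3.1 MB for one 1080p frame in NV12 (4:2:0)

# Traditional path: decoder -> system memory -> graphics memory -> GPU scaler
# -> memory -> display engine. Each entry is one read or write of the frame.
gpu_scaling_path = [
    "decoder writes frame to system memory",
    "frame copied toward graphics memory (read from system memory)",
    "frame copied toward graphics memory (write to graphics memory)",
    "GPU reads frame to scale it",
    "GPU writes scaled frame back to memory",
    "display engine reads scaled frame",
]

# Carrizo path: the display engine's own scaling IP reads the decoded frame
# directly, so the GPU (and its extra copies) stay out of the loop.
display_scaling_path = [
    "decoder writes frame to system memory",
    "display engine reads frame and scales it on the fly",
]

for name, path in [("GPU scaling", gpu_scaling_path),
                   ("Display-engine scaling", display_scaling_path)]:
    traffic = len(path) * FRAME_MB
    print(f"{name}: {len(path)} DRAM passes, ~{traffic:.1f} MB per frame")
```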

The video playback paths at the bottom of this slide show the change graphically, and AMD quotes a drop from 4.8W to 1.9W in power consumption for these tasks. Note that the 4.8W value is for Kaveri, so other enhancements are rolled in as well, but the overall picture is a positive one and AMD quotes around 500mW of APU power savings.

The Unified Video Decoder (UVD) has been built to support the above codecs, with HEVC decode on die as well as native 4K H.264 decode. I'll come back to the 4K element in a second, but what is perhaps missing from this list is VP9, the codec used by Google for YouTube. YouTube is still the number one source for video content on the web, and with Google transitioning more to VP9, and AMD's competition advertising it as a perk on their latest hardware, it is perhaps surprising that AMD left it out. I did ask about this, and was told that they picked HEVC over VP9 as they believe it will be the more important codec going forward, particularly when you consider that the majority of the popular streaming services (Netflix, Hulu, Amazon) will be using HEVC for their high definition titles.

Back to the 4K question: this is possible because AMD has increased the decode bandwidth of the UVD from 1080p to 4K. This affords two opportunities: 4K video decoded on the fly, or 1080p video decoded in a quarter of the time, allowing both the UVD and DRAM to race to sleep. Despite a 75% reduction in active decode time, because the UVD does not use that much power to begin with, this translates into only around 30 minutes of extra video playback time, but it is welcome and contributes to that often marketed 'video playback' number.
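As a quick back-of-the-envelope check on the quarter-of-the-time figure (the resolutions and the 60 fps frame period here are standard values, not AMD-supplied numbers):

```python
# A UVD sized for real-time 4K decode has roughly 4x the pixel throughput
# needed for 1080p, so a 1080p frame finishes in ~1/4 of the frame period and
# the decoder can sleep for the rest. Resolutions and frame rate are standard
# values used for illustration, not AMD-supplied figures.

pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
speedup = pixels_4k / pixels_1080p            # = 4.0

frame_period_ms = 1000 / 60                   # one frame at 60 fps
active_ms = frame_period_ms / speedup         # time the UVD is busy per frame
idle_fraction = 1 - active_ms / frame_period_ms

print(f"Throughput headroom: {speedup:.0f}x")
print(f"UVD active per frame: {active_ms:.2f} ms of {frame_period_ms:.2f} ms")
print(f"Idle (race-to-sleep) fraction: {idle_fraction:.0%}")  # ~75%
```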

Comments

  • Refuge - Wednesday, June 3, 2015 - link

    Built my mother a new system from her old scraps with a new A8, she loves that desktop, and when she put an SSD in it finally she loved it ten times more. The upgrade only cost her $300, for CPU, Mobo, RAM. Threw it together in 45 minutes, and she hasn't had a problem with it in 2 years so far.
  • nathanddrews - Wednesday, June 3, 2015 - link

    I prefer the following setup:
    1. Beast-mode, high-performance desktop for gaming, video editing, etc.
    2. Low-power, cheap notebook/tablet for In-Home Steam Streaming and light gaming (720p) on the go.

    In my use case, as long as I can load and play the game (20-30fps for RTS, 30fps+ for everything else) on a plane ride or some other scenario without AC access, I'm not really concerned with the AA or texture quality. I still want to get the best experience possible, but balanced against the cheapest possible price. The sub-$300 range is ideal for me.
  • AS118 - Wednesday, June 3, 2015 - link

    Yeah, that's my thing as well. High resolutions at work, and at home, 768p or 900p's just fine, especially for gaming.

    I also recommend AMD to friends and relatives who want laptops and stuff that can do casual gaming for cheap.
  • FlushedBubblyJock - Tuesday, June 9, 2015 - link

    Why go amd when HD3000 does just fine gaming and the added power of the intel cpu is an awesome boost overall ...
  • Valantar - Wednesday, June 3, 2015 - link

    1366*768 on anything larger than 13" looks a mess, but in a cheap laptop I'd rather have a 13*7 IPS for the viewing angles and better visuals than a cheap FHD TN panel - bad viewing angles especially KILL the experience of using a laptop. Still, 13*7 is pretty lousy for anything other than multimedia - it's simply too short to fit a useful amount of text vertically. A decent compromise would be moving up to 1600*900 as the standard resolution on >11" displays. Or, of course, moving to 3:2 or 4:3 displays, which would make the resolution 1366*911 or 1366*1024 and provide ample vertical space. Still, 13*7 TN panels need to go away. Now.
  • yankeeDDL - Wednesday, June 3, 2015 - link

    Like I said, to each his own. I have a Lenovo Z50 which I paid less than $470 with the A10 7300.
    Quite frankly, I could not be happier and I think it provides a massive value for that money.
    Sure, a larger battery and a better screen would not hurt, but for hustling it around the house, or bringing it to a friend's or family's house, watching movies, playing games at native resolution, it is fantastic.
    It's no road warrior, for sure (heavy and the battery life doesn't go much beyond 3hrs of "serious" use) but playing at 1366*768 on something that weighs 5 pounds and costs noticeably less than $500 is quite amazing. Impossible on an Intel+discrete graphics, as far as I know.
  • FlushedBubblyJock - Tuesday, June 9, 2015 - link

    Nope, HD3000 plays just fine
  • Margalus - Wednesday, June 3, 2015 - link

    I'd rather have a cheaper 15.6" 1366x768 TN panel over a more expensive smaller ips panel.
  • UtilityMax - Wednesday, June 3, 2015 - link

    1366x768 is fine for movies and games. But it's a bad resolution for reading text or viewing images on the web, since you see pixels the size of moon craters.
  • BrokenCrayons - Thursday, June 4, 2015 - link

    I understand there's going to be a variety of differing opinions on the idea of seeing individual pixels. As far as I'm concerned, seeing individual pixels isn't a dreadful or horrific thing. In fact, to me it simply doesn't matter. I'm busy living my meat world life and enjoying whatever moments I have with family and friends so I don't give the ability to discern an individual pixel so much as a second thought. It is an insignificant part of my life, but what isn't is the associated decline in battery life (on relative terms) required to drive additional, utterly unnecessary pixels and to push out sufficient light as a result of the larger multitude of them. That sort of thing is marginally annoying -- then again, I still just don't care that much one way or another aside from noticing that a lot of people are very much infatuated with an insignificant, nonsense problem.
