HTPC Credentials - Local Media Playback and Video Processing

Evaluation of local media playback and video processing is done by playing back files encompassing a range of relevant codecs, containers, resolutions, and frame rates. Efficiency is also noted by tracking GPU usage and the power consumption of the system at the wall. Users have their own preferences for playback software, decoder, and renderer, and our aim is to present numbers representative of commonly encountered scenarios. Towards this, we played back the test streams using the following combinations:

  • MPC-HC x64 1.8.5 + LAV Video Decoder (DXVA2 Native) + Enhanced Video Renderer - Custom Presenter (EVR-CP)
  • MPC-HC x64 1.8.5 + LAV Video Decoder (D3D11) + madVR 0.92.17 (DXVA-Focused)
  • MPC-HC x64 1.8.5 + LAV Video Decoder (D3D11) + madVR 0.92.17 (Lanczos-Focused)
  • VLC 3.0.8
  • Kodi 18.5

The thirteen test streams (each of 90s duration) were played back from the local disk with a 30-second interval between them. Various metrics, including GPU power consumption and at-wall power consumption, were recorded during the course of this playback. Prior to looking at the metrics, a quick summary of the decoding capabilities of the Intel UHD Graphics is useful for context.
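
For readers interested in replicating the sequencing, the playback loop is straightforward to script. The following is a minimal sketch, assuming MPC-HC's standard command-line switches and hypothetical file paths; it is not the exact harness used for our testing.

    # Minimal sketch of the playback sequencing described above: each test clip
    # plays for its 90-second duration, followed by a 30-second idle gap so that
    # power readings can settle between streams. The player path, clip list, and
    # command-line switches are assumptions for illustration only.
    import subprocess
    import time

    MPC_HC = r"C:\Program Files\MPC-HC\mpc-hc64.exe"   # assumed install path
    CLIPS = [
        r"D:\streams\h264-1080p60.mkv",                # hypothetical test files
        r"D:\streams\hevc-4kp60-hdr.mkv",
        r"D:\streams\vp9p2-4kp60.webm",
    ]
    CLIP_DURATION_S = 90
    IDLE_GAP_S = 30

    for clip in CLIPS:
        # /play and /fullscreen start playback immediately in fullscreen
        player = subprocess.Popen([MPC_HC, clip, "/play", "/fullscreen"])
        time.sleep(CLIP_DURATION_S + 5)   # small margin for player start-up
        player.terminate()                # close the player once the clip is done
        player.wait()
        time.sleep(IDLE_GAP_S)            # idle interval between streams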

The Intel UHD Graphics GPU is no different from the GPUs in the Bean Canyon and Baby Canyon NUCs as far as video decoding capabilities are concerned. We have hardware acceleration for all common codecs including VP9 Profile 2.

All our playback tests were done with the desktop HDR setting turned on. It is possible for certain system configurations to have madVR automatically turn the HDR capabilities on and off prior to the playback of an HDR video, but we didn't take advantage of that in our testing.

VLC and Kodi

VLC is the playback software of choice for the average PC user who doesn't need a ten-foot UI. Its install-and-play simplicity has made it extremely popular. Over the years, the software has gained the ability to take advantage of various hardware acceleration options. Kodi, on the other hand, has a ten-foot UI, making it the perfect open-source software for dedicated HTPCs. Support for add-ons makes it very extensible and capable of customization. We played back our test files using the default VLC and Kodi configurations, and recorded the following metrics.

Video Playback Efficiency - VLC and Kodi

VLC doesn't seem to take advantage of VP9 Profile 2 hardware acceleration, while Kodi is able to play back all streams without any hiccups.
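
For what it's worth, VLC 3.x lets the decoder backend be requested explicitly on the command line, which is a quick way to check whether a stream is falling back to software decoding. The sketch below is a hedged example under that assumption, with a hypothetical clip path; it is not part of our standard test procedure.

    # Hedged sketch: launch VLC 3.x with an explicitly requested hardware decoder
    # (D3D11 here) and let it exit after playback. If GPU video-engine utilization
    # stays near zero during a VP9 Profile 2 clip, decoding has fallen back to
    # software. The VLC path and clip path are assumptions for illustration.
    import subprocess

    VLC = r"C:\Program Files\VideoLAN\VLC\vlc.exe"     # assumed install path
    CLIP = r"D:\streams\vp9p2-4kp60-hdr.webm"          # hypothetical test file

    subprocess.run([
        VLC, CLIP,
        "--avcodec-hw=d3d11va",   # request D3D11 hardware decoding
        "--fullscreen",
        "--play-and-exit",        # quit once the clip finishes
    ])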

MPC-HC

MPC-HC offers an easy way to test out different combinations of decoders and renderers. The first configuration we evaluated is the default post-install scenario, with only the in-built LAV Video Decoder forced to DXVA2 Native mode. Two additional passes were done with different madVR configurations. In the first one (DXVA-focused), we configured madVR to make use of the DXVA-accelerated video processing capabilities as much as possible. In the second (Lanczos-focused), the image scaling algorithms were set to 'Lanczos 3-tap, with anti-ringing checked'. Chroma upscaling was configured to be 'BiCubic 75 with anti-ringing checked' in both cases. The metrics collected during the playback of the test files using the above three configurations are presented below.

Video Playback Efficiency - MPC-HC with EVR-CP and madVR

LAV Filters with EVR-CP is able to play back all streams without dropped frames, but madVR is a different story. Almost all streams at 1080p and above show significant spikes in power consumption, pointing to the decode and display chain struggling to keep up with the required presentation frame rate. Given that the GPU is weaker than the one in Bean Canyon, this is not a surprise. Overall, the Frost Canyon NUC is acceptable as a vanilla decode-and-playback device without extensive video post-processing.
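
As an aside, the per-stream numbers in the charts above reduce to simple aggregation of the logged samples. A minimal sketch follows, assuming a hypothetical CSV log of power readings sampled at 1 Hz; this is not the actual logging format used for the review.

    # Minimal sketch of how per-stream efficiency numbers can be summarized from
    # a log of power samples. The CSV layout (stream, at_wall_w, gpu_w) is a
    # hypothetical format assumed for illustration.
    import csv
    from collections import defaultdict

    samples = defaultdict(list)
    with open("playback_power_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            samples[row["stream"]].append(
                (float(row["at_wall_w"]), float(row["gpu_w"]))
            )

    for stream, readings in samples.items():
        wall = [w for w, _ in readings]
        gpu = [g for _, g in readings]
        # Average and peak at-wall power, plus energy assuming 1 Hz sampling.
        print(f"{stream}: avg {sum(wall) / len(wall):.1f} W at the wall, "
              f"peak {max(wall):.1f} W, GPU avg {sum(gpu) / len(gpu):.2f} W, "
              f"energy {sum(wall) / 3600:.2f} Wh")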

Comments

  • HStewart - Tuesday, March 3, 2020 - link

    I will say that the evolution of Windows has hurt the PC market - with more memory and such, Microsoft adds a lot of fat into the OS. As a point-of-sale developer through all these OSes, I wish Microsoft had a way to reduce the stuff one does not need.

    Just for information, the original Doom was written totally differently from today's games - back in the old days, Michael Abrash (a leader in early game graphics) worked with John Carmack of id Software on Doom and Quake. Back then we did not have GPU-driven graphics, and the code was done in assembly language.

    Over time, development got fat as higher-level languages plus GPUs and drivers came into the picture. This also occurred in the OS area, where in 1992 I had to change companies because assembly language developers started becoming a dying breed.

    I think part of this is that Microsoft started adding so many features to the OS, and there is a lot of bulk to drive the Windows interface, which was much simpler in older versions.

    If I were with Microsoft, I would have an option in Windows for a super-trim version of the OS, reducing overhead as much as possible. Maybe dual-boot into it.

  • HStewart - Tuesday, March 3, 2020 - link

    I have some of Abrash's original books - quite a collector's item nowadays

    https://www.amazon.com/Zen-Graphics-Programming-2n...
  • HStewart - Tuesday, March 3, 2020 - link

    And even more so with the Graphics Programming Black Book - almost $1000 now

    https://www.amazon.com/Michael-Abrashs-Graphics-Pr...
  • Qasar - Tuesday, March 3, 2020 - link

    You do know there are programs out there that can remove some of the useless bloat that Windows auto-installs, right? Maybe not to the extent that you are referring to, but it is possible. On a fresh reinstall of Win 10, I usually remove almost 500 megs of apps that I won't use.
  • erple2 - Saturday, March 14, 2020 - link

    This is an age-old argument that ultimately falls flat in the face of history. "Bloated" software today is VASTLY more capable than the "efficient" code written decades ago. You could make the argument that we might not need all of the capabilities of software today, but I rather like having the incredibly stable OSes of today compared to what I had to deal with in the past. And yes, OSes today are much more stable than they were in 1992 (not to mention vastly more capable).
  • Lord of the Bored - Thursday, March 5, 2020 - link

    My recollection is that was Windows Vista, not XP. XP was hitting 2D acceleration hardware that had stopped improving much around the time Intel shipped their first graphics adapter.
    Vista, however, had a newfangled "3D" compositor that took advantage of all the hardware progress that had happened since 1995... and a butt-ugly fallback plan for systems that couldn't use it(read as: Intel graphics).
    And then two releases later, Windows 8 dialed things way back because those damnable Intel graphics chips were STILL a significant install base and they didn't want to keep maintaining multiple desktop renderers.
    ...
    Unless the Vista compositor was originally intended for XP, in which case I eat my hat.
  • TheinsanegamerN - Monday, March 2, 2020 - link

    You don't need a 6-core CPU for back office systems or report machines either, so they wouldn't buy this at all.

    Dell, HP, etc. make small systems with better CPU power for a lower price than this. The appeal of the NUCs was good CPUs with Iris-level GPUs instead of the UHD that everyone else used.
  • PeachNCream - Monday, March 2, 2020 - link

    The intention of the NUC was to provide a fairly basic computing device in a small and power-efficient package. Iris models were something of an aberration in more recent models. In fact, the first couple of NUC generations used some of Intel's slowest processors available at the time.
  • niva - Tuesday, March 3, 2020 - link

    The point is that if you're making a basic computing device, why even go beyond 4 cores? I kind of want a NUC as a basic browsing computer that takes up little space. I can see these being used in the office too. There are many use cases for a device like this with 6 or more cores in the office, especially for folks in engineering fields running Matlab or doing development/compiling. However, in almost all of those use cases, having a stronger graphics package helps, never mind gaming. Taking a step back on the GPU side, especially given what AMD is doing right now and this being in response to the competition, doesn't make much sense. Perhaps this is just to hold them over until Intel fully transitions to using AMD GPUs in the future?
  • Lord of the Bored - Thursday, March 5, 2020 - link

    Can I just say how much I love that four cores is now considered a "basic" computing device? It leaves me suffused with a warm glow of joy.
