HTPC Credentials - YouTube and Netflix Streaming

The move to 4K and the need to evaluate HDR support led us to choose Mystery Box's Peru 8K HDR 60FPS video as our test sample for YouTube playback. On PCs running Windows, HDR streaming videos are best viewed using the Microsoft Edge browser after putting the desktop in HDR mode.

Like the last few NUC generations, the Frost Canyon NUC has no trouble with hardware-accelerated playback of the VP9 Profile 2 video. Thankfully, AV1 streams (which the GPU cannot yet decode in hardware) are not yet being delivered to the PC platform.
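
For readers who want to confirm what the browser is actually being served, a locally saved copy of the test clip can be inspected with ffprobe to verify the VP9 Profile 2 / 10-bit HDR characteristics. The sketch below is a minimal illustration only; the file name is a placeholder for whatever your downloader produced, and ffprobe must be available on the PATH.

    # Minimal sketch: confirm a saved copy of the YouTube test clip is
    # VP9 Profile 2 with 10-bit HDR metadata, using ffprobe.
    # "peru_8k_hdr.webm" is a placeholder file name (an assumption).
    import json
    import subprocess

    def probe_video(path: str) -> dict:
        """Return the first video stream's codec, profile, and pixel format."""
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=codec_name,profile,pix_fmt,color_transfer",
             "-of", "json", path],
            capture_output=True, text=True, check=True)
        return json.loads(out.stdout)["streams"][0]

    if __name__ == "__main__":
        info = probe_video("peru_8k_hdr.webm")
        # Expect codec_name == "vp9", profile == "Profile 2",
        # pix_fmt == "yuv420p10le", and color_transfer == "smpte2084" for HDR10.
        print(info)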

The 'new' GPU is not yet recognized by tools such as GPU-Z, and the media engine loading is not being correctly tracked by the various tools we usually use. Hence, only GPU power and at-wall power were recorded for the media playback tests. The numbers for YouTube video playback are graphed below.

YouTube playback results in the GPU consuming around 9W on average, with at-wall consumption coming in around 35W.
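
The review's figures come from its own sensor logging and an at-wall meter rather than from any script, but as an illustration of the general approach (sample an energy counter before and after a playback window, then divide by the elapsed time), here is a minimal Linux-only sketch that reads the Intel RAPL 'uncore' (graphics) domain. Domain naming and availability vary by CPU and driver, and at-wall power still requires external instrumentation.

    # Hedged sketch (Linux-only): estimate average iGPU power over a playback
    # run by sampling the Intel RAPL "uncore" energy counter. This illustrates
    # the polling approach, not the tooling used for the numbers in the review.
    import glob
    import time

    def find_uncore_counter() -> str:
        """Locate the RAPL sub-domain named 'uncore' (the iGPU on client SKUs)."""
        for name_file in glob.glob("/sys/class/powercap/intel-rapl:*:*/name"):
            with open(name_file) as f:
                if f.read().strip() == "uncore":
                    return name_file.rsplit("/", 1)[0] + "/energy_uj"
        raise RuntimeError("No 'uncore' RAPL domain found on this system")

    def average_power_watts(duration_s: float = 180.0) -> float:
        """Average power = energy delta (J) / elapsed time (s)."""
        counter = find_uncore_counter()
        start_energy = int(open(counter).read())
        start_time = time.time()
        time.sleep(duration_s)  # play the video during this window
        delta_uj = int(open(counter).read()) - start_energy  # ignores counter wrap
        return (delta_uj / 1e6) / (time.time() - start_time)

    if __name__ == "__main__":
        print(f"Average graphics-domain power: {average_power_watts(180):.2f} W")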

The Netflix 4K HDR capability works with the native Windows Store app as well as the Microsoft Edge browser. We used the Windows Store app to evaluate the playback of Season 4 Episode 4 of the Netflix Test Patterns title. The OS screenshot facilities obviously can't capture the video being played back. However, the debug OSD (reachable via Ctrl-Alt-Shift-D) can be recorded.

The (hevc,hdr,prk) entry corresponding to the Video Track in the debug OSD, along with the A/V bitrate details (192 kbps / 16 Mbps), indicates that the HDR stream is indeed being played back. As in the YouTube streaming case, a few metrics were recorded for the first three minutes of playback. The numbers are graphed below.
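
As a quick sanity check on the OSD's bitrate readout, 16 Mbps of video plus 192 kbps of audio works out to roughly 7.3 GB of data per hour of streaming; the arithmetic is sketched below (decimal gigabytes assumed).

    # Rough data usage implied by the debug OSD's reported bitrates.
    video_mbps = 16.0          # video bitrate from the OSD
    audio_mbps = 0.192         # 192 kbps audio
    seconds_per_hour = 3600

    # Megabits per hour -> megabytes -> (decimal) gigabytes
    gigabytes_per_hour = (video_mbps + audio_mbps) * seconds_per_hour / 8 / 1000
    print(f"{gigabytes_per_hour:.1f} GB per hour")   # ~7.3 GB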

For Netflix playback, the GPU and at-wall power consumption come in slightly below 10W and around 33W respectively, similar to the YouTube case.

Comments

  • HStewart - Tuesday, March 3, 2020 - link

    I will say that the evolution of Windows has hurt the PC market; with more memory and such, Microsoft adds a lot of fat into the OS. As a point-of-sale developer through all these OSes, I wish Microsoft had a way to reduce the stuff one does not need.

    Just for information, the original Doom was written totally differently from today's games. Back in the old days, Michael Abrash (a leader in early game graphics) worked with John Carmack of id Software on Doom and Quake. Back then we did not have GPU-driven graphics, and the code was done in assembly language.

    Over time, development got fat as higher-level languages plus GPUs and drivers came into the picture. This also occurred in the OS area; in 1992 I had to change companies because assembly language developers were becoming a dying breed.

    I think part of this is that Microsoft started adding so many features to the OS, and there is a lot of bulk to drive the Windows interface, which was much simpler in older versions.

    If I were with Microsoft, I would offer an option in Windows for a super-trim version of the OS, reducing overhead as much as possible. Maybe dual boot to it.

  • HStewart - Tuesday, March 3, 2020 - link

    I have some of Abrash's original books - quite a collector's item nowadays

    https://www.amazon.com/Zen-Graphics-Programming-2n...
  • HStewart - Tuesday, March 3, 2020 - link

    And even more so with the Graphics Programming Black Book - almost $1000 now

    https://www.amazon.com/Michael-Abrashs-Graphics-Pr...
  • Qasar - Tuesday, March 3, 2020 - link

    You do know there are programs out there that can remove some of the useless bloat that Windows auto-installs, right? Maybe not to the extent that you are referring to, but it is possible. On a fresh reinstall of Win 10, I usually remove almost 500 MB of apps that I won't use.
  • erple2 - Saturday, March 14, 2020 - link

    This is an age-old argument that ultimately falls flat in the face of history. "Bloated" software today is VASTLY more capable than the "efficient" code written decades ago. You could make the argument that we might not need all of the capabilities of software today, but I rather like having the incredibly stable OSes of today compared to what I had to deal with in the past. And yes, OSes today are much more stable than they were in 1992 (not to mention vastly more capable)
  • Lord of the Bored - Thursday, March 5, 2020 - link

    My recollection is that was Windows Vista, not XP. XP was hitting 2D acceleration hardware that had stopped improving much around the time Intel shipped their first graphics adapter.
    Vista, however, had a newfangled "3D" compositor that took advantage of all the hardware progress that had happened since 1995... and a butt-ugly fallback plan for systems that couldn't use it (read as: Intel graphics).
    And then two releases later, Windows 8 dialed things way back because those damnable Intel graphics chips were STILL a significant install base and they didn't want to keep maintaining multiple desktop renderers.
    ...
    Unless the Vista compositor was originally intended for XP, in which case I eat my hat.
  • TheinsanegamerN - Monday, March 2, 2020 - link

    You don't need a 6-core CPU for back-office systems or report machines either, so they wouldn't buy this at all.

    Dell, HP, etc. make small systems with better CPU power for a lower price than this. The appeal of the NUCs was good CPUs with Iris-level GPUs instead of the UHD graphics that everyone else used.
  • PeachNCream - Monday, March 2, 2020 - link

    The intention of the NUC was to provide a fairly basic computing device in a small and power-efficient package. Iris models were something of an aberration among the more recent generations. In fact, the first couple of NUC generations used some of Intel's slowest processors available at the time.
  • niva - Tuesday, March 3, 2020 - link

    The point is that if you're making a basic computing device, why even go beyond 4 cores? I kind of want a NUC as a basic browsing computer that takes up little space. I can see these being used in the office too. There are many use cases for a device like this with 6 or more cores in the office, especially for folks in engineering fields running Matlab or doing development/compiling. However, in almost all of these use cases, having a stronger graphics package helps, never mind gaming. Taking a step back on the GPU side, especially given what AMD is doing right now and this being in response to the competition, doesn't make much sense. Perhaps this is just to hold them over until Intel fully transitions to using AMD GPUs in the future?
  • Lord of the Bored - Thursday, March 5, 2020 - link

    Can I just say how much I love that four cores is now considered a "basic" computing device? It leaves me suffused with a warm glow of joy.
