HTPC Credentials - YouTube and Netflix Streaming

Our HTPC testing with respect to YouTube had previously been restricted to the playback of a 1080p music video using the native HTML5 player in Firefox. The move to 4K and the need to evaluate HDR support have led us to choose Mystery Box's Peru 8K HDR 60FPS video as our test sample going forward. On PCs running Windows, it is recommended that HDR streaming videos be viewed using the Microsoft Edge browser after putting the desktop into HDR mode.

The 'Stats for Nerds' debug OSD in the top left shows that the stream being played back is a VP9 Profile 2 bitstream.
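For readers who want to verify the served codec outside the browser, the available formats can also be inspected programmatically. The sketch below assumes the third-party yt-dlp Python package is installed and uses a placeholder URL; VP9 Profile 2 streams advertise a codec string beginning with 'vp09.02'.

```python
# Minimal sketch: list a YouTube video's formats and flag VP9 Profile 2 streams.
# Assumes the third-party yt-dlp package is installed; the URL is a placeholder.
from yt_dlp import YoutubeDL

VIDEO_URL = "https://www.youtube.com/watch?v=..."  # placeholder for the test clip

with YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(VIDEO_URL, download=False)

for fmt in info.get("formats", []):
    vcodec = fmt.get("vcodec") or ""
    # VP9 Profile 2 formats report codec strings of the form 'vp09.02.xx.10...'
    if vcodec.startswith("vp09.02"):
        print(fmt.get("format_id"), fmt.get("height"), vcodec)
```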

Various metrics of interest, such as GPU usage and at-wall power consumption, were recorded for the first three minutes of playback of the above video. The numbers are graphed below.

We find that the playback consumes about 40% of the resources of one of the two available decoders. Since the stream is progressive, video processing usage is minimal. In the steady state, the GPU consumes around 4W, while the system consumes around 30W on average.
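As a rough illustration of how such steady-state figures can be derived from a metrics log, the sketch below averages samples after an assumed warm-up window. The CSV layout, column names, and file name are hypothetical and not the actual logging setup used here.

```python
# Sketch: compute steady-state averages from a hypothetical playback metrics log.
# Assumed CSV columns: elapsed_s, gpu_power_w, wall_power_w (file name is a placeholder).
import csv

WARMUP_S = 60.0  # assumed warm-up period to skip before declaring steady state

gpu_samples, wall_samples = [], []
with open("playback_metrics.csv", newline="") as f:
    for row in csv.DictReader(f):
        if float(row["elapsed_s"]) < WARMUP_S:
            continue
        gpu_samples.append(float(row["gpu_power_w"]))
        wall_samples.append(float(row["wall_power_w"]))

print(f"Steady-state GPU power : {sum(gpu_samples) / len(gpu_samples):.1f} W")
print(f"Steady-state wall power: {sum(wall_samples) / len(wall_samples):.1f} W")
```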

The Netflix 4K HDR capability works with the native Windows Store app as well as the Microsoft Edge browser. We used the Windows Store app to evaluate the playback of Season 4 Episode 4 of the Netflix Test Patterns title. The OS screenshot facilities obviously can't capture the video being played back. However, the debug OSD (reachable via Ctrl-Alt-Shift-D) can be recorded.

The (hevc,hdr,prk) entry corresponding to the Video Track in the debug OSD, along with the A/V bitrate details (192 kbps / 16 Mbps) indicate that the HDR stream is indeed being played back. Similar to the YouTube streaming case, metrics such as GPU usage and at-wall power consumption were recorded for the first five minutes of the playback of the title. The numbers are graphed below.

The HEVC Main10 stream consumes around 50% of one of the two decoders, and the at-wall power consumption in the steady state is around 23W.
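To put the reported A/V bitrates (192 kbps audio, 16 Mbps video) in perspective, the back-of-the-envelope calculation below estimates hourly data consumption; it is a simple arithmetic sketch, not a measurement from the actual session.

```python
# Back-of-the-envelope data usage for the reported Netflix HDR stream bitrates.
audio_bps = 192e3  # 192 kbps audio track
video_bps = 16e6   # 16 Mbps HEVC Main10 HDR video track

total_bytes_per_hour = (audio_bps + video_bps) / 8 * 3600
print(f"Approx. data use: {total_bytes_per_hour / 1e9:.1f} GB per hour of playback")
# -> roughly 7.3 GB/hour at the observed bitrates
```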

81 Comments

  • DimeCadmium - Thursday, April 4, 2019 - link

    You do realize the skull doesn't have to be visible?
  • PeachNCream - Thursday, April 4, 2019 - link

    It's not just the morbid case cover that bothers me. The fact is that the brand name in general is something that discourages my interest in an otherwise solid computing device. I don't need death or bones or corpse-like branding on my computer parts. That kind of thing has a way of crawling into your head and sticking around in there. It may seem trivial, but to someone that has had to see and deal with real world violence, it just isn't something I want associated with something I use for work and play at home.
  • GreenReaper - Thursday, April 4, 2019 - link

    What I want to know is this: where are all these canyons? Time was, codenames were based on actual locations, but nowadays I'm not sure. There's nothing on Google Maps...
  • mikato - Thursday, April 4, 2019 - link

    Me too. And if Bean Canyon isn't a real place, then I can't understand how such a ridiculous name would be used for a CPU.
  • MrCommunistGen - Wednesday, April 3, 2019 - link

    I'm not at all disagreeing with your point -- Intel has made pretty substantial gains in efficiency -- but we should all just remember that the CPUs in both systems are probably blowing WAY past their TDP (non-turbo) ratings to achieve the performance we're seeing in these benchmarks.
  • MrCommunistGen - Wednesday, April 3, 2019 - link

    I kept not finding the Power Consumption figures in the article. Under a full CPU + GPU load it looks like Bean Canyon is pulling ~72W at the wall and Skull Canyon is pulling ~77W at the wall.

    Still impressive since Bean Canyon tends to be a bit faster and has a smaller GPU configuration.
  • IntelUser2000 - Wednesday, April 3, 2019 - link

    Skull Canyon just sucks. It should be performing 30-50% faster than this one. No wonder nothing outside of a single Intel NUC used it. The previous two Iris Pros sucked too. Each generation made it worse.
  • FATCamaro - Wednesday, April 3, 2019 - link

    These make a Mac mini look like a deal.
  • cacnoff - Wednesday, April 3, 2019 - link

    Ganesh,

    "Perhaps an additional Thunderbolt 3 controller directly attached to the CPU's PCIe lanes could make the platform look even more attractive."

This is a 14nm U-series part; there are no CPU PCIe lanes on it. Maybe complain about the U-series parts not having PCIe on the CPU package rather than about the NUC not having a feature that is impossible to support.
  • jordanclock - Wednesday, April 3, 2019 - link

You sure about that? Ark pretty clearly lists the 8559U as having 16 PCIe lanes.
