HTPC Credentials - YouTube and Netflix Streaming

The move to 4K and the need to evaluate HDR support have led us to choose Mystery Box's Peru 8K HDR 60FPS video as our test sample for YouTube playback. On PCs running Windows, it is recommended that HDR streaming videos be viewed using the Microsoft Edge browser after putting the desktop into HDR mode.

Similar to the last few NUC generations, the Frost Canyon NUC has no trouble with hardware-accelerated playback of the VP9 Profile 2 video. Thankfully, AV1 streams (for which there is no hardware acceleration yet) are not yet being delivered to the PC platform.
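For readers who want to verify what codec and profile YouTube is actually serving, one option is to inspect a saved copy of the test clip with ffprobe. The sketch below is a minimal illustration, assuming ffprobe (part of ffmpeg) is on the PATH and using a hypothetical file name for the downloaded clip; VP9 Profile 2 pairs the vp9 codec with a 10-bit pixel format such as yuv420p10le.

```python
import json
import subprocess

# Hypothetical file name for a saved copy of the YouTube test clip;
# requires ffprobe (part of ffmpeg) on the PATH.
cmd = [
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "stream=codec_name,profile,pix_fmt",
    "-of", "json", "peru_8k_hdr_60fps.webm",
]
out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
stream = json.loads(out)["streams"][0]

# A VP9 Profile 2 stream reports the vp9 codec with a 10-bit pixel format.
print(stream.get("codec_name"), stream.get("profile"), stream.get("pix_fmt"))
```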

The 'new' GPU is not yet recognized by tools such as GPU-Z, and the media engine loading is not correctly tracked by the various tools we usually use. Hence, we have only GPU power and at-wall power being recorded for the media playback tests. The numbers for YouTube video playback are graphed below.

YouTube playback results in the GPU consuming around 9W on average, with the at-wall consumption being around 35W.
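For reference, averaging a polled power log comes down to a few lines of code. This is not our actual tooling, just a hedged sketch assuming a hypothetical CSV with one sample per second and 'gpu_w' / 'wall_w' columns:

```python
import csv
from statistics import mean

# Hypothetical log: one row per second with 'gpu_w' and 'wall_w' columns,
# captured over the duration of the playback run.
with open("youtube_playback_log.csv", newline="") as f:
    rows = list(csv.DictReader(f))

gpu_avg = mean(float(r["gpu_w"]) for r in rows)
wall_avg = mean(float(r["wall_w"]) for r in rows)
print(f"GPU: {gpu_avg:.1f} W | at-wall: {wall_avg:.1f} W")  # e.g. ~9 W / ~35 W
```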

The Netflix 4K HDR capability works with the native Windows Store app as well as the Microsoft Edge browser. We used the Windows Store app to evaluate the playback of Season 4 Episode 4 of the Netflix Test Patterns title. The OS screenshot facilities obviously can't capture the video being played back. However, the debug OSD (reachable via Ctrl-Alt-Shift-D) can be recorded.

The (hevc,hdr,prk) entry corresponding to the video track in the debug OSD, along with the A/V bitrate details (192 kbps / 16 Mbps), indicates that the HDR stream is indeed being played back. Similar to the YouTube streaming case, a few metrics were recorded for the first three minutes of the playback of the title. The numbers are graphed below.
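As a sanity check on the OSD-reported bitrates, the combined A/V stream works out to roughly 364 MB of data over the three-minute recording window; a quick back-of-the-envelope calculation:

```python
# OSD-reported bitrates: 192 kbps audio, 16 Mbps video.
audio_bps = 192_000
video_bps = 16_000_000
duration_s = 3 * 60  # first three minutes of playback

total_mb = (audio_bps + video_bps) * duration_s / 8 / 1e6
print(f"~{total_mb:.0f} MB transferred")  # ~364 MB
```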

Similar to the YouTube playback case, the GPU and at-wall power consumption are slightly below 10W and around 33W, respectively, for the Netflix playback case.

Comments

  • The_Assimilator - Monday, March 2, 2020 - link

    It's not, but the point is still valid: nobody buying these things is doing so because they expect them to be graphics powerhouses.
  • HStewart - Monday, March 2, 2020 - link

But some people are so naive and don't realize the point. I came up in the days when the card you purchased didn't even have a GPU on it. Not sure what level iGPUs are at, but they can surely run business graphics fine, and even games from a couple of years ago.
  • notb - Thursday, March 5, 2020 - link

    Horrible?
    These iGPUs can drive 3 screens with maybe 1-2W power draw. Show me another GPU that can do this.

    This is an integrated GPU made for efficient 2D graphics. There's very little potential to make it any better.
  • PaulHoule - Monday, March 2, 2020 - link

Well, Intel's horrible iGPUs forced Microsoft to walk back the graphical complexity of Windows XP. They kept the GPU-dependent architecture, but had to downgrade to "worse than cell phone" visual quality because Intel kneecapped the graphics performance of the x86 platform. (Maybe you could get something better, but developers can't expect you to have it)
  • HStewart - Monday, March 2, 2020 - link

I think we need actual proof for these biased statements. I think there is a big difference between running a screen at 27 or more inches and one at 6 to 8 inches, no matter what the resolution.
  • Korguz - Monday, March 2, 2020 - link

we need proof of your biased statements too, yet you very rarely provide any.. point is??
  • Samus - Monday, March 2, 2020 - link

What does screen size have to do with anything? Intel can't make an iGPU that can drive a 4K panel fluidly, while mainstream Qualcomm SoCs have GPU performance able to drive 4K panels using a watt of power.
  • HStewart - Tuesday, March 3, 2020 - link

Can Qualcomm actually drive, say, a 32 in 4K screen efficiently? Also, what is being measured here, videos or actual games? That depends on how they are written.
  • erple2 - Saturday, March 14, 2020 - link

I'm not sure that I understand your statement here, as it doesn't seem to make any sense. I was not aware that the physical dimensions of the screen mattered at all to the GPU, apart from how many pixels it has to individually manage/draw. If your implication is that the complexity and quantity of information that can be made significant on a 32" screen is different from a 5.7" screen, then I suppose you can make that argument. However, I have to make guesses as to what you meant in order to come to that conclusion.

Generally, the graphical load to display a 4K resolution is independent of whether the actual screen is 6" or 100". Unless I'm mistaken?
  • PeachNCream - Monday, March 2, 2020 - link

For once, I agree with HStewart (feels like I've been shot into the Twilight Zone to even type that). To the point though, Windows XP was released in 2001. Phones in that time period were still using black and white LCD displays. Intel's graphics processors in that time period were the Intel Extreme series built into the motherboard chipset (where they would remain until around 2010, after the release of Windows 7). Sure, those video processors are slow compared to modern cell phones, but nothing a phone could do when XP was in development was anything close to what a bottom-feeder graphics processor could handle. I mean crap, Doom ran (poorly) on a 386 with minimal video hardware and that was in the early 1990s, whereas phones eight years later still didn't have color screens.
