4K HTPC Credentials

The noise profile of the NUC8i7HVK is surprisingly good. At idle and under low loads, the fans are barely audible, and they only kick in during stressful gaming benchmarks. From an HTPC perspective, we had to put up with fan noise during the decode and playback of codecs without hardware acceleration - 4Kp60 VP9 Profile 2 videos, for instance. Obviously, the unit is not for the discerning HTPC enthusiast, who is better off with a passively cooled system.

Refresh Rate Accuracy

Starting with Haswell, Intel GPUs have been on par with AMD and NVIDIA with respect to display refresh rate accuracy. The most important refresh rate for videophiles is obviously 23.976 Hz (the 23 Hz setting). As expected, the Intel NUC8i7HVK (Hades Canyon) has no trouble refreshing the display appropriately at this setting.
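To put the importance of this accuracy in perspective, a small deviation from the 24000/1001 Hz target forces the renderer to repeat or drop a frame every so often. The short Python sketch below estimates that interval; the measured refresh rate values are made-up examples standing in for the figure reported in madVR's OSD.

```python
# Sketch: estimate how often a frame repeat/drop occurs when the display's
# measured refresh rate deviates slightly from the content frame rate.
# The 23.976 Hz target is really 24000/1001 Hz; the "measured" values below
# are illustrative examples, not measurements from this review.

from fractions import Fraction

def frame_slip_interval(measured_hz: float,
                        content_hz: Fraction = Fraction(24000, 1001)) -> float:
    """Return the number of seconds between dropped/repeated frames."""
    delta = abs(measured_hz - float(content_hz))  # drift accumulated per second, in frames
    if delta == 0:
        return float('inf')  # perfect match: no repeats or drops
    return 1.0 / delta       # one full frame of drift every 1/delta seconds

if __name__ == "__main__":
    for measured in (23.976, 23.977, 23.971):
        interval = frame_slip_interval(measured)
        if interval == float('inf'):
            print(f"{measured:.3f} Hz -> perfect match, no repeats or drops")
        else:
            print(f"{measured:.3f} Hz -> one frame repeat/drop every {interval / 60:.1f} minutes")
```

With these example values, a display refreshing at 23.971 Hz would need a repeated or dropped frame roughly every 3.3 minutes, which shows up as a visible stutter in slow panning shots.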

The gallery below presents some of the other refresh rates that we tested out. The first statistic in madVR's OSD indicates the display refresh rate.

Network Streaming Efficiency

OTT playback efficiency was evaluated by playing back the Mystery Box's Peru 8K HDR 60fps video on YouTube using Microsoft Edge, and Season 4 Episode 4 of the Netflix Test Pattern title using the Windows Store app, after setting the desktop to HDR mode and enabling the streaming of HDR video.

The YouTube streaming test, unfortunately, played back a 1080p AVC version. Microsoft Edge utilizes the Radeon GPU, which doesn't have acceleration for VP9 Profile 2 decode. Instead of falling back to software decoding, Edge apparently requests the next available hardware-accelerated codec, which happens to be AVC. The graph below plots the discrete GPU load, discrete GPU chip power, and the at-wall power consumption over the course of the YouTube video playback.

Since the stream was a 1080p version, playback starts off immediately at the highest available bitrate. The GPU power consumption is stable around 8W, with the at-wall power consumption around 30W. The Radeon GPU does not expose separate video engine and GPU loads to monitoring software; only a generic GPU load is available. Its behavior is unlike what we get from Intel's integrated GPUs or NVIDIA GPUs - instead of a smooth load profile, it shows frequent spikes close to 100% followed by drops back to idle, as evident from the red lines in the graphs in this section. Therefore, we can only take the GPU chip power consumption as an indicator of the loading.
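For reference, the way such a run is condensed into the steady-state numbers quoted here is simple averaging after discarding the start-up phase. The Python sketch below illustrates the idea; the CSV layout and column names (gpu_power_w, at_wall_w) are assumptions for illustration and would need to match whatever the monitoring tools and power meter actually export.

```python
# Sketch: summarize a power log captured during streaming playback.
# Column names and the one-sample-per-second layout are assumptions; adapt
# them to the actual export format of the monitoring tool / power meter.

import csv
from statistics import mean, median

def summarize_power_log(path: str, skip_seconds: int = 120, sample_period_s: float = 1.0):
    """Average GPU-chip and at-wall power after the initial loading/buffering phase."""
    gpu_w, wall_w = [], []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i * sample_period_s < skip_seconds:   # ignore the start-up spike
                continue
            gpu_w.append(float(row["gpu_power_w"]))
            wall_w.append(float(row["at_wall_w"]))
    return {
        "gpu_mean_w": mean(gpu_w),
        "gpu_median_w": median(gpu_w),   # median is robust to the spiky GPU load
        "wall_mean_w": mean(wall_w),
    }

# Example (hypothetical file name): summarize_power_log("youtube_8k_hdr_run.csv")
```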

A similar graph for the Netflix streaming case (16 Mbps HEVC 10b HDR video) is also presented below. Manual stream selection is available (Ctrl-Alt-Shift-S), and debug information / statistics can also be viewed (Ctrl-Alt-Shift-D). The same statistics gathered for the YouTube streaming experiment were also collected here.

It must be noted that the debug OSD is kept on until the stream reaches the 16 Mbps playback stage, around 2 minutes after the start of streaming. The GPU chip power consumption ranges from 20W for the low-resolution video (which requires scaling to 4K) to around 12W for the eventually fetched 16 Mbps 4K stream. The at-wall numbers range from 60W (after the initial loading spike) to around 40W in the steady state.
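As a quick back-of-the-envelope check on what the steady-state stream implies, the snippet below converts the 16 Mbps video bitrate into data consumed per hour (video only, ignoring audio tracks and protocol overhead).

```python
# Data consumed by the 16 Mbps steady-state Netflix HEVC stream
# (video only; audio and protocol overhead are not included).
bitrate_bps = 16_000_000
bytes_per_hour = bitrate_bps / 8 * 3600
print(f"{bytes_per_hour / 1e9:.1f} GB per hour of 4K HDR playback")  # ~7.2 GB/hour
```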

Update: I received a request to check whether the Netflix application was utilizing the Intel HD Graphics 630 for PlayReady 3.0 functionality. The screenshot below confirms that the 4K HEVC HDR playback with Netflix makes use of the Radeon RX Vega M GH GPU only.

Decoding and Rendering Benchmarks

In order to evaluate local file playback, we concentrate on Kodi (for the casual user) and madVR (for the HTPC enthusiast). Under madVR, we decided to test out only the default out-of-the-box configuration. We recently revamped our decode and rendering test suite, as described in our 2017 HTPC components guide.

madVR 0.92.12 was evaluated with MPC-HC 1.7.15 (unofficial release) and its integrated LAV Filters 0.71. The video decoder was set to D3D11 mode, with automatic selection of the GPU for decoding operations. For hardware-accelerated codecs, the at-wall power consumption is around 35-40W, and the GPU chip power consumption is around 10W. For the software-decode case (VP9 Profile 2), the at-wall power consumption is around 90W, and the GPU chip power consumption is around 18W (the additional GPU power is likely spent on madVR's processing of the software-decoded video frames).
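The gap between the hardware-accelerated and software-decode cases is easier to appreciate in energy terms. The snippet below is a rough calculation using approximate at-wall numbers from the runs above (~38W vs. ~90W) for a two-hour movie.

```python
# Rough energy comparison for a two-hour movie, using approximate at-wall
# power numbers from the madVR / MPC-HC / LAV Filters runs described above.
HOURS = 2
for label, watts in (("hardware-accelerated decode", 38),
                     ("software VP9 Profile 2 decode", 90)):
    print(f"{label}: about {watts * HOURS} Wh at the wall over {HOURS} hours")
# ~76 Wh vs. ~180 Wh - more than double the energy, along with the extra fan noise.
```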

One of the praiseworthy aspects of the madVR / MPC-HC / LAV Filters combination tested above was the automatic switch to HDR mode and back while playing the last couple of videos in our test suite. All in all, this combination of playback components processed all of our test streams smoothly.

The same testing was repeated with the default configuration options of Kodi 17.6. The at-wall power consumption is substantially lower (around 30W) for the hardware-accelerated codecs, with the GPU chip power consistently around 8W. For the VP9 Profile 2 case, the at-wall number rises to 70W, but there is not much change in the GPU chip power. We did encounter a hiccup in the 1080i60 VC-1 case, with playback freezing for around 5-10s - evident in the graph below (the files were being played off the local SSD).

We attempted to perform some testing with VLC 3.0.1, but encountered random freezes and blank screen outputs while using the default configuration to play back the same videos. It is possible that the VLC 3.0.1 hardware decode infrastructure is not as robust as that of the MPC-HC / LAV Filters 0.71 combination, and that the hardware acceleration APIs behave slightly differently with the Radeon GPU compared to the behavior seen with Intel's integrated GPUs and NVIDIA's GPUs.

Moving on to codec support: while the Intel HD Graphics 630 is a known quantity with respect to the scope of supported hardware-accelerated codecs, the Radeon RX Vega M GH is not. DXVA Checker serves as a confirmation for the former and a source of information for the latter.

We can see that the codec support on the Intel side is miles ahead of the Radeon's capabilities. It is therefore a pity that users can't set a global option to make all video decoding and related capability identification rely on the integrated GPU.

Intel originally claimed at the launch of the Hades Canyon NUCs that they would be able to play back UltraHD Blu-rays. The UHD BD Advisor tool from CyberLink, however, presented a different story.

After a bit of back and forth with Intel, it appears that the Hades Canyon NUCs will not be able to play back UHD Blu-rays. Apparently, the Protected Audio Video Path (PAVP) in the integrated GPU can be used only if the display is also being driven by the same GPU. This turned out to be quite disappointing, particularly after Intel's promotion of UHD Blu-ray playback and PAVP as unique differentiating features of the Kaby Lake GPU.

Comments

  • eva02langley - Friday, March 30, 2018 - link

    "So, tell me why I am wrong in saying that the Intel iGPU is miles ahead of the Radeon Vega ?"

    Because it can render games at 1080p...? This is seriously a question?

    This is actually incredible to see iGPU able to do that. And we forget at this time the PS4 and the Xbox One X capabilities.

    This is not a discrete GPU.
  • The_Assimilator - Monday, April 2, 2018 - link

    Way to take Ganesh's statement out of context to push in your own VEGA UBER ALLES viewpoint. He was very obviously talking about the video playback capabilities of Vega, which are objectively inferior to Intel's.
  • Hifihedgehog - Friday, March 30, 2018 - link

    As a neutral industry observer myself, I have had to build with discrete graphics in ITX cases (with both Intel and AMD CPUs) because of the timing and handshaking issues of Intel NUCs’ DP-to-HDMI converters. I have no major qualms with Intel as a CPU company; it is their graphics solutions that I am not fond of, and that I am well familiar with as being compromised.
  • ganeshts - Friday, March 30, 2018 - link

    Not denying that the NUC's HDMI ports have some compatibility issues, but, to their credit, they have been very responsive and tried to figure out fixes (I spent almost 6 months last year trying to get their KBL NUC to work with the 4K TV in my testbed).

    Every vendor has some problem or the other. In my experience, NVIDIA has one of the best generic solutions for multimedia systems, but Intel wins out in niche use-cases (YouTube HDR, for example). The less said about AMD, the better - their drivers for multimedia functions turned from good to bad to worse, and I don't think I have done any HTPC testing on an AMD GPU-based system in the last couple of years - they basically haven't released anything competitive in that segment, to be honest. Hopefully, that changes with the Ryzen APUs, but I can't say for sure unless one undergoes a thorough evaluation.

    Multiple readers email me with requests for guidance on what to buy from an HTPC perspective. In most cases, I point them towards some NUC-based solution. Feedback after purchase has never been negative.
  • Hifihedgehog - Friday, March 30, 2018 - link

    “Less said about AMD, the better - their drivers for multimedia functions turned from good to bad to worse,”

    Please qualify this with an example. I and others at the SmallFormFactor forums are using the Raven Ridge APUs, and I have had no issues with Kodi, MPC-BE and MadVR for 12-bit UHD home theater duty. Saying current-generation AMD graphics drivers are bad and worse is just as inaccurate as saying Intel HD Graphics are good for nothing except Solitaire—both signify naïveté with either product.

    “In most cases, I point them towards some NUC-based solution. Feedback after purchase has never been negative.”

    I kindly point you to this thread, 674 replies and counting, responses comprised mostly of complaints. There have been droves of disgruntled NUC users this last generation. Intel NUCs have been awful, and many have abandoned them for alternative small form factor products.

    communities (dot) intel (dot) com/message/490689#490689
  • ganeshts - Friday, March 30, 2018 - link

    Example, right now with the Vega GPU in Hades Canyon:

    Use VLC 3.0.1 with default preferences on the latest stable release of Windows 10 and attempt to play back an interlaced MPEG2 clip - the video output is blank and only the audio plays. The same scenario in systems using the KBL iGPU or NVIDIA GPUs is absolutely fine.

    Now, if the VLC developers have to do something special to make code that works for both Intel iGPU and NVIDIA GPU, I have to unfortunately say it is AMD's driver that is at fault for having undefined behavior in their video decode acceleration or rendering API.

    If you play only one type of codec and it works great for that, it doesn't mean the drivers are flawless.

    AMD drivers were good when their PR team was trying to promote the HQV benchmark for the HTPC market. They started turning bad around the AMD 7000 series where their DXVA APIs used to result in BSODs when people attempted to use them. And, after that, I got disillusioned with AMD's GPU for HTPC duties and stopped recommending them. Ryzen might be different - I haven't tested it yet. But, based on my experience in Hades Canyon, I am not very bullish.

    NUC-based, from my perspective, is any UCFF PC based on the -U series. In the KBL-U generation, my first recommendation has always been the ASRock Beebox-S 7200U, followed by the NUC7i7BNH: both of them have received very good feedback from people I recommended them to. Btw, the incompatibility issue that I had with the NUC7i7BNH and the TCL 55P607 in HDR mode was actually fixed after a silent firmware update on the TV side. The blame is not on one supplier (holding no torch for Intel here, I am just saying that no one manufacturer can be blamed all the time).
  • Hifihedgehog - Saturday, March 31, 2018 - link

    VLC is well-known to be an overly processor-intensive program (or CPU hog; see here: pcworld (dot) com/article/3023430/hardware/tested-vlc-vs-windows-10-video-player-the-winner-may-surprise-you.html ), and due to this, in more recent years many videophiles have moved along to MPC-HC and MPC-BE. I do not understand why many computer geeks still insist on it. I have used the MPC twin programs for over five years now and have had no issues with codec support in either; both rely on LAV Filters. Last I used VLC, it used more than double the CPU, it had worse image scaling than the forks of MPC, and their file support was just as good if not superior. Honestly, VLC was a great solution a decade ago, but times have changed and I now highly recommend and always use the MPC products. I cannot see any reason to insist on VLC at this point, especially with the problems you mention, which I never encountered in the MPC forked projects.
  • Hifihedgehog - Saturday, March 31, 2018 - link

    PS:

    techhive (dot) com/article/2892383/which-is-the-better-free-video-player-mpc-hc-176-vs-vlc-22.html

    reddit (dot) com/r/pcmasterrace/comments/43do0n/is_anyone_still_using_vlc_if_thats_the_case/
  • Hifihedgehog - Saturday, March 31, 2018 - link

    videohelp (dot) com/softwareimages/madvr_1196.jpg
  • ganeshts - Saturday, March 31, 2018 - link

    All those references to VLC are pre-3.0 release. With 3.0, VLC had a major overhaul. That is the reason why I never touched VLC in my earlier systems reviews, but started doing so with the ones from this month.

    https://www.videolan.org/vlc/releases/3.0.0.html

    The new release is very power efficient - as good as a lean MPC-HC + LAV Filters configuration. I believe they have done an excellent job, and will be using VLC moving forward (in addition to Kodi and MPC-HC / madVR).

    Like it or not, it is the geeks and the nerds who use MPC-HC. The mass market still uses Kodi and VLC (despite the latter's inefficiencies pre-3.0).
