HTPC Credentials

The higher TDP of the processor in Skull Canyon, combined with the new chassis design, makes the unit a bit noisier than the traditional NUCs. It would be tempting to assume that the extra EUs in the Iris Pro Graphics 580, combined with the eDRAM, would let GPU-intensive renderers such as madVR operate more effectively. That is true to some extent (though madVR now has a DXVA2 option for certain scaling operations), but the GPU still lacks full hardware HEVC 10b decoding, and the HEVC decoding drivers on Windows 10 are not yet stable. In any case, it is still worthwhile to evaluate the basic HTPC capabilities of the Skull Canyon NUC6i7KYK.

Refresh Rate Accuracy

Starting with Haswell, Intel has been on par with AMD and NVIDIA with respect to display refresh rate accuracy. The most important refresh rate for videophiles is obviously 23.976 Hz (the 23 Hz setting). As expected, the Intel NUC6i7KYK (Skull Canyon) has no trouble refreshing the display appropriately with this setting.

The gallery below presents some of the other refresh rates that we tested out. The first statistic in madVR's OSD indicates the display refresh rate.
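As a rough illustration of why this accuracy matters: any residual mismatch between the display refresh rate and the 23.976 fps source cadence forces the renderer to repeat or drop a frame periodically. The arithmetic can be sketched as follows (this is an illustrative calculation, not part of our test methodology):

```python
# Time between repeated/dropped frames caused by a display refresh rate
# that does not exactly match the source cadence. With a perfect
# 24000/1001 Hz ("23.976") display clock the interval is infinite.

SOURCE_FPS = 24000 / 1001  # film cadence, ~23.976 fps

def seconds_per_frame_slip(display_hz: float, source_fps: float = SOURCE_FPS) -> float:
    """Seconds between frame repeats/drops due to the rate mismatch."""
    delta = abs(display_hz - source_fps)
    if delta == 0:
        return float("inf")
    return 1.0 / delta

# A display running at 23.975 Hz instead of ~23.976 Hz slips one frame
# roughly every 16 minutes -- visible as a stutter during panning shots.
```

This is why madVR's OSD reports the measured display refresh rate to several decimal places: even a millihertz-level error accumulates into a visible frame repeat or drop within minutes.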

Network Streaming Efficiency

Evaluation of OTT playback efficiency was done by playing back our standard YouTube test stream and five minutes from our standard Netflix test title. Using HTML5, the YouTube stream plays back a 1080p H.264 encoding. Since YouTube now defaults to HTML5 for video playback, we have stopped evaluating Adobe Flash acceleration. Note that only NVIDIA exposes GPU and VPU loads separately. Both Intel and AMD bundle the decoder load along with the GPU load. The following two graphs show the power consumption at the wall for playback of the HTML5 stream in Mozilla Firefox (v 46.0.1).

YouTube Streaming - HTML5: Power Consumption

GPU load was around 13.71% for the YouTube HTML5 stream and 0.02% for the steady state 6 Mbps Netflix streaming case. The power consumption of the GPU block was reported to be 0.71W for the YouTube HTML5 stream and 0.13W for Netflix.

Netflix streaming evaluation was done using the Windows 10 Netflix app. Manual stream selection is available (Ctrl-Alt-Shift-S) and debug information / statistics can also be viewed (Ctrl-Alt-Shift-D). Statistics collected for the YouTube streaming experiment were also collected here.

Netflix Streaming - Windows 10 Metro App: Power Consumption

Decoding and Rendering Benchmarks

In order to evaluate local file playback, we concentrate on EVR-CP, madVR and Kodi. We already know that EVR works quite well even with the Intel IGP for our test streams. Under madVR, we used the DXVA2 scaling logic (as Intel's fixed-function scaling logic triggered via DXVA2 APIs is known to be quite effective). We used MPC-HC 1.7.10 x86 with LAV Filters 0.68.1 set as preferred in the options. In the second part, we used madVR 0.90.19.

In our earlier reviews, we focused on presenting the GPU loading and power consumption at the wall in a table (with problematic streams in bold). Starting with the Broadwell NUC review, we decided to represent the GPU load and power consumption in a graph with dual Y-axes. Nine different test streams of 90 seconds each were played back with a gap of 30 seconds between each of them. The characteristics of each stream are annotated at the bottom of the graph. Note that the GPU usage is graphed in red and needs to be considered against the left axis, while the at-wall power consumption is graphed in green and needs to be considered against the right axis.

Frame drops are evident whenever the GPU load consistently stays above the 85 - 90% mark. We did not hit that case with any of our test streams. Note that we have not officially moved to 4K for our HTPC evaluation. We did verify that HEVC 8b decoding works well (even 4Kp60 had no issues), but HEVC 10b hybrid decoding was a bit of a mess: some clips played back with heavy CPU usage, while others resulted in a black screen (the same clips played back without issues on a GTX 1080).
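The 85 - 90% guideline above is straightforward to apply to a GPU load log. A minimal sketch, assuming one load sample per second (the threshold and minimum duration here are illustrative choices, not our exact tooling):

```python
# Flag spans of a GPU-load log (one sample per second, values in percent)
# where the load stays continuously above a threshold long enough to
# suggest dropped frames.

def sustained_overload(samples, threshold=85.0, min_seconds=5):
    """Return (start, end) index pairs where load stays above threshold."""
    spans, start = [], None
    for i, load in enumerate(samples):
        if load > threshold:
            if start is None:
                start = i  # overload span begins
        else:
            if start is not None and i - start >= min_seconds:
                spans.append((start, i))
            start = None
    # Handle a span that runs to the end of the log.
    if start is not None and len(samples) - start >= min_seconds:
        spans.append((start, len(samples)))
    return spans

# Example: a 20-second log with a 7-second stretch above 85%
# yields a single flagged span covering seconds 5 through 11.
log = [40] * 5 + [92] * 7 + [60] * 8
```

Brief spikes above the threshold are ignored; only sustained overload, which is what actually correlates with dropped frames in madVR's statistics, gets flagged.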

Moving on to the codec support, the Intel Iris Pro Graphics 580 is a known quantity with respect to the scope of supported hardware accelerated codecs. DXVA Checker serves as a confirmation for the features available in driver version 15.40.23.4444.

It must be remembered that the HEVC_VLD_Main10 DXVA profile noted above utilizes hybrid decoding with both CPU and GPU resources getting taxed.

On a generic note, while playing back 4K videos on a 1080p display, I noted that madVR with DXVA2 scaling was more power-efficient compared to using the EVR-CP renderer that MPC-HC uses by default.


133 Comments


  • Zero Day Virus - Monday, May 23, 2016

    Yep, same here! Would like to see how it compares and if it's worth it :)
  • hubick - Monday, May 23, 2016

    It would also be interesting to see how the new BRIX like the GB-BSi7T-6500 stack up.
  • Barilla - Monday, May 23, 2016

    I think it's time to drop the 1280x1024 gaming benchmarks. Virtually no one is going to play at such a resolution, especially not with a $1000 PC, when a 22" 1080p monitor can be bought for a hundred bucks and change.
  • MrSpadge - Monday, May 23, 2016

    If your GPU is slow you HAVE to game at such resolutions, no matter what monitor you have.
  • TheinsanegamerN - Monday, May 23, 2016

    Then test at 720p. Nobody buys 5:4 monitors anymore.
  • MrSpadge - Tuesday, May 24, 2016

    The aspect ratio does not really matter for GPU testing, it's just the number of pixels the GPU has to compute. So performance at 720p will actually be a bit better.
  • cknobman - Monday, May 23, 2016

    It's rather lame that Anand would post up these low resolution benchmarks to try and make the iGPU not look like a total joke (which it is, at least at this price point).

    For $1000, if it can't muster a playable framerate at a resolution outside of a decade-old standard, then this thing is overpriced.
  • DanNeely - Monday, May 23, 2016

    Lots of casual gamers do play at low resolutions because they don't have the budget to stay on the high end GPU treadmill. The real issue is that the days of doing so at 1280x1024 instead of 1366x768 are long past. This was brought up the last time gaming benchmarks were updated here, but it is an even more glaring issue as time goes on.
  • DanNeely - Monday, May 23, 2016

    1680x1050 really should be replaced with 1600x900 too. 16:9 monitors have become ubiquitous; testing at narrower aspect ratios doesn't fit real world usage anymore.

    I could see a case for going wider at the upper end and slotting an ultrawide 3440x1440 test between conventional 2560x1440 and 3840x2160 gaming. Mostly because it looks like the 1080 still falls just short of being able to play at 4K without having to turn settings down in a lot of games, making 1440p ultrawide the effective max single-card resolution. (An increasingly important consideration with SLI/CrossFire becoming progressively less relevant due to temporal AA/post-processing techniques that play really badly with multi-GPU setups.)
  • Barilla - Monday, May 23, 2016

    Yeah, I guess my point was IF you want to test at low res, then test at a more relevant low res - 1280x720, 1366x768, 1600x900 etc. But my other point would be that those graphs look the way they do now because low resolution is paired with low settings, mid resolution with mid settings and so on. Many games these days don't really slow down that much at increased resolution, but rather with increased post-processing effects - shadows, antialiasing, DoF, you name it. Before I had my current gaming PC I used to game on a laptop with a GT555M inside, which is probably weaker than this IGP by some margin, and I ran most games in 1080p at acceptable framerates by turning the details down. In general it yielded better fps AND better looks than running non-native res and mid graphics settings.
    But maybe it's just me, I like pixels a lot ;)
