HTPC Credentials

The Beebox-S series, unlike the Braswell Beebox, does not have any fanless members. However, the noise profile is attractive enough for the unit to be used as an HTPC. Operation of the Core i5-7200U at its default TDP ensures that the fan doesn't need to spin as fast as in some of the other UCFF PCs we have seen (which configure the TDP up). However, given the specifications of the Intel HD Graphics 620, it is clear that the Beebox-S 7200U is better suited for the casual HTPC user than for someone who wants all the bells and whistles, such as customized renderers (madVR etc.). Based on this use-case, we evaluated refresh rate accuracy, over-the-top (OTT) streaming, and Kodi 17.0 for local media playback.

Refresh Rate Accuracy

Starting with Haswell, Intel GPUs have been on par with AMD and NVIDIA with respect to display refresh rate accuracy. The most important refresh rate for videophiles is obviously 23.976 Hz (the 23 Hz setting). As expected, the ASRock Beebox-S 7200U has no trouble refreshing the display appropriately at this setting.

The gallery below presents some of the other refresh rates that we tested out. The first statistic in madVR's OSD indicates the display refresh rate.
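To put those accuracy numbers in perspective, here is a back-of-the-envelope sketch (our own, not from the review's methodology) of why even small deviations matter: any mismatch between the display refresh rate and the content frame rate forces a frame to be repeated or dropped periodically, and the interval between those judder events is simply the reciprocal of the mismatch.

```python
# Back-of-the-envelope: how often a frame repeat/drop occurs when the
# display refresh rate deviates slightly from the content frame rate.
# The "23 Hz" setting targets 24000/1001 Hz (~23.976 Hz).

CONTENT_FPS = 24000 / 1001  # NTSC film cadence, ~23.976 fps

def glitch_interval_seconds(display_hz: float) -> float:
    """Approximate seconds between visible judder events (a repeated
    or dropped frame) for a given measured display refresh rate."""
    delta = abs(display_hz - CONTENT_FPS)
    return float("inf") if delta == 0 else 1.0 / delta

# A hypothetical display refreshing at 23.972 Hz instead of 23.976 Hz
# produces a judder event roughly every four minutes.
print(f"{glitch_interval_seconds(23.972):.0f} s")
```

This is why an OSD readout of 23.971 Hz vs. 23.976 Hz is worth noticing even though the numbers look nearly identical.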

Network Streaming Efficiency

Evaluation of OTT playback efficiency was done by playing back our standard YouTube test stream and five minutes of our standard Netflix test title. Over HTML5, the YouTube stream plays back as a 720p encoding. Since YouTube now defaults to HTML5 for video playback, we have stopped evaluating Adobe Flash acceleration. Note that only NVIDIA exposes GPU and VPU loads separately; both Intel and AMD bundle the decoder load with the GPU load. The following graphs show the power consumption at the wall for playback of the HTML5 stream in Mozilla Firefox (v50.1.0).

YouTube Streaming - HTML5: Power Consumption

GPU load was around 9.7% for the YouTube HTML5 stream and 0.01% for the steady state 6 Mbps Netflix streaming case.

Netflix streaming evaluation was done using the Windows 10 Netflix app. Manual stream selection is available (Ctrl-Alt-Shift-S) and debug information / statistics can also be viewed (Ctrl-Alt-Shift-D). Statistics collected for the YouTube streaming experiment were also collected here.

Netflix Streaming - Windows 10 Metro App: Power Consumption

The Beebox-S 7200U is not particularly power efficient because our configuration uses an NVMe SSD. A SATA SSD cuts down the power numbers by around 2.5 to 3 W in each of the scenarios.

One of the most interesting features of Kaby Lake PCs with a HDMI 2.0 / HDCP 2.2 port is the ability to play 4K Netflix streams. This is enabled by some content protection features introduced in Kaby Lake. In our initial trials, we were unable to get Netflix 4K to play back successfully despite being able to drive 4K @ 60 Hz to a HDCP 2.2-capable television. After some back and forth with ASRock, and getting hold of an updated LSPCon firmware and BIOS (v1.60), we were able to get Netflix 4K streams to work.

On the Beebox-S 7200U, our Netflix 4K test title was delivered as a 16 Mbps HEVC encode.
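As a quick sanity check (our own arithmetic, not a figure from the review), a 16 Mbps stream implies a substantial data footprint over the course of a feature-length title:

```python
# Data consumed by a constant 16 Mbps stream over one hour of playback.
bitrate_mbps = 16
gb_per_hour = bitrate_mbps * 3600 / 8 / 1000  # Mb/s -> GB (decimal units)
print(gb_per_hour)  # 7.2
```

At roughly 7 GB per hour, viewers on metered connections should keep an eye on their data caps when streaming Netflix 4K.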

Decoding and Rendering Benchmarks

In order to evaluate local file playback, we concentrate on EVR-CP and Kodi 17.0. We already know that EVR works quite well even with the Intel IGP for our test streams. The decoder used was LAV Filters bundled with MPC-HC v1.7.10.276. We have now added HEVC streams to our test suite.

In our earlier reviews, we focused on presenting the GPU loading and power consumption at the wall in a table (with problematic streams in bold). Starting with the Broadwell NUC review, we decided to represent the GPU load and power consumption in a graph with dual Y-axes. Eleven different test streams of 90 seconds each were played back with a gap of 30 seconds between each of them. The characteristics of each stream are annotated at the bottom of the graph. Note that the GPU usage is graphed in red and needs to be considered against the left axis, while the at-wall power consumption is graphed in green and needs to be considered against the right axis.
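The dual-Y-axis presentation described above can be sketched in a few lines of matplotlib. The data points below are placeholders chosen for illustration, not the review's measurements:

```python
# Sketch of the dual-Y-axis plot style: GPU load in red against the
# left axis, at-wall power in green against the right axis.
import matplotlib
matplotlib.use("Agg")            # render off-screen, no display needed
import matplotlib.pyplot as plt

t = list(range(0, 120, 10))      # seconds into a hypothetical stream
gpu_load = [20, 35, 40, 38, 42, 41, 39, 40, 37, 36, 38, 35]  # percent
power_w = [18, 22, 24, 23, 25, 24, 23, 24, 23, 22, 23, 22]   # watts

fig, ax_gpu = plt.subplots()
ax_gpu.plot(t, gpu_load, color="red")
ax_gpu.set_xlabel("Time (s)")
ax_gpu.set_ylabel("GPU load (%)", color="red")

ax_pwr = ax_gpu.twinx()          # second Y axis sharing the same X axis
ax_pwr.plot(t, power_w, color="green")
ax_pwr.set_ylabel("At-wall power (W)", color="green")

fig.savefig("gpu_load_vs_power.png")
```

The `twinx()` call is what makes the two differently-scaled series readable on one chart without normalizing either of them.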

Frame drops are evident whenever the GPU load consistently stays above the 85 - 90% mark. Like the MSI Cubi2-005BUS (which also uses the Intel HD Graphics 620), the ASRock Beebox-S 7200U has absolutely no trouble in keeping up with the media playback use-cases.

Moving on to codec support, the Intel HD Graphics 620 is a known quantity with respect to the scope of hardware-accelerated codecs it supports (based on Intel's claims). DXVA Checker serves as confirmation.

Kaby Lake-U offers some of the most comprehensive codec support in the market, now that Intel has added full hardware decode for 8-bit and 10-bit HEVC. There is also GPU support for 10-bit VP9. It is a pity that the display engine still doesn't support HDMI 2.0 natively, but ASRock has ensured that this is not a problem for the Beebox-S 7200U.


  • fanofanand - Thursday, February 9, 2017 - link

    Your logic is sound, but most failures don't occur within the warranty period. Sometimes it seems they set the warranty period based on failure rates during QA testing. If they have a 50% failure rate on year 4 they just make sure the warranty is 3 years (oversimplified and inaccurate figures but that's the gist of it). Auto manufacturers have been doing this for decades, there was a reason most powertrain warranties went to 50,000 miles but massive head gasket failures or transmission failures would occur at 60k (I experienced both with GM products which is why they won't be getting any more of my hard earned money).
  • OzzyLogic - Monday, February 20, 2017 - link

    It's like you took the words out of my mouth. I really dislike fanned systems because of the dust build-up that you will eventually get. No matter what you do, dust will always find its way into your system somehow. Also one of the reasons why I bought myself one of these. https://www.logicsupply.com/eu-en/ml100g-50/
  • thesloth - Tuesday, February 7, 2017 - link

    I probably just need to RTFA properly, but I don't see any graphics or mention of noise (dB). For a HTPC I would have thought that relevant.
  • Sene - Tuesday, February 7, 2017 - link

    Why don't you test the GPU with MadVR. Even if it has limited power it would be interesting to know the best settings it can support
  • David_K - Tuesday, February 7, 2017 - link

    From my testing with a 7700K and its built-in HD 630, madVR is just too heavy for the GPU; on 4K videos it becomes a stutterfest.
  • Samus - Wednesday, February 8, 2017 - link

    Same experience on my i5-7600k with HD630...only reason I bothered trying was because my 1070 took a week longer to arrive than the rest of the parts. I wasn't optimistic going in but figured I'd see what Intel GPU's can do since I haven't really toyed with one since my 4th gen Haswell laptop.
  • TheinsanegamerN - Wednesday, February 8, 2017 - link

    I'm still incredibly disappointed in Intel's lack of improvements in the iGPU race. The HD 630 is only, on average, 2 FPS faster in games than the HD 4600 from Haswell.

    Three years and a whole 2 FPS, mostly thanks to a better memory controller. Why can't Intel start offering more chips with Iris graphics?

    Raven Ridge can't get here soon enough.
  • BrokenCrayons - Wednesday, February 8, 2017 - link

    I haven't really looked into iGPU performance improvements at all since Ivy Bridge's HD 4000. Is that really all we've gained in the past few years out of non-Iris Intel graphics? They've got to be hitting some kind of shared system memory bottleneck that makes it a difficult prospect to wring more out of their iGPUs. Though that doesn't explain the A-series GPUs being fairly quick despite lacking any sort of additional memory bandwidth.
  • nathanddrews - Wednesday, February 8, 2017 - link

    There's not much to look into unless you play mostly older games. People are creative and I've seen playable frame rates on non-Iris IGP newer games, but it usually involves 720p resolution and minimal settings or INI hacks to disable engine features. Even the most powerful Intel IGP (Iris Pro) chokes on games like Doom (2016) and Tomb Raider (latest). Context is everything.
    https://youtu.be/LV8Msa-Pxl8
  • BrokenCrayons - Thursday, February 9, 2017 - link

    Thanks for the response. I'd gotten a vague sense that Intel wasn't really leaping ahead with iGPU performance by the fact that the company's announcements stressed additional features as opposed to "x-times more performance" or "y-percent faster than last gen graphics" but I didn't realize things have gotten so stagnant recently. The fact that Iris exists sort of glosses over and distracts from the much more common eDRAM-less iGPU performance.

    *rant disclaimer* Iris has really done a lot of damage to the GPU market in general. By raising the bar of iGPU performance to the point where lower-end discrete cards are rivaled by Iris parts, Intel has effectively eliminated the low-end discrete GPU segment altogether. At the same time, Iris is an uncommon thing, so while the performance exists, it's not available for purchase, and there aren't GPUs available to fill the gap between the iGPUs you can actually buy and the bottom end of the current discrete GPU product stack. Thanks for that crap, Intel. Thanks a lot for sticking us with the choice of a 75W TDP discrete card or an anemic iGPU that hasn't gotten faster in years.
