Power Consumption and Thermal Performance

The power consumption of the NUC8i7HVK at the wall was measured with a 4K display (LG 43UD79B) being driven through the HDMI port in the rear. In the graphs below, we compare the idle and load power of the system with other high-performance SFF PCs that we have evaluated before. For load power consumption, we ran our own custom stress test (Prime95 and FurMark) as well as the AIDA64 System Stability Test with various stress components, and noted the maximum sustained power consumption at the wall.

Idle Power Consumption

Load Power Consumption (AIDA64 SST)

The power efficiency is pleasing: not only does the NUC have the lowest idle power of the lot, but it also sits in the middle of the pack under load, closer to the systems with 65W TDP desktop processors.

Our thermal stress routine starts with the system at idle, followed by four stages of different system loading profiles using the AIDA64 System Stability Test (each 30 minutes in duration). In the first stage, we stress the CPU, caches and RAM. In the second stage, we add the GPU to the above list. In the third stage, we stress the GPU standalone. In the final stage, we stress all the system components (including the disks). Beyond this, we leave the unit idle in order to determine how quickly the various temperatures in the system can come back to the normal idling range. The various clocks, temperatures and power consumption numbers for the system during the above routine are presented in the graphs below.

The cores manage to consistently stay above the rated clock (3.1 GHz) under all loading conditions. Given the higher power level (65W) that the CPU is configured for, we find that it stays close to 3.9 GHz until the CPU die starts to approach the 100C junction temperature. The thermal solution easily keeps the die below Tjmax while operating the cores at the rated clock.

Measuring power consumption presents some challenges due to the dynamic power sharing technology that Intel employs to share the package TDP between the CPU and the GPU. Currently, hardware monitoring programs can tap into the CPU die power (often reported as CPU package power, though it appears to cover only the CPU die), the IA cores power (logically close to the CPU die power, unless the iGPU is active), the DRAM power consumption (which refers to the SODIMMs, not the HBM memory), and the Radeon GPU's chip power consumption. In almost all our previous system reviews, the at-wall power consumption has been close to the sum of the CPU package power and the discrete GPU power (after accounting for the power consumption of the DRAM, physical disks, etc.). In the case of the NUC8i7HVK, however, the at-wall power consumption is substantially higher. In the AIDA64 stress tests, the CPU die power tracks the sum of the iGPU and IA cores power - around 65W, as expected. The dGPU power is only around 35W, yet the maximum at-wall power consumption is as high as 175W. We are still looking into the reasons for these anomalous readings, but it is likely that current hardware monitoring programs miss some key power consumption aspects of the KBL-G package.
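For readers curious where these component numbers come from: on the CPU side, monitoring tools typically derive them from Intel's RAPL energy counters, which count microjoules and wrap around at a platform-specific maximum. As a minimal sketch (not the code any particular monitoring program uses), average power over a sampling interval is just the counter delta, with the wraparound handled:

```python
# Sketch: derive average watts from two RAPL-style energy counter samples.
# RAPL counters report cumulative microjoules and wrap around at a
# platform-specific maximum (max_energy_range_uj on Linux powercap),
# so a negative delta means the counter wrapped between samples.

def rapl_avg_watts(e0_uj: int, e1_uj: int, seconds: float,
                   max_range_uj: int) -> float:
    """Average power (W) between two cumulative energy readings (uJ)."""
    delta = e1_uj - e0_uj
    if delta < 0:                  # counter wrapped around
        delta += max_range_uj
    return delta / seconds / 1e6   # microjoules -> joules -> watts

# Example: a 65W package over 2s consumes 130 J = 130,000,000 uJ.
print(rapl_avg_watts(10_000_000, 140_000_000, 2.0, 262_143_328_850))  # 65.0
```

This also illustrates why per-component readings can miss power: anything outside the instrumented domains (VRM losses, the HBM, board components) never shows up in such counters.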

We repeated the observations with our legacy stress test using the latest versions of Prime95 and FurMark. Prime95 immediately pushes the core clocks to the rated speed (3.1 GHz), with infrequent spikes to 3.9 GHz, which allows the cooling solution to keep the CPU die at around 80C. Adding FurMark to the mix, however, stresses the solution to the point where it can no longer prevent the die from approaching the 100C junction temperature. At that point, we see more aggressive scaling back of the core frequency to the rated speed.

The combination of Prime95 and FurMark pushes the at-wall power consumption as high as 230W. However, the component power readings from the monitoring programs still show only 65W for the CPU die and around 60W for the Radeon GPU.
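A back-of-the-envelope check makes the size of the gap concrete. Assuming (purely for illustration; the actual figure is not measured here) a 90% conversion efficiency for the external power adapter, the monitored components still account for well under the wall figure:

```python
# Hypothetical accounting of the Prime95 + FurMark reading.
# The 230W, 65W and 60W figures are the measured/monitored values
# from the review; the 90% adapter efficiency is an assumption.
wall_w = 230.0          # measured at the wall
cpu_die_w = 65.0        # reported CPU die power
radeon_w = 60.0         # reported Radeon GPU power
psu_efficiency = 0.90   # assumed adapter efficiency (illustrative)

dc_power = wall_w * psu_efficiency              # power delivered past the adapter
unaccounted = dc_power - (cpu_die_w + radeon_w) # not visible to monitoring tools
print(round(unaccounted, 1))  # 82.0
```

Even after generously allowing for adapter losses, roughly 80W of draw is invisible to the monitoring software under these assumptions, which is consistent with the suspicion that the tools are missing parts of the KBL-G package.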


Comments

  • Hifihedgehog - Saturday, March 31, 2018 - link

    “The mass market still uses Kodi and VLC.”

    I respectfully disagree. Many home theater users I know use Kodi in combination with MPC-HC or MPC-BE, due to MadVR's highly superior scaling abilities. Check the Kodi forums. This is a very popular configuration:

    forum (dot) kodi (dot) tv/showthread.php?tid=209596

    Check out this thread. Many reference it. Perhaps you should as well going forward:

    forum (dot) doom9 (dot) org/showthread.php?t=171787

    I have tried VLC 3.0 and CPU usage and image quality are still inferior to MPC-HC and MPC-BE. For these reasons, it is still not worth recommending.
  • Hifihedgehog - Saturday, March 31, 2018 - link

    PS:

    hardforum (dot) com/threads/vlc-3-0-released-with-hdr-chromecast-support.1954247/#post-1043479175
  • Trixanity - Saturday, March 31, 2018 - link

    Try the latest 3.0.2 nightly. It should work there unless Hades has special drivers.
  • mode_13h - Friday, March 30, 2018 - link

    Rabid angry people like you are funny, do you really think anyone is going to read or care about your comment? Go away LOL
  • cfenton - Thursday, March 29, 2018 - link

    The claim was about codec support. Just by looking at the DXVA charts it's pretty clear the Intel IGP has better codec support. Hardware decode is pretty important for most people looking for a box to sit near their TV.

    Of course, you may be right that the Ryzen 5 is a better solution overall if you're willing to sacrifice UHD Blu-ray and some hardware decode ability.
  • eva02langley - Friday, March 30, 2018 - link

    Who the hell is using Blu-Ray anymore?
  • mooninite - Friday, March 30, 2018 - link

    People use Blu-Ray when they want to view the best possible video AND audio quality on something other than their laptop with a 1280x768 17" screen that's on shared wi-fi.
  • PeachNCream - Friday, March 30, 2018 - link

    Do you mean 1366x768? 1280x800 used to be pretty popular when screens went to 16:10.
  • cfenton - Friday, March 30, 2018 - link

    Anyone who cares about image and audio quality, which is precisely the market for an HTPC.
  • bill44 - Thursday, March 29, 2018 - link

    As far as I know, no Intel NUC with HDMI 2.x can deal with frame packed 3D ISO.

    As the Hades Canyon uses AMD GPU's HDMI 2.x output, it may be able to. Can someone test this?
