Power Consumption and Thermal Performance

The at-wall power consumption of the NUC8i7HVK was measured while driving a 4K display (LG 43UD79B) through the rear HDMI port. In the graphs below, we compare the idle and load power of the system against those of other high-performance SFF PCs that we have evaluated before. For the load power numbers, we ran our own custom stress test (Prime95 and FurMark) as well as the AIDA64 System Stability Test with various stress components, and noted the maximum sustained power consumption at the wall.

Idle Power Consumption

Load Power Consumption (AIDA64 SST)

The power efficiency is pleasing: not only does the NUC record the lowest idle power of the group, it also lands in the middle of the pack under load, closer to the systems with 65W-TDP desktop processors.
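
As a note on methodology: one straightforward way to reduce a power meter log to a single "maximum sustained" figure is a rolling average, as in the minimal Python sketch below. The file name and the "watts" column are hypothetical stand-ins for whatever the meter actually exports.

```python
# Minimal sketch: maximum *sustained* at-wall draw from a meter log.
# Assumes a hypothetical CSV ("wall_power.csv") with one reading per
# second in a "watts" column; a 60-second rolling average filters out
# momentary spikes that do not represent sustained load.
import csv

def max_sustained_watts(path: str, window: int = 60) -> float:
    with open(path, newline="") as f:
        readings = [float(row["watts"]) for row in csv.DictReader(f)]
    if len(readings) <= window:           # log shorter than the window
        return sum(readings) / len(readings)
    # Sliding-window sum keeps the scan O(n).
    cur = best = sum(readings[:window])
    for i in range(window, len(readings)):
        cur += readings[i] - readings[i - window]
        best = max(best, cur)
    return best / window

print(f"Max sustained draw: {max_sustained_watts('wall_power.csv'):.1f} W")
```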

Our thermal stress routine starts with the system at idle, followed by four stages of different system loading profiles using the AIDA64 System Stability Test (each lasting 30 minutes). In the first stage, we stress the CPU, caches, and RAM. In the second stage, we add the GPU to that list. In the third stage, we stress the GPU alone. In the final stage, we stress all the system components (including the disks). Beyond this, we leave the unit idle in order to determine how quickly the various temperatures in the system return to the normal idling range. The various clocks, temperatures, and power consumption numbers for the system during this routine are presented in the graphs below.
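
For readers who want to reproduce the routine, the schedule is easy to encode. The sketch below is illustrative only: the set_stress() hook is a hypothetical placeholder (the AIDA64 System Stability Test components are actually selected through its GUI), and the idle-segment durations are our assumption, but the stage ordering and 30-minute load durations match the routine described above.

```python
# Illustrative encoding of the thermal stress schedule described above.
# set_stress() is a hypothetical placeholder; the idle durations (15 min)
# are assumed, while the four load stages run 30 minutes each as in the
# actual routine.
import time

STAGES = [  # (label, stressed components, minutes)
    ("idle (baseline)",           [],                                     15),
    ("CPU + caches + RAM",        ["cpu", "cache", "ram"],                30),
    ("CPU + caches + RAM + GPU",  ["cpu", "cache", "ram", "gpu"],         30),
    ("GPU only",                  ["gpu"],                                30),
    ("full system (incl. disks)", ["cpu", "cache", "ram", "gpu", "disk"], 30),
    ("idle (cool-down)",          [],                                     15),
]

def set_stress(components):
    """Placeholder hook for starting/stopping the load generators."""
    print(f"[{time.strftime('%H:%M:%S')}] active load: {components or 'none'}")

for label, components, minutes in STAGES:
    set_stress(components)
    print(f"Stage '{label}' running for {minutes} minutes")
    time.sleep(minutes * 60)
```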

The cores manage to consistently stay above the rated clock (3.1 GHz) under all loading conditions. Given the higher power level (65W) that the CPU is configured for, we find that the clocks stay close to 3.9 GHz until the CPU die starts to approach the 100°C junction temperature. The thermal solution easily keeps the die below Tjmax while operating the cores at the rated clock.
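
A quick way to sanity-check a claim like "the cores stay above the rated clock" is to post-process the sensor log. The sketch below assumes a hypothetical CSV export; the file name and the "core_mhz" / "die_temp_c" columns are stand-ins for a real export from a tool such as HWiNFO or AIDA64.

```python
# Sketch: confirm the cores held the rated clock, and see how close the
# die came to Tjmax. Log file and column names are hypothetical.
import csv

RATED_MHZ = 3100   # Core i7-8809G rated (base) clock
TJMAX_C = 100      # junction temperature limit

with open("sensor_log.csv", newline="") as f:
    samples = [(float(r["core_mhz"]), float(r["die_temp_c"]))
               for r in csv.DictReader(f)]

below_rated = sum(1 for mhz, _ in samples if mhz < RATED_MHZ)
peak_temp = max(temp for _, temp in samples)
print(f"Samples below rated clock: {below_rated} of {len(samples)}")
print(f"Peak die temperature: {peak_temp:.1f} C (Tjmax = {TJMAX_C} C)")
```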

Measuring the power consumption presents some challenges due to the dynamic power sharing technology that Intel employs to distribute the package TDP between the CPU and the GPU. Currently, hardware monitoring programs are able to tap into the CPU die power (reported as the CPU package power, though it appears to be just the CPU die power), the IA cores power (logically close to the CPU die power, unless the iGPU is active), the DRAM power (which refers to the SODIMMs, not the HBM memory), and the Radeon GPU's chip power. In almost all of our previous system reviews, the at-wall power consumption has been close to the sum of the CPU package power and the discrete GPU power (after accounting for the power consumption of the DRAM, physical disks, etc.). However, in the case of the NUC8i7HVK, the at-wall power consumption is substantially higher. In the AIDA64 stress tests, we see that the CPU die power tracks the sum of the iGPU and IA cores power - around 65W, as expected. The dGPU power is only around 35W, yet the maximum at-wall power consumption is as high as 175W. We are still looking into the reasons for these anomalous readings, but it is likely that the current hardware monitoring programs are missing some key power consumption aspects of the KBL-G package.
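
To put a number on the discrepancy: subtracting the sensor-visible rails from an efficiency-corrected wall reading gives the power that the monitoring programs cannot see. The sketch below uses the AIDA64 figures quoted above; the ~88% adapter efficiency is an assumed typical value, not a measurement.

```python
# Sketch: estimate the power the monitoring tools cannot account for.
# The 175 W wall figure and the per-rail numbers are from the AIDA64
# stress test above; the PSU efficiency (~88%) is an assumption.
def unaccounted_watts(wall_w, rails_w, psu_efficiency=0.88):
    """DC power delivered to the board that no monitored rail explains."""
    delivered = wall_w * psu_efficiency   # net of adapter conversion loss
    return delivered - sum(rails_w.values())

aida64_rails = {"cpu_die": 65.0, "radeon_dgpu": 35.0}
missing = unaccounted_watts(175.0, aida64_rails)
print(f"Unaccounted power: {missing:.0f} W")  # ~54 W: HBM2, VRMs, platform...
```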

We made similar observations with our legacy stress test using the latest versions of Prime95 and FurMark. Prime95 immediately pushes the core clocks to the rated speed (3.1 GHz), with infrequent spikes to 3.9 GHz, and this allows the cooling solution to keep the CPU die at around 80°C. Adding FurMark to the mix, however, stresses the cooling solution to the point that it can no longer prevent the die from approaching the 100°C junction temperature. At that point, we see more aggressive scaling back of the cores' frequency to the rated speed.
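
The throttling behavior can also be picked out of a log programmatically: the sketch below flags samples where the die is within a couple of degrees of Tjmax while the cores have fallen back to the rated clock. As before, the CSV layout is a hypothetical sensor export.

```python
# Sketch: flag thermally driven clock-backs in the Prime95 + FurMark run.
# A sample counts as a throttle event when the die is within `margin_c`
# of Tjmax while the cores sit at or below the rated 3.1 GHz. The CSV
# columns ("timestamp", "die_temp_c", "core_mhz") are hypothetical.
import csv

def throttle_events(path, tjmax_c=100.0, rated_mhz=3100.0, margin_c=2.0):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            temp, mhz = float(row["die_temp_c"]), float(row["core_mhz"])
            if temp >= tjmax_c - margin_c and mhz <= rated_mhz:
                yield row["timestamp"], temp, mhz

for ts, temp, mhz in throttle_events("p95_furmark_log.csv"):
    print(f"{ts}: die {temp:.0f} C, cores at {mhz:.0f} MHz")
```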

The combination of Prime95 and FurMark pushes the at-wall power consumption as high as 230W. However, the component readings from the monitoring programs still show only 65W for the CPU die and around 60W for the Radeon GPU - leaving more than 100W unaccounted for, only part of which can be explained by power supply conversion losses.

Comments

  • cacnoff - Thursday, March 29, 2018

    I see that it can play back Netflix 4K HDR?

    Does this make Intel the first Radeon GPU implementation to handle PlayReady 3.0?
  • patrickjp93 - Friday, March 30, 2018

    Actually that's handled by the iGPU on Kaby Lake. Vega is not PlayReady 3.0-capable.
  • ganeshts - Friday, March 30, 2018

    On traditional KBL systems, you are right that the iGPU handles PlayReady 3.0 video decoding.

    On the Hades Canyon, it appears that the Vega GPU is handling it. I have updated the '4K HTPC Credentials' section with the appropriate text after capturing the screenshot below:

    https://images.anandtech.com/doci/12572/Netflix-GP...
  • gigahertz20 - Thursday, March 29, 2018

    I've built two Intel NUCs for family members in the past couple of years and they love them. Fast, quiet and so far reliable. They don't game at all, which is why I convinced them to buy them. I'm not sure if this NUC is going to be popular at all though at $1,000 barebones. Who is going to buy it? The gaming performance of this NUC is nothing special; gamers and enthusiasts are going to stick with desktops, and a lot of people are just waiting for the cryptocurrency craze to die down so we can get video cards at decent prices again. If that takes another year or two, so be it.

    Your average person that just needs an office computer won't buy this at $1k; you can get a much cheaper NUC and throw in an SSD, and that will work fine. Why pay a premium for a cute little powerful box? If you want small and portable, you can get a laptop for cheaper. If they had priced this at $600 barebones, it would have been much more appealing to your average user that might want to play the occasional game at 1080P.
  • Crazyeyeskillah - Thursday, March 29, 2018

    NUCs have always been geared as thin clients for businesses. It's a niche market that pretty much just wants reliability and 'good enough' performance. I don't see many people loading up on the $1700 version like we see here, but Intel will get good sales from the lowest tier when ordered by the hundreds for large companies.
  • Sailor23M - Friday, March 30, 2018

    I bought the Skull Canyon version last year at a good discount on Newegg. I am very happy with it, and Intel's support (at least for the Skull Canyon) has been great, with a dedicated website and easy-to-find updates, firmware and drivers. I have it mounted behind my monitor and use it as my main PC. I'm sure that although the retail price on these is $999, you will be able to find it for much less in a few months' time.
  • The_Assimilator - Thursday, March 29, 2018

    For the love of god Ganesh, please, PLEASE give us proper teardowns of the units you review. That means taking the damn things apart and showing us what all the bits look like, NOT just removing the lid that allows you to access the user-upgradable bits.
  • Crazyeyeskillah - Thursday, March 29, 2018

    Why do you need a tear down of this product?
  • The_Assimilator - Monday, April 2, 2018

    I don't "need" it, but a review should attempt to be as thorough as possible, and for hardware that means showing as much of the system as possible.
  • cfenton - Thursday, March 29, 2018

    Usually review units are on loan from the manufacturer. They aren't typically too keen on reviewers tearing them apart before returning them.
