Power Consumption and Thermal Performance

The power consumption of the NUC8i7HVK at the wall was measured while driving a 4K display (LG 43UD79B) through the rear HDMI port. In the graphs below, we compare the idle and load power of the system with other high-performance SFF PCs that we have evaluated before. For load power consumption, we ran our own custom stress test (Prime95 and FurMark) as well as the AIDA64 System Stability Test with various stress components, and noted the maximum sustained power consumption at the wall.
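The "maximum sustained power" figure is best thought of as a windowed average over a series of at-wall readings rather than an instantaneous peak, so that brief spikes do not count. A minimal sketch of that idea in Python (the function and the 10-sample window are illustrative assumptions, not the review's actual tooling):

```python
def max_sustained(readings, window=10):
    """Return the highest moving average over `window` consecutive
    at-wall readings; short spikes are smoothed out."""
    if len(readings) < window:
        return max(readings)
    best = 0.0
    for i in range(len(readings) - window + 1):
        avg = sum(readings[i:i + window]) / window
        best = max(best, avg)
    return best

# A single 300W spike in an otherwise 100W trace barely moves the
# sustained figure, while a genuine plateau would dominate it.
trace = [100.0] * 20 + [300.0] + [100.0] * 20
print(f"max sustained: {max_sustained(trace):.1f} W")  # 120.0 W
```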

Idle Power Consumption

Load Power Consumption (AIDA64 SST)

The power efficiency is pleasing: not only does the NUC have the lowest idle power of the lot, it also comes in the middle of the pack under load (closer to the systems with 65W TDP desktop processors).

Our thermal stress routine starts with the system at idle, followed by four stages of different system loading profiles using the AIDA64 System Stability Test (each of 30 minutes duration). In the first stage, we stress the CPU, caches and RAM. In the second stage, we add the GPU to the above list. In the third stage, we stress the GPU standalone. In the final stage, we stress all the system components (including the disks). Beyond this, we leave the unit idle in order to determine how quickly the various temperatures in the system can come back to normal idling range. The various clocks, temperatures and power consumption numbers for the system during the above routine are presented in the graphs below.
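The loading profile above can be summarized as a simple timeline. The sketch below is only illustrative; the stage names and 30-minute durations come from the routine described in the text, while the code itself is an assumption:

```python
# The four AIDA64 SST stages used in the thermal stress routine,
# each 30 minutes long, bracketed by idle periods before and after.
STAGES = [
    ("CPU + caches + RAM", 30),
    ("CPU + caches + RAM + GPU", 30),
    ("GPU only", 30),
    ("All components (incl. disks)", 30),
]

def stage_schedule(stages):
    """Return (start_min, end_min, name) tuples for consecutive stages."""
    schedule, t = [], 0
    for name, minutes in stages:
        schedule.append((t, t + minutes, name))
        t += minutes
    return schedule

for start, end, name in stage_schedule(STAGES):
    print(f"{start:3d}-{end:3d} min: {name}")
```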

The cores manage to consistently stay above the rated clock (3.1 GHz) under all loading conditions. Given the higher power level (65W) that the CPU is configured for, we find that the clocks stay close to 3.9 GHz until the CPU die starts to approach the 100C junction temperature. The thermal solution easily keeps the die below Tjmax while operating the cores at the rated clock.
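The behavior can be captured in a toy model: hold the boost clock until the die nears the junction temperature, then fall back toward the rated clock. The 3.1 GHz, 3.9 GHz, and 100C figures are from the text; the 95C taper point and the linear taper are purely illustrative assumptions:

```python
RATED_GHZ = 3.1   # rated base clock
BOOST_GHZ = 3.9   # observed sustained clock under the 65W config
TJMAX_C   = 100   # junction temperature

def core_clock(die_temp_c, taper_start=95):
    """Toy model: boost clock below taper_start, linear fall-off to
    the rated clock as the die approaches Tjmax."""
    if die_temp_c < taper_start:
        return BOOST_GHZ
    if die_temp_c >= TJMAX_C:
        return RATED_GHZ
    frac = (TJMAX_C - die_temp_c) / (TJMAX_C - taper_start)
    return RATED_GHZ + frac * (BOOST_GHZ - RATED_GHZ)

for t in (80, 97, 100):
    print(f"{t}C -> {core_clock(t):.2f} GHz")
```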

Measuring power consumption presents some challenges due to the dynamic power sharing technology that Intel employs to share the package TDP between the CPU and the GPU. Currently, hardware monitoring programs can tap into the CPU die power (often reported as CPU package power, though it appears to actually be just the CPU die power), the IA cores power (logically close to the CPU die power, unless the iGPU is active), the DRAM power consumption (which refers to the SODIMMs, not the HBM memory), and the Radeon GPU's chip power consumption. In almost all our previous system reviews, the at-wall power consumption has been close to the sum of the CPU package power and the discrete GPU power (after accounting for the power consumption of the DRAM, physical disks, etc.). In the case of the NUC8i7HVK, however, the at-wall power consumption is substantially higher. In the AIDA64 stress tests, we see that the CPU die power tracks the sum of the iGPU and IA cores power - around 65W, as expected. The dGPU power is only around 35W, yet the maximum at-wall power consumption is as high as 175W. We are still looking into the reasons for these anomalous readings, but it is likely that current hardware monitoring programs miss some key power consumption aspects of the KBL-G package.
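A quick sanity check on those numbers shows how large the gap is. The wattages below are the figures quoted above; the 85% PSU efficiency is an illustrative assumption (not a measured value), included only to show that conversion losses alone cannot explain the discrepancy:

```python
# Component power readings reported during the AIDA64 stress test.
cpu_die_w = 65    # tracks the sum of iGPU and IA cores power
dgpu_w    = 35    # Radeon dGPU chip power
at_wall_w = 175   # measured at the wall

accounted_w = cpu_die_w + dgpu_w
unaccounted_w = at_wall_w - accounted_w
print(f"accounted: {accounted_w} W, unaccounted: {unaccounted_w} W")

# Even at an assumed 85% PSU efficiency, the DC-side draw estimate
# still sits well above what the monitoring tools report.
dc_estimate_w = at_wall_w * 0.85
print(f"estimated DC draw at 85% efficiency: {dc_estimate_w:.2f} W")
```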

We repeated the observations with our legacy stress test using the latest versions of Prime95 and FurMark. Prime95 immediately pushes the core clocks to the rated speed (3.1 GHz) with infrequent spikes to 3.9 GHz, allowing the cooling solution to keep the CPU die at around 80C. Adding FurMark to the mix, however, stresses the cooling solution to the point where it can no longer prevent the die from approaching the 100C junction temperature. At that point, we see more aggressive scaling back of the core frequencies to the rated speed.

The combination of Prime95 and FurMark pushes the at-wall power consumption as high as 230W. However, the component power readings from the monitoring programs still show only 65W for the CPU die and around 60W for the Radeon GPU.

Comments

  • Hifihedgehog - Monday, April 2, 2018 - link

    PS: Raven Ridge’s claim to fame is support for fixed function 10-bit VP9 decoding.

    tomshardware (dot) co (dot) uk/amd-ryzen-5-2400g-zen-vega-cpu-gpu,review-34205-4.html
  • kunal29 - Saturday, March 31, 2018 - link

    What about the latency benchmarks between GPU and CPU?
  • beginner99 - Sunday, April 1, 2018 - link

    Not being able to play UHD BluRay basically kills the product as HTPC which limits it to gaming and that is a steep price to ask just for that. My effing TV can play 4k HDR but this $1300 PC can't???
  • Tyler_Durden_83 - Monday, April 2, 2018 - link

    Here is an idea, the benchmarks as images are so last decade, seeing the review of the zotac without the benchmarks of hades canyon just because it came out one day earlier, or with a terribly old xps 15 model even though you did bench the latest, is quite frankly not the high standard that people expect from Anand
  • kmmatney - Monday, April 2, 2018 - link

    This is a nice system, but still way too expensive. You can get a gaming laptop with 15" screen, 7700HQ cpu, RAM, Windows OS, usually an SSD OS drive, and a GTX 1060 for around this barebone price. Even less if you go for a 1050 Ti, which is about equivalent to this. It's impressive, but I just have never gotten the point of these expensive NUCs.
  • JKJK - Tuesday, April 3, 2018 - link

    lack of UHD/HDR support in many cases and those kodi freezes .... meh.
    I would like to see some update on these freeze-issues in the future.
  • HakkaH - Tuesday, April 3, 2018 - link

    Too bad they didn't throw in the AMD 200G and 2400G with the benchmarks. You can build a small system with it which would be a whole lot cheaper and probably pretty decent when it comes to gaming speed.
  • Dev3 - Thursday, April 5, 2018 - link

    Hey Ganesh, can you comment on the current status of apparent lack of iGPU/AMD-CPU switchable graphics? Is this just an early BIOS/software issue or an unfixable design flaw where video-out is forced to route through the power-sucking Vega chip? This may be tolerable on a NUC but would be totally unacceptable on a laptop.

    I have an XPS-15 2-in-1 (9575) on order having assumed that Dell would never release a laptop with such a glaring flaw. But now with the first review out (https://www.digitaltrends.com/laptop-reviews/dell-... saying battery life is really bad, I'm getting concerned. Three hours runtime? Really??

    I thought this was the laptop I was waiting for but now I'm seriously considering canceling my order before it ships and holding off until the issue is sorted out or at least understood.

    I assume Intel is aware of the issue - can they fix it or did they (intentionally or unintentionally) sabotage their own (AMD) product??
  • AllThings3D - Saturday, April 7, 2018 - link

    I noticed you only benchmarked the faster and more expensive NUC8i7HVK. Do you have any plans to benchmark the NUC8i7HNK? I have found very little on this unit and would love to know how much less is expected in performance. If we can obtain at least 50% over the equivalent KabyLake NUC with Iris 620 Graphics, this would be okay for my needs especially since the TDP is 65W versus 100W for the NUC8i7HVK. My purpose is to use this Microsoft Mixed Reality backpack PC with two Sony V-Lock or AntonBauer battery packs. My current NUC "belt system" using the Iris 620 IGPU has worked out very well for doing engineering and architectural VR visualization and with the MSXR using an unbounded positional tracking system, you can navigate larger spaces than with the current HTC/Vive systems. In this YouTube video (https://youtu.be/hM8uwzmhaJY) I am in my backyard, something I don't think I have seen done with the any of the other VR solution :)

    One more question. Do you know what the actual VDC input range is? I know previous NUCs had an actual range between 11-24 VDC. Since the 'Belt System' uses a KabyLake NUC, the 14.8 VDC AntonBauer LiPo works great. I hope this is the case here since it would complicate my battery circuit to have to go with a custom solution.
  • JKJK - Tuesday, April 10, 2018 - link

    So... As a future proof media center (mostly kodi use), should I buy the previous gen?
    Need answer asap if I need to cancel my order
