Networking and Storage Performance

Networking and storage are two major aspects that influence the experience of using any computing system. This section presents the results of our storage evaluation of the Intel NUC10i7FNH (Frost Canyon). One option would be to repeat our strenuous SSD review tests on the drive(s) in the PC. To avoid that overkill, we rely instead on the storage benches in PCMark 8 and PCMark 10, which replay traces of common workloads (such as loading games and document processing) on the target drive. Results are presented in two forms: a benchmark score and a bandwidth figure.

We first ran the PCMark 8 storage bench on selected PCs and the results are presented below.

Futuremark PCMark 8 Storage Bench - Score

Futuremark PCMark 8 Storage Bench - Bandwidth

The PCIe 3.0 x2 SSD doesn't perform up to the mark when compared to the PCIe 3.0 x4 SSDs used in the other systems, though it still outperforms the SATA SSDs in our comparison set. The Frost Canyon NUC is the first SFF PC in our set to be subjected to the PCMark 10 Storage Bench, and as such, we do not yet have any other systems against which to compare its average access time of 204 µs and storage bandwidth of 139.58 MBps.

Futuremark PCMark 10 Storage Bench - Average Access Time

Futuremark PCMark 10 Storage Bench - Bandwidth

Futuremark PCMark 10 Storage Bench - Score

On the networking side, we have yet to set up our 802.11ax / Wi-Fi 6 testbed for small-form-factor PCs, and hence there are no bandwidth numbers to report yet. However, it must be noted that the Frost Canyon NUC is the first NUC to come with 802.11ax / Wi-Fi 6 support, and its theoretical maximum bandwidth of 2400 Mbps betters the 1733 Mbps offered by the Wireless-AC 9560 in the Bean Canyon NUC. The AX201 WLAN component uses the CNVi capability of the Comet Lake-U SiP, with only the radio implemented as an external chip. The AX201 supports 2x2 operation simultaneously in the 2.4 GHz and 5 GHz bands, and also comes with support for 160 MHz-wide channels.
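The quoted link rates are not arbitrary: they fall out of the standard 802.11 PHY rate formula (streams × data subcarriers × bits per symbol × coding rate ÷ symbol duration). A minimal sketch of that arithmetic, with the function name ours and the per-standard constants taken from the 802.11ac/ax specifications:

```python
def phy_rate_mbps(streams, data_subcarriers, bits_per_symbol,
                  coding_rate, symbol_time_us):
    """Theoretical 802.11 PHY data rate in Mbps."""
    return streams * data_subcarriers * bits_per_symbol * coding_rate / symbol_time_us

# 802.11ax, 2x2, 160 MHz: 1960 data subcarriers, 1024-QAM (10 bits/symbol),
# 5/6 coding, 13.6 us OFDM symbol (12.8 us + 0.8 us guard interval)
ax = phy_rate_mbps(2, 1960, 10, 5 / 6, 13.6)   # ~2402 Mbps (the "2400 Mbps" class)

# 802.11ac, 2x2, 160 MHz: 468 data subcarriers, 256-QAM (8 bits/symbol),
# 5/6 coding, 3.6 us symbol (short guard interval) -- the Wireless-AC 9560 case
ac = phy_rate_mbps(2, 468, 8, 5 / 6, 3.6)      # ~1733 Mbps

print(round(ax), round(ac))  # 2402 1733
```

The jump from 1733 to 2400 Mbps thus comes from Wi-Fi 6's denser OFDMA subcarrier spacing and 1024-QAM, which more than offset its longer symbol time.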


85 Comments


  • The_Assimilator - Monday, March 2, 2020 - link

    It's not, but the point is still valid: nobody buying these things is doing so because they expect them to be graphics powerhouses.
  • HStewart - Monday, March 2, 2020 - link

    But some people are so naive and don't realize the point. I came up in the days when the cards you purchased didn't even have GPUs on them. Not sure what level iGPUs are at now, but they can surely run business graphics fine, and even games from a couple of years ago.
  • notb - Thursday, March 5, 2020 - link

    Horrible?
    These iGPUs can drive 3 screens with maybe 1-2W power draw. Show me another GPU that can do this.

    This is an integrated GPU made for efficient 2D graphics. There's very little potential to make it any better.
  • PaulHoule - Monday, March 2, 2020 - link

    Well, Intel's horrible iGPUs forced Microsoft to walk back the graphical complexity of Windows XP. They kept the GPU-dependent architecture, but had to downgrade to "worse than cell phone" visual quality because Intel kneecapped the graphics performance of the x86 platform. (Maybe you could get something better, but developers couldn't expect you to have it.)
  • HStewart - Monday, March 2, 2020 - link

    I think we need actual proof for these biased statements. I think there is a big difference between running a screen at 27 or more inches and one at 6 to 8 inches, no matter what the resolution.
  • Korguz - Monday, March 2, 2020 - link

    we need proof of your biased statements too, yet you very rarely provide any.. point is??
  • Samus - Monday, March 2, 2020 - link

    What does screen size have to do with anything? Intel can't make an iGPU that can drive a 4K panel fluidly, while mainstream Qualcomm SoCs have GPUs able to drive 4K panels using a watt of power.
  • HStewart - Tuesday, March 3, 2020 - link

    Can Qualcomm actually drive, say, a 32-inch 4K screen efficiently? Also, what is being measured here, videos or actual games? That depends on how they are written.
  • erple2 - Saturday, March 14, 2020 - link

    I'm not sure that I understand your statement here, as it doesn't seem to make any sense. I was not aware that the physical dimensions of the screen mattered at all to the GPU, apart from how many pixels it has to individually manage/draw. If your implication is that the complexity and quantity of information that can be made significant on a 32" screen is different from a 5.7" screen, then I suppose you can make that argument. However, I have to make guesses as to what you meant in order to come to that conclusion.

    Generally the graphical load to display 4k resolution is independent of whether the actual screen is 6" or 100". Unless I'm mistaken?
  • PeachNCream - Monday, March 2, 2020 - link

    For once, I agree with HStewart (feels like I've been shot into the Twilight Zone to even type that). To the point though, Windows XP was released in 2001. Phones in that time period were still using black and white LCD displays. Intel's graphics processors in that era were the Intel Extreme series built into the motherboard chipset (where they would remain until around 2010, after the release of Windows 7). Sure, those video processors are slow compared to modern cell phones, but nothing a phone could do when XP was in development was anything close to what a bottom-feeder graphics processor could handle. I mean crap, Doom ran (poorly) on a 386 with minimal video hardware in the early 1990s, whereas phones eight years later still didn't have color screens.
