Benchmark Processing and Sensitivity Analysis

The PCMark 10 Extended benchmark was run on a wide range of systems based on Skylake / Kaby Lake (those were the ones we had readily available), ranging from a high-end gaming notebook to a passively-cooled mini-PC. The main specifications of each system are listed below.

  1. High-end Gaming Notebook: Razer Blade Pro 2016 (Razer BP16)
    • Intel Core i7-6700HQ, NVIDIA GTX 1080 8GB, 32GB LPDDR4, 2x Samsung PM961 1TB
  2. High-end mini-PC: Intel Skull Canyon NUC (Intel NUC6i7KYK)
    • Intel Core i7-6770HQ, Intel Iris Pro Graphics 580, 32GB LPDDR4, 1x Samsung 950 PRO 512GB
  3. Mid-range mini-PC: ECS LIVA Z Plus (ECS LIVA ZP)
    • Intel Core i5-7300U, Intel HD Graphics 620, 4GB LPDDR4, 1x Transcend MTS400 128GB
  4. Passively-cooled mini-PC: Zotac ZBOX CI523 nano (Zotac CI523n)
    • Intel Core i3-6100U, Intel HD Graphics 520, 16GB DDR3L, 1x Samsung SSD 840 EVO 128GB
  5. Mid-range Desktop PC: AnandTech's DAS Testbed (GIGABYTE Z170X-UD5 TH with Core i5-6600K) (SKL DAS TB)
    • Intel Core i5-6600K, Intel HD Graphics 530, 32GB DDR4, 1x Samsung SSD SM951 256GB

[Graphs: PCMark 10 Extended, Essentials, Productivity, Gaming, and Digital Content Creation scores for the five systems]

Sensitivity analysis was performed for three different aspects:

  • Display resolution
  • DRAM configuration
  • Windows 10 Power Profile setting

To test the sensitivity of the scores to the display resolution, we chose the Razer Blade Pro 2016 as the test system. One interesting aspect of DPI scaling is that PCMark 10 records the effective resolution: with the Razer Blade Pro set to 175% scaling, the 4K screen was recorded as 2194 x 1234 px. We ran the PCMark 10 Extended benchmark at that setting, and again with DPI scaling set to 100% (for the native 3840 x 2160 resolution). In addition, we drove a 1080p monitor over the HDMI port and set it as the sole display before running the benchmark. The results below show that resolutions of 1080p and above result in no significant differences in the PCMark 10 scores. That said, having DPI scaling active appears to lower the scores by a small amount.
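The effective resolution that PCMark 10 records can be reproduced by dividing the panel's native resolution by the DPI scaling factor. The sketch below is a minimal illustration; the exact rounding Windows applies is an assumption on our part.

```python
# Sketch: how DPI scaling maps a native panel resolution to the "effective"
# resolution that PCMark 10 records. The rounding is an assumption;
# Windows' exact math may differ slightly.

def effective_resolution(native_w, native_h, scaling_pct):
    """Divide the native resolution by the DPI scaling factor."""
    factor = scaling_pct / 100
    return round(native_w / factor), round(native_h / factor)

# The Razer Blade Pro's 4K panel at 175% scaling:
print(effective_resolution(3840, 2160, 175))  # -> (2194, 1234)
```

At 175% scaling, this reproduces the 2194 x 1234 px figure that PCMark 10 reported for the Razer Blade Pro's 4K panel.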

[Graphs: PCMark 10 Extended, Essentials, Productivity, Gaming, and Digital Content Creation scores at each resolution setting]

To test the sensitivity of the scores to the DRAM configuration, we chose the Intel Skull Canyon NUC (NUC6i7KYK) as the test system. Our default configuration was 2x 16GB DDR4 SODIMMs. We repeated the tests with only one slot occupied (1x 16GB) and with both slots occupied by 8GB SODIMMs (2x 8GB). The results below show that performance (particularly for workloads utilizing the GPU) suffers a bit when one slot is unoccupied, but there is no significant difference in the scores when moving from 16GB to 32GB of memory.
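The single-channel penalty is easy to see from theoretical peak bandwidth: with one slot unoccupied, the memory controller has half the bus width to work with, which matters most for the bandwidth-hungry integrated GPU. The sketch below uses the standard DDR4-2133 transfer rate as an assumption; the figures are not measurements from the NUC itself.

```python
# Sketch: theoretical peak DRAM bandwidth, single vs. dual channel.
# DDR4-2133 is assumed; each DDR4 channel is 64 bits (8 bytes) wide.

def peak_bandwidth_gbs(mt_per_s, channels, bus_width_bits=64):
    """Transfers/s x bytes per transfer x number of channels, in GB/s."""
    return mt_per_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

single = peak_bandwidth_gbs(2133, channels=1)  # 1x 16GB: ~17 GB/s
dual = peak_bandwidth_gbs(2133, channels=2)    # 2x 8GB or 2x 16GB: ~34 GB/s
print(f"single: {single:.1f} GB/s, dual: {dual:.1f} GB/s")
```

Halving the available bandwidth is a plausible explanation for why the GPU-bound workloads regress with one slot empty while total capacity (16GB vs. 32GB) makes little difference.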

[Graphs: PCMark 10 Extended, Essentials, Productivity, Gaming, and Digital Content Creation scores for each DRAM configuration]

Our final sensitivity analysis test looks at how the scores vary with changes in the Windows power plan. For this purpose, we chose our desktop PC (AnandTech DAS Testbed) with the Core i5-6600K in the GIGABYTE Z170X-UD5 TH motherboard. Depending on the system, Windows provides a collection of hardware and software settings under various power profiles. Systems usually default to the 'Balanced' profile. We ran the PCMark 10 Extended benchmark with the power profile set to 'High performance' and to 'Power saver' as well. As the results below show, it is very important to ensure that the power profile is consistent across different systems whose scores are going to be compared.
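To keep the profile consistent across runs, the active plan can be pinned programmatically before each benchmark pass. The sketch below drives Windows' built-in powercfg utility from Python, using the standard Windows 10 scheme aliases; it naturally only works on a Windows system.

```python
# Sketch: pinning the Windows power plan before a benchmark run by calling
# the built-in powercfg utility. The aliases below are the standard scheme
# aliases shipped with Windows 10.
import subprocess

PLANS = {
    "high_performance": "SCHEME_MIN",   # minimum power savings
    "balanced": "SCHEME_BALANCED",      # the usual default
    "power_saver": "SCHEME_MAX",        # maximum power savings
}

def set_power_plan(name):
    """Activate the named power plan (raises if powercfg fails)."""
    subprocess.run(["powercfg", "/setactive", PLANS[name]], check=True)

# Example (Windows only): pin the plan before each benchmark pass.
# set_power_plan("high_performance")
```

Running `powercfg /getactivescheme` afterwards confirms which plan is in effect before starting the benchmark.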

[Graphs: PCMark 10 Extended, Essentials, Productivity, Gaming, and Digital Content Creation scores for each power profile]


18 Comments


  • surfnaround - Monday, June 5, 2017

    pricing for the commercial version is not mentioned, because it is 1000 dollars. I would assume it is the same price as the previous versions... how could it get any more expensive?
  • surfnaround - Monday, June 5, 2017

    Ignore the post above... i thought i was talking about 3DMark... D'oh!
  • waltsmith - Monday, June 5, 2017

    Well, sounds like they are headed in the right direction!!
  • TelstarTOS - Monday, June 5, 2017

    Only open source software is a step back IMO. Is LibreOffice the most used office suite? Doubt that. Same for the lack of Acrobat. HEVC is another disappointment.
  • BrokenCrayons - Monday, June 5, 2017

    There's no doubt that LibreOffice is far behind MS Office, however there are signs of growth. In 2015, there were an estimated 100 million users.

    Source: http://news.softpedia.com/news/libreoffice-now-has...
  • JoeyJoJo123 - Monday, June 5, 2017

    They'd need some collaboration effort with Microsoft to get their blessing on being able to hook into Office applications at such a native level when doing these benchmarking tasks.

    LibreOffice is a lot easier to do this native testing in due to its open source nature; they can look up the code to see how they can invoke opening a document, generating 1000 pages of "Lorem Ipsum", doing a Ctrl + H find and replace all, etc. from the runtime of their own executable benchmark, and evaluate to a fine level the amount of time it took to complete these tasks to the clock cycle.

    Can you do Word and Excel based productivity benchmarking? Yeah, but it's manual, and I hope you're quick and accurate with that stopwatch. Why is it manual? Well, go ask Microsoft why they continue to build walled-garden closed-source software where making extensions is difficult or impossible to accomplish.
  • ganeshts - Monday, June 5, 2017

    I doubt Word & Excel-based productivity benchmarking is that difficult. SYSmark 2014 SE does it without any issues. Also, you can always use AutoIt scripting to achieve the required 'workload' and 'automate' it similar to the way a real user would interact (PCMark 10 also uses AutoIt).
  • JoeyJoJo123 - Friday, June 9, 2017

    Wasn't actually aware of AutoIt existing. In that case, yes, I do agree that dropping Word/Excel automation, which could otherwise be done via AutoIt, is short-sighted.
  • FMJarnis - Thursday, June 22, 2017

    PCMark 8 includes this exact thing as a separate application test. Implementing one is not that difficult. It is likely that a separate test for MS Office will be added to PCMark 10 at a later date.

    But requiring everyone who wants to run PCMark 10 to first buy and install Microsoft Office is not really feasible as it is commercial software. For the baseline tests that should be usable without any other prerequisites, what you propose unfortunately doesn't work.
  • FMJarnis - Thursday, June 22, 2017

    HEVC is not used because it is not supported on Windows 7 by the OS.

    Also, when PCMark 10 development started, HEVC licensing was... complicated. Software implementations on computers still had per-application licensing fees for the codec. Open source solutions were in a legal limbo. Since then HEVC licensing terms have changed, but it was too late for this benchmark.
