Workstation Performance - SPECworkstation 3.1

SFF PCs traditionally do not lend themselves to workstation duties. However, a recent trend towards miniaturized workstations has been observed. While the Beelink GTR7 is primarily marketed towards gamers, its capabilities encouraged us to benchmark the system for both content creation workloads and professional applications. To that end, we processed two SPEC benchmarks geared towards workstations - SPECworkstation 3.1 and SPECviewperf 2020 v3.

SPECworkstation 3.1

The SPECworkstation 3.1 benchmark measures workstation performance based on a number of professional applications. It includes more than 140 tests based on 30 different workloads that exercise the CPU, graphics, I/O, and memory hierarchy. These workloads fall into the following categories:

  • Media and Entertainment (3D animation, rendering)
  • Product Development (CAD/CAM/CAE)
  • Life Sciences (medical, molecular)
  • Financial Services
  • Energy (oil and gas)
  • General Operations
  • GPU Compute

Individual scores are generated for each test, and a composite score for each category is calculated relative to a reference machine (an HP Z240 tower workstation using an Intel Xeon E3-1240 v5 CPU, an AMD Radeon Pro WX3100 GPU, 16GB of DDR4-2133, and a SanDisk 512GB SSD). Official benchmark results generated automatically by the benchmark itself are linked in the table below for the systems being compared.
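
Each category composite is a normalized figure relative to that reference machine. As a rough illustration only (SPEC's exact weighting is not reproduced here), a composite can be thought of as a geometric mean of per-test ratios against the reference system; the test names and timings below are made up:

    # Rough illustration of a category composite as a geometric mean of
    # per-test ratios against the reference machine (not SPEC's exact
    # weighting); the test names and timings are hypothetical.
    import math

    reference_times = {"test_a": 300.0, "test_b": 240.0, "test_c": 180.0}  # hypothetical seconds
    measured_times  = {"test_a": 150.0, "test_b": 160.0, "test_c":  90.0}  # hypothetical seconds

    ratios = [reference_times[t] / measured_times[t] for t in reference_times]
    composite = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    print(f"Category composite (reference machine = 1.00): {composite:.2f}")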

SPECworkstation 3.1 Official Results (2K)
  • Beelink GTR7 - Run Summary
  • ASRock DeskMeet B660 - Run Summary
  • GEEKOM AS 6 (ASUS PN53) - Run Summary
  • ASRock NUC BOX-1260P - Run Summary
  • ASRock NUC BOX-1360P-D5 (Performance) - Run Summary
  • ASRock 4X4 BOX-7735U (Performance) - Run Summary
  • ASRock 4X4 BOX-5800U (Performance) - Run Summary
  • Intel NUC13ANKi7 (Arena Canyon) - Run Summary
Details of the tests in each category, as well as an overall comparison of the systems on a per-category basis are presented below.

Media and Entertainment

The Media and Entertainment category comprises workloads from five distinct applications:

  • The Blender workload measures system performance for content creation using the open-source Blender application. Tests include rendering of scenes of varying complexity using the OpenGL and ray-tracing renderers.
  • The Handbrake workload uses the open-source Handbrake application to transcode a 4K H.264 file into an H.265 file at 4K and 2K resolutions using the CPU capabilities alone (a minimal command-line sketch follows this list).
  • The LuxRender workload benchmarks the LuxCore physically based renderer using LuxMark.
  • The Maya workload uses the SPECviewperf 13 maya-05 viewset to replay traces generated using the Autodesk Maya 2017 application for 3D animation.
  • The 3ds Max workload uses the SPECviewperf 13 3dsmax-06 viewset to replay traces generated by Autodesk's 3ds Max 2016 using the default Nitrous DX11 driver. The workload represents system usage for 3D modeling tasks.
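
As referenced in the Handbrake item above, the timed operation is, in spirit, a CPU-only H.264 to H.265 conversion. A minimal sketch of such a conversion, driven from Python via HandBrakeCLI, is shown below; the file names and quality value are hypothetical, and this is not the script SPEC ships:

    # Minimal sketch of a CPU-only 4K H.264 -> H.265 transcode via HandBrakeCLI.
    # File names and the constant-quality value are hypothetical; this is not
    # the SPECworkstation test script.
    import subprocess

    cmd = [
        "HandBrakeCLI",
        "-i", "input_4k_h264.mp4",   # hypothetical source file
        "-o", "output_4k_h265.mkv",  # hypothetical destination
        "-e", "x265",                # software (CPU-only) HEVC encoder
        "-q", "24",                  # constant-quality target (assumed value)
    ]
    subprocess.run(cmd, check=True)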

SPECworkstation 3.1 - Media and Entertainment

This category sees the systems ordered largely on the basis of their configured sustained TDP. The GTR7 with its 65W TDP is at the top, followed closely by the 4X4 BOX-7735U with its 42W TDP, with the 40W RPL-P systems (including the Arena Canyon NUC) behind them. The Rembrandt-R-based GEEKOM AS 6 with its 35W setting is in the middle of the pack, ahead of the previous-generation systems.

Product Development

The Product Development category comprises eight distinct workloads:

  • The Rodinia (CFD) workload benchmarks a computational fluid dynamics (CFD) algorithm.
  • The WPCcfd workload benchmarks another CFD algorithm involving combustion and turbulence modeling.
  • The CalculiX workload uses the CalculiX finite-element analysis program to model a jet engine turbine's internal temperature.
  • The Catia workload uses the catia-05 viewset from SPECviewperf 13 to replay traces generated by Dassault Systemes' CATIA V6 R2012 3D CAD application.
  • The Creo workload uses the creo-02 viewset from SPECviewperf 13 to replay traces generated by PTC's Creo, a 3D CAD application.
  • The NX workload uses the snx-03 viewset from SPECviewperf 13 to replay traces generated by the Siemens PLM NX 8.0 CAD/CAM/CAE application.
  • The Solidworks workload uses the sw-04 viewset from SPECviewperf 13 to replay traces generated by Dassault Systemes' SolidWorks 2013 SP1 CAD/CAE application.
  • The Showcase workload uses the showcase-02 viewset from SPECviewperf 13 to replay traces from Autodesk's Showcase 2013 3D visualization and presentation application.

SPECworkstation 3.1 - Product Development

The trend observed in the Media and Entertainment category repeats here, with the GTR7 faring better than the 4X4 BOX-7735U. The performance advantage is more noticeable in this category and is representative of the gap between their 65W and 42W TDPs.

Life Sciences

The Life Sciences category comprises four distinct test sets:

  • The LAMMPS set comprises five tests simulating different molecular properties using the LAMMPS molecular dynamics simulator.
  • The NAMD set comprises three tests simulating different molecular interactions.
  • The Rodinia (Life Sciences) set comprises four tests - the Heartwall medical imaging algorithm, the Lavamd algorithm for calculation of particle potential and relocation in a 3D space due to mutual forces, the Hotspot algorithm to estimate processor temperature with thermal simulations, and the SRAD anisotropic diffusion algorithm for denoising.
  • The Medical workload uses the medical-02 viewset from SPECviewperf 13 to determine system performance for the Tuvok rendering core in the ImageVis3D volume visualization program.

SPECworkstation 3.1 - Life Sciences

The addition of a GPU-centric component in the workload allows the Rembrandt-R-based GEEKOM AS 6 to leapfrog the RPL-P-based systems, but the 65W configuration of the GTR7 continues to enjoy a healthy lead.

Financial Services

The Financial Services workload set benchmarks the system for three popular algorithms used in the financial services industry - the Monte Carlo probability simulation for risk assessment and forecast modeling, the Black-Scholes pricing model, and the Binomial Options pricing model.
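
For context, a small worked example of one of these algorithms, the Black-Scholes model for pricing a European call option, is given below; the inputs are arbitrary and this is not the SPEC implementation:

    # Worked example of the Black-Scholes European call-price formula; the
    # spot, strike, maturity, rate, and volatility values are arbitrary.
    from math import exp, log, sqrt
    from statistics import NormalDist

    def black_scholes_call(S, K, T, r, sigma):
        """Call price for spot S, strike K, maturity T (years), rate r, volatility sigma."""
        d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        N = NormalDist().cdf
        return S * N(d1) - K * exp(-r * T) * N(d2)

    print(f"Call price: {black_scholes_call(100, 105, 1.0, 0.03, 0.2):.2f}")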

SPECworkstation 3.1 - Financial Services

The GTR7 continues to outperform the rest of the systems in this workload set as well. The comparison against the DeskMeet B660 is particularly striking: devoting 65W to eight high-performance cores and four efficiency cores is not as effective as devoting the full 65W to eight high-performance cores, particularly when the task at hand is a single workload requiring extensive CPU power.

Energy

The Energy category comprises workloads simulating various algorithms used in the oil and gas industry:

  • The FFTW workload computes discrete Fourier transforms of large matrices.
  • The Convolution workload computes the convolution of a random 100x100 filter on a 400 megapixel image (a scaled-down sketch follows this list).
  • The SRMP workload processes the Surface-Related Multiples Prediction algorithm used in seismic data processing.
  • The Kirchhoff Migration workload processes an algorithm to calculate the back propagation of a seismic wavefield.
  • The Poisson workload takes advantage of the OpenMP multi-processing framework to solve Poisson's equation.
  • The Energy workload uses the energy-02 viewset from SPECviewperf 13 to determine system performance for the open-source OpendTect seismic visualization application.
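
As referenced in the Convolution item above, the core operation is a large 2D filter convolution. A scaled-down sketch using an FFT-based convolution in SciPy is shown below; the array sizes are far smaller than the benchmark's 400-megapixel input:

    # Scaled-down sketch of a 2D filter convolution (FFT-based); the image
    # here is much smaller than the benchmark's 400-megapixel input.
    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(0)
    image = rng.random((2000, 2000))    # stand-in for the 400 MP image
    kernel = rng.random((100, 100))     # random 100x100 filter, as in the workload

    filtered = fftconvolve(image, kernel, mode="same")
    print(filtered.shape)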

SPECworkstation 3.1 - Energy

This workload set also includes a GPU-centric component. The mixture of CPU-heavy and GPU-heavy operations is such that the GTR7 even outperforms the DeskMeet B660 with its low-end discrete GPU. The other systems are ordered in terms of TDP, with the differences between them being in the realm of run-to-run variation.

General Operations

In the General Operations category, the focus is on workloads from widely used applications in the workstation market:

  • The 7zip workload represents compression and decompression operations using the open-source 7zip file archiver program.
  • The Python workload benchmarks math operations using the numpy and scipy libraries along with other Python features (a rough stand-in follows this list).
  • The Octave workload performs math operations using the Octave programming language used in scientific computing.
  • The Storage workload evaluates the performance of the underlying storage device using transaction traces from multiple workstation applications.
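
As a rough stand-in for the kind of numpy/scipy math the Python workload times (the actual SPEC script is not reproduced here), consider:

    # Rough stand-in for numpy/scipy-style math operations; the operations
    # and matrix size are illustrative, not taken from the SPEC script.
    import time
    import numpy as np
    from scipy import linalg

    rng = np.random.default_rng(0)
    a = rng.random((2000, 2000))

    start = time.perf_counter()
    _ = a @ a               # dense matrix multiplication (numpy)
    _ = linalg.lu(a)        # LU factorization (scipy)
    _ = np.fft.fft2(a)      # 2D FFT (numpy)
    print(f"Elapsed: {time.perf_counter() - start:.2f} s")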

SPECworkstation 3.1 - General Operations

The components in this workload benefit from single-threaded CPU performance as well as a high-performance storage subsystem. The Arena Canyon NUC, NUC BOX-1360P-D5, and 4X4 BOX-7735U use SSDs with DRAM for the FTL. While the Crucial P3 Plus in the GTR7 is DRAM-less, this is compensated for by much higher CPU performance. As a result, there is a significant gulf between the scores of these systems and the rest, with the GTR7 belonging to the leading pack - albeit at its bottom.

GPU Compute

In the GPU Compute category, the focus is on workloads taking advantage of the GPU compute capabilities using either OpenCL or CUDA, as applicable:

  • The LuxRender benchmark is the same as the one seen in the Media and Entertainment category.
  • The Caffe benchmark measures the performance of the Caffe deep-learning framework.
  • The Folding@Home benchmark measures the performance of the system for distributed computing workloads focused on tasks such as protein folding and drug design.

We only process the OpenCL variants of these benchmarks, as the CUDA versions do not run correctly with default driver installs.
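
A quick way to check what the default drivers expose to OpenCL is to enumerate the available platforms and devices, for example with the third-party pyopencl package (this is not part of the SPEC harness):

    # Enumerate OpenCL platforms and devices visible to the installed drivers.
    # Requires the third-party pyopencl package; not part of the SPEC harness.
    import pyopencl as cl

    for platform in cl.get_platforms():
        print(platform.name)
        for device in platform.get_devices():
            print("   ", device.name)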

SPECworkstation 3.1 - GPU Compute

The GTR7 has the highest power budget and the latest integrated GPU microarchitecture. It is no surprise, therefore, that the system comes out on top among all the UCFF / SFF systems with iGPUs in the GPU Compute workloads.

Comments

  • ActionJ26 - Friday, August 25, 2023 - link

    Go with the Minisforum UM790; it is $519 barebones.
  • haplo602 - Monday, August 28, 2023 - link

    that and tested as a SteamOS platform as well ...
  • 29a - Thursday, August 24, 2023 - link

    "One of the interesting aspects of the I/O ports is the presence of an audio jack in both front and rear panels. Beelink has designed this in such a way that the connection of a headset of speakers to the rear jack automatically disables the front one."

    Does that mean you can't output different audio streams to both - for example, game audio through the speakers in the back and chat audio through headphones on the front? Most motherboards allow this.
  • ganeshts - Thursday, August 24, 2023 - link

    Can you give me some MB examples that allow this? I want to check their hardware audio path.

    As per Beelink's user manual, the disabling of the front jack is the expected behavior when the rear jack has a connected sink.
  • UserZ - Thursday, August 24, 2023 - link

    Disabling the front jack seems really odd. I would have a pair of speakers connected to the rear jack as the default audio. When I occasionally plug in a headset to the front, I want to use that. I would hope that you could still choose which to use without unplugging anything in case I don't like their default behavior.
  • darkswordsman17 - Friday, August 25, 2023 - link

    Yeah, I think it'd be preferable for the inverse (i.e., mute the rear when the front is detected), or for it to be configurable so it could take mic input from one with audio output from the other. It's probably easier for them to do it this way, though. But then there are options if you use external audio via USB (or probably Bluetooth as well).
  • darkswordsman17 - Friday, August 25, 2023 - link

    PC motherboards generally use separate audio chips for front and rear ports, and thus it's easy for Windows/games to be configured to output differently to each one. I think there might be some external gaming audio boxes that could allow this as well (a headset plugged in managing just chat whilst game audio is output to speakers), so it could come down to drivers (or maybe it auto-configures).
  • 1_rick - Thursday, August 24, 2023 - link

    The Crucial isn't a bad SSD if your needs align with its capabilities. One place it completely falls down is large writes: I copied a ~60GB game to a Beelink SEI12 from a USB-C connected SSD, rather than let it be downloaded, and the pSLC cache was exhausted pretty quickly. At that point the performance tanked to somewhere around 40MBps, down about 90% from the peak speed of about 500MBps.

    For normal day-to-day usage, you probably won't see much of a speed penalty, though.
  • NextGen_Gamer - Thursday, August 24, 2023 - link

    @AnandTech: Were the 3DMark Port Royal benchmarks rerun on all of the older systems? Because the DeskMeet B660 system seems way off. The Radeon RX 6400 and Radeon 680M iGPU are actually the same in specs: RDNA-2, 12 Ray Accelerators, 32 ROPs, 48 TMUs, 768 Shading Units. It should, in theory, be the RX 6400 just ahead of the Ryzen 9 6900HX, which in turn should be just ahead of the Ryzen 7 7735U. And then the latest Ryzen 7 7840HS, with its newer and higher-clocked RDNA-3 Radeon 780M iGPU, should still be on top of the charts.
  • ganeshts - Thursday, August 24, 2023 - link

    Unlike CPU or GPU reviews, for mini-PCs, we do not update the results in every review because most of the mini-PCs are loaner samples and go back to the manufacturer.

    The numbers presented in the graph for the Deskmeet B660 are from January 2023, using Adrenalin GPU drivers that were the latest in December 2022. FWIW, 3DMark also has online score submissions from different users searchable at www.3dmark.com/search

    For RX 6400, Port Royal overall scores range from 126 to 558 (seems to depend on the CPU also), with an average of 252

    For 680M, they range from 1081 to 1415 with an average of 1026.

    In the above context, the scores we have graphed (427 and 1212) are entirely plausible.

    It is also possible that recent driver releases might have improved scores, but our policy for mini-PC reviews is that we carry forward the scores from the time of the original review. Every few years, we purge the database and move to the latest versions of the benchmarks and also update the OS to the latest stable (for example, we are currently using Win 11 21H2 with the latest updates, but not 22H2). At that time, we choose a set of PCs that we still have in hand, re-bench them and use the newly obtained scores with the new benchmark version / OS for comparisons starting from that point onwards.
