HEDT Performance: Web and Legacy Tests

While web-based benchmarks are more often the focus of low-end and small form factor systems, they are notoriously difficult to standardize. Modern web browsers update frequently, with no recourse to disable those updates, which makes it hard to maintain a common test platform. The fast-paced nature of browser development means that version numbers (and performance) can change from week to week. Despite this, web tests are often a good measure of user experience: a lot of modern office work revolves around web applications, particularly email and office apps, but also interfaces and development environments. Our web tests include some of the industry standard tests, as well as a few popular but older ones.

We have also included our legacy benchmarks in this section, representing a stack of older code for popular benchmarks.

All of our benchmark results can also be found in our benchmark engine, Bench.

WebXPRT 3: Modern Real-World Web Tasks, including AI

The company behind the XPRT test suites, Principled Technologies, recently released its latest web test and, rather than attach a year to the name, has simply called it ‘3’. This latest test (new as we started this suite) builds upon and develops the ethos of previous versions: user interaction, office compute, graph generation, list sorting, HTML5, image manipulation, and even some AI testing.

For our benchmark, we run the standard test which goes through the benchmark list seven times and provides a final result. We run this standard test four times, and take an average.
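As a quick illustration of how we aggregate these results (the scores below are hypothetical, purely to show the arithmetic), the reported number is simply the mean of the four final scores:

```python
# Minimal sketch of our aggregation step, using made-up WebXPRT 3 scores.
# Each entry is the final score from one full standard run
# (seven passes through the workload list).
final_scores = [251, 248, 253, 250]  # hypothetical values

reported_score = sum(final_scores) / len(final_scores)
print(f"Reported WebXPRT 3 score: {reported_score:.0f}")
```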

Users can access the WebXPRT test at http://principledtechnologies.com/benchmarkxprt/webxprt/

WebXPRT 3 (2018)

WebXPRT 2015: HTML5 and Javascript Web UX Testing

The older version of WebXPRT is the 2015 edition, which focuses on a slightly different set of web technologies and frameworks that are still in use today. It remains a relevant test, especially for users interacting with web applications that are not the latest and greatest, of which there are a lot. Web framework development is often very quick but has high turnover: frameworks are rapidly developed, built upon, and used, and then developers move on to the next one. Adjusting an application to a new framework is an arduous task, especially with rapid development cycles, which leaves a lot of applications ‘fixed in time’ yet still relevant to user experience for many years.

Similar to WebXPRT 3, the main benchmark is a sectional run repeated seven times, producing a final score. We repeat the whole thing four times and average those final scores.

WebXPRT15

Speedometer 2: JavaScript Frameworks

Our newest web test is Speedometer 2, which is an aggregated test over a series of JavaScript frameworks, each doing three simple things: build a list, enable each item in the list, and remove the list. All the frameworks implement the same visual cues, but obviously approach them from different coding angles.

Our test goes through the list of frameworks and produces a final score indicative of ‘rpm’, one of the benchmark's internal metrics. We report this final score.
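For readers unfamiliar with the metric, a runs-per-minute figure is just a conversion from wall-clock time per run; the sketch below is a simplification with hypothetical timings, not Speedometer's actual scoring code:

```python
# Simplified illustration of a runs-per-minute style metric, assuming 'rpm'
# is derived from wall-clock time per full run; Speedometer's internal
# scoring is more involved than this.
run_times_ms = [1850.0, 1872.5, 1841.3]  # hypothetical time per full run

mean_run_time_ms = sum(run_times_ms) / len(run_times_ms)
runs_per_minute = 60_000.0 / mean_run_time_ms
print(f"Approximate score: {runs_per_minute:.1f} rpm")
```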

Speedometer 2

Google Octane 2.0: Core Web Compute

A popular web test for several years, though now no longer updated, is Octane, developed by Google. Version 2.0 of the test performs the best part of two dozen compute-related tasks, such as regular expressions, cryptography, ray tracing, emulation, and Navier-Stokes physics calculations.

The test gives each sub-test a score and produces a geometric mean of the set as a final result. We run the full benchmark four times, and average the final results.
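The geometric mean behind the final score can be reproduced in a few lines; the sub-test names and scores below are hypothetical, as the aggregation method is the point:

```python
import math

# Octane reports the geometric mean of all sub-test scores as the headline
# number; the per-sub-test scores here are made up for illustration.
subtest_scores = {
    "Regexp": 4200,
    "Crypto": 31000,
    "RayTrace": 58000,
    "NavierStokes": 36000,
}

geo_mean = math.exp(sum(math.log(s) for s in subtest_scores.values())
                    / len(subtest_scores))
print(f"Octane-style final score: {geo_mean:.0f}")
```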

Google Octane 2.0

Mozilla Kraken 1.1: Core Web Compute

Even older than Octane is Kraken, this time developed by Mozilla. This is an older test that performs similar computational mechanics, such as audio processing and image filtering. Kraken seems to produce a highly variable result depending on the browser version, as it is a test that browsers are keenly optimized for.

The main benchmark runs through each of the sub-tests ten times and produces an average time to completion for each loop, given in milliseconds. We run the full benchmark four times and take an average of the time taken.

Mozilla Kraken 1.1

3DPM v1: Naïve Code Variant of 3DPM v2.1

The first legacy test in the suite is the first version of our 3DPM benchmark. This is the ultimate naïve version of the code, written as if by a scientist with no knowledge of how computer hardware, compilers, or optimization works (which, in fact, it was at the start). This represents a large body of scientific simulation out in the wild, where getting the answer is more important than getting it fast (a correct result in 4 days is acceptable, rather than sending someone away for a year to learn to code and getting the result in 5 minutes).

In this version, the only real optimization was in the compiler flags (-O2, -fp:fast), compiling it in release mode, and enabling OpenMP in the main compute loops. The loops were not configured for function size, and one of the key slowdowns is false sharing in the cache. It also has long dependency chains based on the random number generation, which leads to relatively poor performance on specific compute microarchitectures.
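To give a flavour of what this sort of naïve simulation code looks like, here is a minimal single-threaded Python sketch of random-direction particle movement. It is not the 3DPM source (that is compiled code with OpenMP), and it only illustrates the per-step dependence on fresh random numbers, not the false sharing issue, which appears in the multi-threaded version:

```python
import math
import random

# Naive 3D particle movement: each particle takes a fixed-length step in a
# random direction every timestep. Each step needs new random numbers before
# the position update can proceed, which is the long dependency chain
# mentioned above.
def move_particles(n_particles=100, n_steps=1000, step=1.0):
    positions = [[0.0, 0.0, 0.0] for _ in range(n_particles)]
    for _ in range(n_steps):
        for p in positions:
            # Deliberately naive direction pick: two uniform angles
            # (this biases towards the poles, but simplicity is the point).
            theta = random.uniform(0.0, math.pi)
            phi = random.uniform(0.0, 2.0 * math.pi)
            p[0] += step * math.sin(theta) * math.cos(phi)
            p[1] += step * math.sin(theta) * math.sin(phi)
            p[2] += step * math.cos(theta)
    return positions

if __name__ == "__main__":
    final = move_particles()
    print("First particle finished at:", [round(c, 2) for c in final[0]])
```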

3DPM v1 can be downloaded with our 3DPM v2 code here: 3DPMv2.1.rar (13.0 MB)

3DPM v1 Single Threaded
3DPM v1 Multi-Threaded

x264 HD 3.0: Older Transcode Test

This transcoding test is super old, and was used by Anand back in the day of Pentium 4 and Athlon II processors. Here a standardized 720p video is transcoded with a two-pass conversion, with the benchmark showing the frames-per-second of each pass. This benchmark is single-threaded, and between some micro-architectures we seem to actually hit an instructions-per-clock wall.

x264 HD 3.0 Pass 1
x264 HD 3.0 Pass 2


143 Comments


  • zeromus - Wednesday, November 14, 2018 - link

    @linuxgeek, logged in for the first time in 10 years or more just to laugh with you for having cracked the case with your explanation there at the end!
  • HStewart - Tuesday, November 13, 2018 - link

    I have IBM Thinkpad 530 with NVidia Quadro - in software development unless into graphics you don't even need more than integrated - even more for average business person - unless you are serious into gaming or high end graphics you don't need highend GPU. Even gaming as long as you are not into latest games - lower end graphics will do.
  • pandemonium - Wednesday, November 14, 2018 - link

    "Even gaming as long as you are not into latest games - lower end graphics will do."

    This HEAVILY depends on your output resolution, as every single review for the last decade has clearly made evident.
  • Samus - Wednesday, November 14, 2018 - link

    Don't call it an IBM Thinkpad. It's disgraceful to associate IBM with the bastardization Lenovo has done to their nameplate.
  • imaheadcase - Tuesday, November 13, 2018 - link

    Uhh yah but no one WILL do it on mobility. Makes no sense.
  • TEAMSWITCHER - Tuesday, November 13, 2018 - link

    You see .. there you are TOTALLY WRONG. Supporting the iPad is a MAJOR REQUIREMENT as specified by our customers.

    Augmented reality has HUGE IMPLICATIONS for our industry. Try as you may ... you can't hold up that 18 core desktop behemoth (RGB lighting does not defy gravity) to see how that new Pottery Barn sofa will look in your family room. I think what you are suffering from is a historical perspective on computing which the ACTUAL WORLD has moved away from.
  • scienceomatica - Tuesday, November 13, 2018 - link

    @TEAMSWITCHER - I think your comments are an unbalanced result between fantasy and ideals. I think you're pretty superficially, even childishly looking at the use of technology and communicating with the objective world. Of course, a certain aspect of things can be done on a mobile device, but by its very essence it is just a mobile device, therefore, as a casual, temporary solution. It will never be able to match the raw power of "static" desktop computers. Working in a laboratory for physical analysis, numerous simulations of supersymmetric breakdowns of material identities, or transposition of spatial-temporal continuum, it would be ridiculous to imagine doing on a mobile device. There are many things I would not even mention.
  • HStewart - Tuesday, November 13, 2018 - link

    For videos - as long as you have AVX 2 (256-bit) you are ok.
  • SanX - Wednesday, November 14, 2018 - link

    AMD needs to beat Intel with AVX to be considered seriously for scientific apps (3D particle movement test)
  • PeachNCream - Tuesday, November 13, 2018 - link

    All seven of our local development teams have long since switched from desktops to laptops. That conversion was a done deal back in the days of Windows Vista and Dell Latitude D630s and 830s. Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. The reality is that desktop computers are for the most part a thing of the past with a few instances remaining on the desks of home users that play video games on custom-built boxes as the primary remaining market segment. Why else would Intel swing overclocking as a feature of a HEDT chip if there was a valid business market for these things?
