A Timely Re-Discovery

Most users have no need to worry about the internals of a computer: point, click, run, play games, and spend money if they want something faster. However, one of the important features of a system is how it measures time. A modern system relies on a series of hardware and software timers, both internal and external, in order to maintain a consistent relation between requests, commands, execution, and interrupts.

These timers serve different purposes, such as sequencing instructions, maintaining video coherency, tracking real time, or managing the flow of data. Timers can (but do not always) use external references to ensure their own consistency – damage, unexpected behavior, and even the thermal environment can cause timers to lose their accuracy.

Timers are highly relevant for benchmarking. Most benchmark results are a measure of work performed per unit time, or work completed in a given time. This means that both the numerator and the denominator need to be accurate: the system has to be able to measure how much work was processed, and how long it took to do it. Ideally there is no uncertainty in either of those values, giving an accurate result.
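To make the point concrete, a benchmark score is just work divided by elapsed time, so any error in the clock propagates directly into the result. A minimal, hypothetical sketch (not any particular benchmark's code):

```python
import time

def benchmark_score(units_of_work: int) -> float:
    """Run a fixed workload and report work per second."""
    start = time.perf_counter()      # high-resolution monotonic clock
    total = 0
    for i in range(units_of_work):
        total += i * i               # stand-in for real work
    elapsed = time.perf_counter() - start
    return units_of_work / elapsed   # score = work / time

score = benchmark_score(1_000_000)
print(f"{score:,.0f} operations per second")
```

If the clock the OS hands back runs fast or slow relative to real time, `elapsed` is wrong, and the score is scaled by exactly the same factor.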

With the advent of Windows 8, Intel and Microsoft changed the way that timers were used in the OS. Windows 8 had the mantra that it had to ‘support all devices’, all the way from high-cost systems down to embedded platforms. Most platforms use what is called an RTC, a ‘real time clock’, to maintain real-world time – this is typically a hardware circuit found in almost all devices that need to keep track of time. However, compared to previous versions of Windows, Microsoft changed the way it uses timers such that the OS was compatible with systems that did not have a hardware-based RTC, such as low-cost and embedded devices. The RTC was an extra cost that could be saved if the software was built to do without it.

Ultimately, any benchmark software in play has to probe the OS for the current time during the benchmark in order to give an accurate result at the end. However the concept of time, without an external verifying source, is arbitrary – without external synchronization, there is no guarantee that ‘one second’ on the system equals ‘one second’ in the real world. For the most part, all of us rely on the hardware and the OS reporting that this equality is true, and there are a lot of hardware and software engineers ensuring that it is.

However, back in 2013, it was discovered that it was fairly easy to 'distort time' on a Windows 8 machine. After loading into the operating system, any adjustment to the base frequency of the processor, which is usually 100 MHz, could cause the ‘system time’ to desynchronise from ‘real time’. This was a serious issue in the extreme overclocking community, where world records require the best system tuning: when comparing two systems at the same final frequency but with different base clock adjustments, up to a 7% difference in results was observed where there should have been less than 1%. This was down to how Windows was managing its timers, and was observed on most modern systems.
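The arithmetic behind the skew is straightforward: if the OS converts timer ticks to seconds assuming the nominal base clock, but the board actually runs it faster or slower, reported time diverges from real time by the same ratio. A sketch with arbitrary example numbers (the 103 MHz figure is illustrative, not a measured value):

```python
# The OS assumes the nominal base clock when converting ticks to seconds.
nominal_bclk = 100.0e6   # what the OS assumes (Hz)
actual_bclk  = 103.0e6   # what the hardware really runs at (Hz)

real_seconds = 60.0
ticks = real_seconds * actual_bclk        # ticks actually produced in a real minute
reported_seconds = ticks / nominal_bclk   # what the OS reports for that minute

skew = reported_seconds / real_seconds - 1
print(f"reported: {reported_seconds:.1f}s for a real 60.0s, skew: {skew:+.1%}")
```

A benchmark timed against that skewed clock inherits the same percentage error in its score, which is exactly the discrepancy the overclocking community ran into.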

For home users, most would suspect that this is not an issue, as most users tend not to adjust the base frequencies of their systems manually. For the most part that is true. However, as shown in some of our motherboard testing over the years, default BIOS settings can allow the base clock to drift observably around its specified value, something which can be exacerbated by thermal conditions. Having a system with observable clock drift, and subsequent timing drift, is not a good thing. It depends on the accuracy and quality of the motherboard components, as well as the state of the firmware. This issue has formally been classified as ‘RTC Bias’.

The extreme overclocking community, after analysing the issue, found a solution: forcing the use of the High Precision Event Timer, known as HPET, found in the chipset. Some of our readers will have heard of HPET before, however our analysis is more interesting than it first appears.

Why A PC Has Multiple Timers

Aside from the RTC, a modern system makes use of many timers. All modern x86 processors have a Time Stamp Counter (TSC), for example, which counts the number of cycles on a given core; it was seen back in the day as a high-resolution, low-overhead way to get CPU timing information. There is also QueryPerformanceCounter (QPC), a Windows API that builds on the processor's timing facilities to provide a reliable high-resolution counter; it was developed as multi-core systems arrived and the per-core TSC could no longer be trusted on its own. There is also a timer function provided by the Advanced Configuration and Power Interface (ACPI), which is typically used for power management (which covers turbo-related functionality). Legacy timing methods, such as the Programmable Interval Timer (PIT), are also still present on modern systems. Along with the High Precision Event Timer, depending on the system in play, these timers will run at different frequencies.
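Applications rarely pick between these timers directly; they go through an OS abstraction that sits on top of whichever source the kernel selected. Python's `time.perf_counter()`, for instance, is backed by QueryPerformanceCounter on Windows, which in turn may be backed by the TSC, HPET, or ACPI timer depending on the system. A minimal probe of what the abstraction reports:

```python
import time

# perf_counter is the highest-resolution monotonic clock the OS exposes;
# time.time() is wall-clock time, ultimately anchored to the RTC (and NTP).
info = time.get_clock_info("perf_counter")
print(f"perf_counter resolution: {info.resolution:.3e} s, monotonic: {info.monotonic}")

t0 = time.perf_counter()
t1 = time.perf_counter()
print(f"back-to-back reads differ by {t1 - t0:.3e} s")  # cost of one read
```

Which hardware timer actually backs the counter is invisible at this level, which is precisely why forcing a particular timer has to happen below the application, in firmware and OS configuration.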


[Screenshots of our timer probing tool on six test systems:]
Ryzen 7 2700X with HPET Off (ASUS ROG Crosshair VII Hero)
Core i7-8700K with HPET Off (ASRock Z370 Gaming i7)
Core i7-6950X with HPET On (ASUS X99-E-10G)
Core i7-6700K with HPET Off (GIGABYTE X170-Extreme ECC)
Core i5-5200U with HPET Off (GIGABYTE BRIX)
Core i7-3960X with HPET Off (EVGA X79 SLI)
These timers are used for different parts of the system as described above. Generally, the high performance timers are the ones used for work that is more time sensitive, such as video streaming and playback. HPET, for example, was previously known as the Multimedia Timer. HPET is also the preferred timer for a number of monitoring and overclocking tools, which becomes important in a bit.

With HPET required to run at a minimum of 10 MHz by its specification, any code that uses it is likely to stay closer in sync with real-world time (the ‘one second in the machine’ actually equals ‘one second in reality’) than code using any other timer.

In a standard Windows installation, the operating system has access to all the timers available. The software used above is a custom tool developed to show which of these timers a system exposes (the system can have more). For the most part, depending on the software instructions in play, the operating system will determine which timer is to be used – from a software perspective, it is fundamentally difficult to determine which timers will be available, so software is often timer agnostic. There is not much of a way to force an algorithm to use one timer or another without invoking specific hardware or instructions that rely on a given timer, although the timers can be probed in software, as with the tool above.

HPET is slightly different, in that it can be forced to be the only timer. This is a two-stage process:

The first stage is enabling it in the BIOS. Depending on the motherboard and the chipset, there may or may not be an option for this. The option is usually enable/disable, however this is not a simple on/off switch. When disabled, HPET is truly disabled. When enabled, however, this only means that HPET is added to the pool of potential timers that the OS can use.

The second stage is in the operating system. In order to force HPET as the only timer to be used by the OS, it has to be explicitly mentioned in the system Boot Configuration Data (BCD). In standard operation, HPET is not in the BCD, so it remains in the pool of timers for the OS to use. However, for software to guarantee that HPET is the only timer running, the software will typically request to make the change along with an accompanying system reboot to ensure the software works as planned. Ever wondered why some overclocking software requests a reboot *before* starting the overclock? One of the reasons is sometimes to force HPET to be enabled.
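The BCD change described above is made with the `bcdedit` utility from an elevated command prompt; the `useplatformclock` flag is what forces Windows onto the platform (HPET) clock, and a reboot is required for it to take effect:

```shell
:: Force Windows to use HPET as its timer (run as Administrator, then reboot)
bcdedit /set useplatformclock true

:: Revert to default behaviour, letting the OS pick from the timer pool
bcdedit /deletevalue useplatformclock
```

This is also how the setting can be checked and undone after testing, since leaving it forced can carry a performance cost, as discussed below.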

This leads to four potential configuration implementations:

  1. BIOS enabled, OS default: HPET is in list of potential timers
  2. BIOS enabled, OS forced: HPET is used in all situations
  3. BIOS disabled, OS default: HPET is not available
  4. BIOS disabled, OS forced: HPET is not available

Again, for extreme overclockers relying on benchmark results to be equal on Windows 8/10, HPET has to be forced to ensure benchmark consistency. Without it, the results are invalid.

The Effect of a High Performance Timer

With a high performance timer, the system is able to accurately determine clock speeds for monitoring software, or pace video stream processing so that audio and video hit in the right order. It can also come into play when gaming, especially when overclocking, ensuring data and frames are delivered in an orderly fashion; it has been shown to reduce stutter on overclocked systems. And perhaps most importantly, it avoids any timing issues caused by clock drift.

However, there are issues fundamental to the HPET design which mean that it is not always the best timer to use. HPET is a continually upward-counting timer that relies on register reads and comparator matches, rather than a ‘set at x and count down’ type of timer: software writes a target value to a comparator register, and an interrupt fires when the main counter reaches it. The speed of the timer can, at times, cause a comparison to fail, if the counter has already passed the compared value by the time the write to the register completes. Using HPET for very granular timing requires a lot of register reads/writes, adding to system load and power draw, and in a workload that requires explicit linearity it can actually introduce additional latency. One of the biggest benefits of disabling HPET on some systems, for example, is a reduction in DPC latency.
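The comparator race can be illustrated with a toy model (this is a simplified sketch of the failure mode, not the hardware itself): because the main counter only counts up and the comparator fires on a match, arming a deadline that is too close means the counter may already be past the target by the time the write lands.

```python
def arm_comparator(counter_now: int, write_latency_ticks: int, delay_ticks: int) -> bool:
    """Toy model of an up-counting timer with a match comparator.

    Returns True if the comparator is armed in time, i.e. the counter has
    not yet passed the target when the register write completes.
    """
    target = counter_now + delay_ticks
    counter_after_write = counter_now + write_latency_ticks
    # A match comparator only fires if the target is still ahead of the
    # counter once the write is visible; otherwise the match is missed.
    return counter_after_write <= target

print(arm_comparator(counter_now=1000, write_latency_ticks=5, delay_ticks=20))   # armed in time
print(arm_comparator(counter_now=1000, write_latency_ticks=30, delay_ticks=20))  # counter already past target
```

In the missed case, real hardware would not fire the interrupt until the counter wraps around, which is why drivers have to pad short deadlines and why very granular HPET use gets expensive.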

Comments

  • Spunjji - Thursday, April 26, 2018 - link

    Is there any other kind? Either you're at the budget end where everything is GPU limited or at the high-end where not spending a decent amount on a monitor to go with your £500 GPU is a crying shame.

    There's a niche where Intel has a clear win, and that's people running 240Hz 1080p rigs. For most folks with the money to spend, 2560x1440 (or an ultra-wide equivalent) @ 144hz is where it's at for the ideal compromise between picture quality, smoothness and cost. There are a lot of monitors hitting those specs right now.
  • eva02langley - Wednesday, April 25, 2018 - link

    I was mentioning in the review that 1080p benchmarks need to go... now it is even more true with HPET.

    Kudos on this guys, it is really interesting to read.
  • DanNeely - Wednesday, April 25, 2018 - link

    >93% of Steam gamers main display is at 1080p or lower.

If the new review suite split what GPUs were run at what resolutions, dropping 1080p from the high end card section might be reasonable. OTOH with 240hz 1080p screens a thing there's still an enthusiast market for 1080p combined with a flagship GPU.
  • IndianaKrom - Wednesday, April 25, 2018 - link

    * Raises hand, that's me, someone with a GTX 1080 and 240Hz 1920x1080 display.

    The industry seems obsessed with throwing higher and higher spatial resolution at gamers when what I really want is better temporal resolution.
  • eva02langley - Thursday, April 26, 2018 - link

    1080p @ 60Hz which is a non issue because we are talking about RX 580/1060 GTX or below. At that point the GPU is the bottleneck.

    It only affects 1080p @ 144Hz with a GTX 1080/Vega 64 minimum, which is really < 2%.

    You are really the exception, however the 1080p CPU bottleneck focuses on you entirely without even taking into consideration other use cases.
  • Holliday75 - Thursday, April 26, 2018 - link

    I am willing to bet that 95%+ of Steam users have no clue what we are talking about and don't care.
  • mapesdhs - Sunday, May 6, 2018 - link

    IndianaKrom, are you aware that using high(er) frequency monitors retrains your brain's vision system so that you become tuned to that higher refresh rate? New Scientist had an article about this recently; gamers who use high frequency monitors can't use normal monitors anymore, even if previously they would not have found 60Hz bothersome at all. In other words, you're chasing goalposts that will simply keep moving by virtue of using ever higher refresh rates. I mean blimey, 240Hz is higher than the typical "analogue" vision refresh of a bird. :D

    IMO these high frequency monitors are bad for gaming in general, because they're changing product review conclusion via authors accepting that huge fps numbers are normal (even though the audience that would care is minimal). Meanwhile, game devs are not going to create significantly more complex worlds if it risks new titles showing more typical frame rates in the 30s to 80s as authors would then refer to that as slow, perhaps criticise the 3D engine, moan that gamers with HF monitors will be disappointed, and I doubt GPU vendors would like it either. We're creating a marketing catch22 with all this, doubly so as VR imposes some similar pressures.

    I don't mind FPS fans wanting HF monitors in order to be on the cutting edge of competitiveness, but it shouldn't mean reviews become biased towards that particular market in the way they discuss the data (especially at 1080p), and it's bad if it's having a detrimental effect on new game development (I could be wrong about the latter btw, but I strongly suspect it's true from all I've read and heard).

    We need a sanity check with frame rates in GPU reviews: if a game is doing more than 80 or 90fps at 1080p, then the conclusion emphasis should be that said GPU is more than enough for most users at that resolution; if it's well over 100fps then it's overkill. Just look at the way 8700K 1080p results are described in recent reviews, much is made of differences between various CPUs when the frame rates are already enormous. Competitive FPS gamers with HF monitors might care, but for the vast majority of gamers the differences are meaningless.
  • Luckz - Monday, May 14, 2018 - link

    So the real question is if someone first exposed to 100/120/144 Hz immediately squirms in delight, or if they only vomit in disgust months later when they see a 60 Hz screen again. That should be the decider.
  • Spunjji - Thursday, April 26, 2018 - link

    1080p is popular in the Steam survey where, incidentally, so is low-end GPU and CPU hardware. Most of those displays are 60hz and an awful lot of them are in laptops. Pointing at the Steam surveys to indicate where high-end CPU reviews should focus their stats is misguided.

    I'm still not certain that testing CPUs in a way that artificially amplifies their differences in a non-CPU-reliant workload is really the way to go.
  • ElvenLemming - Wednesday, April 25, 2018 - link

    You can just ignore the 1080p benchmarks if you don't think they're meaningful. As DanNeely said, 93% of surveyed Steam users are 1080p or lower, so I'd be shocked if more than a handful of review sites get rid of it.
