CPU Tests: Microbenchmarks

Core-to-Core Latency

As the core counts of modern CPUs grow, we are reaching a point where the time to access one core from another is no longer constant. Even before the advent of heterogeneous SoC designs, processors built on large rings or meshes could have different latencies when accessing the nearest core compared to the furthest core. This is especially true in multi-socket server environments.

But modern CPUs, even desktop and consumer CPUs, can have variable latency when reaching another core. For example, first-generation Threadripper CPUs had two active eight-core dies on the package, and the core-to-core latency differed depending on whether the access was on-die or die-to-die. This gets more complex with products like Lakefield, which has two different communication buses depending on which core is talking to which.

If you are a regular reader of AnandTech’s CPU reviews, you will recognize our Core-to-Core latency test. It’s a great way to show exactly how groups of cores are laid out on the silicon. This is a custom in-house test built by Andrei, and while we know there are competing tests out there, we feel ours most accurately reflects how quickly an access between two cores can happen.
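
For those curious about the mechanics, below is a minimal sketch of how such a measurement can work. This is not our in-house tool: two threads are pinned to a pair of cores and bounce a value through a shared atomic variable, with half the average round-trip time approximating the one-way core-to-core latency. The core numbers, iteration count, and Linux-specific pinning call are illustrative assumptions.

    /* Minimal sketch of a core-to-core latency probe (Linux, pthreads).
     * Two threads pinned to different cores bounce a value through a
     * shared atomic "mailbox"; half the average round trip approximates
     * the one-way latency. Illustrative only: this is not AnandTech's
     * in-house tool, and the core numbers / ROUNDS are arbitrary. */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdatomic.h>
    #include <stdio.h>
    #include <time.h>

    #define ROUNDS 100000

    static _Atomic int mailbox = -1;

    static void pin_to_core(int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *responder(void *arg) {
        pin_to_core(*(int *)arg);
        for (int i = 0; i < ROUNDS; i++) {
            /* wait for this round's ping, then answer with a pong */
            while (atomic_load_explicit(&mailbox, memory_order_acquire) != 2 * i)
                ;
            atomic_store_explicit(&mailbox, 2 * i + 1, memory_order_release);
        }
        return NULL;
    }

    int main(void) {
        int core_a = 0, core_b = 1;   /* the pair under test */
        pthread_t t;
        pthread_create(&t, NULL, responder, &core_b);
        pin_to_core(core_a);

        struct timespec s, e;
        clock_gettime(CLOCK_MONOTONIC, &s);
        for (int i = 0; i < ROUNDS; i++) {
            /* send ping, spin until the pong comes back */
            atomic_store_explicit(&mailbox, 2 * i, memory_order_release);
            while (atomic_load_explicit(&mailbox, memory_order_acquire) != 2 * i + 1)
                ;
        }
        clock_gettime(CLOCK_MONOTONIC, &e);
        pthread_join(t, NULL);

        double ns = (e.tv_sec - s.tv_sec) * 1e9 + (e.tv_nsec - s.tv_nsec);
        printf("core %d <-> core %d: ~%.1f ns one-way\n",
               core_a, core_b, ns / ROUNDS / 2.0);
        return 0;
    }

Running this across every pair of cores yields the familiar latency matrix, where same-die or same-cluster pairs show up as visibly faster blocks.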

All three CPUs exhibit the same behaviour - one core seems to be given high priority, while the rest are not.

Frequency Ramping

Both AMD and Intel have, over the past few years, introduced features to their processors that speed up the transition from idle to a high-powered state. The effect of this is that users get peak performance sooner, but the biggest knock-on effect is on battery life in mobile devices, especially if a system can turbo up and turbo down quickly, ensuring that it stays in the lowest and most efficient power state for as long as possible.

Intel’s technology is called SpeedShift, although SpeedShift was not enabled until Skylake.

One of the issues with this technology, though, is that the adjustments in frequency can sometimes be so fast that software cannot detect them. If the frequency is changing on the order of microseconds, but your software only probes the frequency in milliseconds (or seconds), then quick changes will be missed. Not only that, but as an observer probing the frequency, you could be affecting the actual turbo performance. When the CPU changes frequency, it essentially has to pause all compute while it aligns the frequency of the whole core.

We wrote an extensive review analysis piece on this, called ‘Reaching for Turbo: Aligning Perception with AMD’s Frequency Metrics’, due to an issue where users were not observing the peak turbo speeds for AMD’s processors.

We got around the issue by making the frequency probe itself the workload that causes the turbo. The software is able to detect frequency adjustments on a microsecond scale, so we can see how quickly a system reaches its boost frequencies. Our Frequency Ramp tool has already been used in a number of reviews.
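
As a rough illustration of the idea (again, not the actual tool), the sketch below makes the probe itself the workload: after an idle pause, it repeatedly times a fixed chain of dependent arithmetic. Each chunk finishes faster as the core clocks up, so the per-chunk time traces the frequency ramp at microsecond-class granularity. The chunk size, sample count, and idle pause are assumed values for illustration.

    /* Minimal sketch of a frequency-ramp probe (Linux). The probe itself
     * is the workload: after an idle pause it repeatedly times a fixed
     * chain of dependent multiply-adds, so each chunk's completion time
     * shrinks as the core ramps from idle to boost clocks. Illustrative
     * only: CHUNK, SAMPLES and the idle pause are assumed values. */
    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    #define CHUNK   20000   /* dependent multiply-adds per sample */
    #define SAMPLES 2000    /* enough samples to span the ramp */

    static double now_us(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1e6 + ts.tv_nsec / 1e3;
    }

    int main(void) {
        static double t[SAMPLES + 1];
        volatile unsigned long sink = 0;

        usleep(100 * 1000);   /* let the core settle back to idle clocks */

        t[0] = now_us();
        for (int s = 0; s < SAMPLES; s++) {
            unsigned long x = (unsigned long)s + 1;
            for (int i = 0; i < CHUNK; i++)
                x = x * 3 + 1;             /* serially dependent work */
            sink += x;                     /* keep the loop from being elided */
            t[s + 1] = now_us();
        }

        /* Per-chunk time falls as the core clocks up; print the trace. */
        for (int s = 0; s < SAMPLES; s += 100)
            printf("t=%9.1f us  chunk=%7.2f us\n",
                   t[s + 1] - t[0], t[s + 1] - t[s]);

        return (int)(sink & 1);   /* use sink so the work is observable */
    }

Because each sample takes only tens of microseconds, the trace resolves a ramp that completes within a few milliseconds, which a once-per-second frequency poll would miss entirely.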

From an idle frequency of 800 MHz, it takes ~16 ms for both the i9 and the i5 to boost to their top frequency. The i7 was most of the way there at that point, but took an additional 10 ms or so.

Comments

  • Qasar - Friday, April 2, 2021 - link

    no but i am sure you are as YOU are the one that keeps moving the goal posts, not me. YOU said PC gaming was an expensive hobby, so i suggested a console, so you can save money vs a comp, as this seems to be your WHOLE POINT, to save money.

    " I also never said use the iGPU to game, because gaming on a iGPU, basic dGPU, or APU will be a crappy experience on modern titles. " no but you INSINUATED that you did, so who is the idiot ? and to go buy a 11600k and NOT use it for gaming, as YOU IMPLIED, (cause if you are not going to game with and need the cores, the 5600X is clearly the better choice, as its multi threaded performance, is above the 11600k) is well, whats that word you keep crying about, oh yea, E waste. if that is the case and you dont intend on gaming then getting a MUCH cheaper cpu, with your beloved igp, would be a better option.
    as i said in my other post, you are now resorting to name calling, which further shows you are wrong, and your whole point has been proved wrong by giving other options, so, run along little child, when you can talk without resorting to name calling, then come back
  • vanish1 - Monday, April 5, 2021 - link

    please stop, you keep being wrong.

    why would anyone buy a console if they intend to build a PC or PC game? Do you understand what saving money means? It means not spending it.

    Once again, I never said gaming on a PC, I said build a PC. You keep assuming incorrectly. As such, see original post.
  • 1_rick - Tuesday, March 30, 2021 - link

    Ridiculous. Bottom-tier dGPUs are $50-60, even on Newegg. Sure, they're worthless for gaming, but they'll be fine for office work and basic web browsing.
  • vanish1 - Wednesday, March 31, 2021 - link

    Okay so spend $60 on overpriced E-waste that you will have to eventually replace anyways when that money could have been put into a higher tier CPU, saved towards your actual GPU, or spent on other parts of the PC build.

    Who wants to spend $60 on a GPU just to make their CPU work? Its ridiculous.
  • Qasar - Wednesday, March 31, 2021 - link

    who says you have to throw it out ? you COULD keep it for emergencies, put it in another comp, or, um i dunno, sell/give it to a friend who could use a vid card for whatever reason.

    you say intel is the only option/best option, but you obviously havent considered anything else.
  • vanish1 - Wednesday, March 31, 2021 - link

    The fanboys that exist here crack me up. Constant complaining about the GPU market, overpriced and out of stock, yet willing to add fuel to that fire just to have an AMD CPU grace your presence; the hypocrisy is outstanding. I never said throw it out, it just ends up being E-waste at the end, but your mindset is the issue with the disposable culture we live in. Beyond that, I dont want to go through the hassle of buying and selling multiple cards, Ill buy one when its time, plug it into my system, and be done. Put it into another computer? So build another computer on top of the one youre already building, not a lot of sense there. Give it to a friend, why would you waste your friends time with a 710 gt? Sounds more like trying to pass the buck.
  • 29a - Wednesday, March 31, 2021 - link

    Did you really just call other people fanboys?
  • Qasar - Wednesday, March 31, 2021 - link

    thats what i thought, looks like there is a new intel fanboy on here :-) maybe he is upset cause rocket lake is well, pathetic ( going by GN's review )
  • vanish1 - Thursday, April 1, 2021 - link

    I mean when people like yourself and 29a cant comment on the point I'm making and instead try to dunk on me for calling out Intel shills when I see them, it clearly shows who is right (me) and who is wrong (both of you)
  • BushLin - Thursday, April 1, 2021 - link

    Your argument is to PC gaming enthusiasts that they should enjoy the performance they had in their gaming rig over a decade ago but on modern titles because there is a GPU shortage. If you truly cared about ewaste, why not just continue using your old rig rather than buy a dead end motherboard to have a worse experience?
