Power Saving and Power Consumption 

When it comes to power, Carrizo features two technologies worth discussing. The first is the use of low power states and the different frequency domains within the SoC. Previous designs had relatively few power planes, which left fewer opportunities for the SoC to power down areas not in use. Carrizo has ten power planes that can be controlled at run-time, allowing for what can be described as a dynamic race to sleep. This is bundled with access to the S0i3 power state, giving sub-50 mW SoC power draw when in sleep and wake-up times under a second.
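As a back-of-the-envelope sketch of why racing to sleep pays off, the average power of a chip that finishes its work quickly and drops into S0i3 can be modeled as a simple weighted sum. The active power figure below is an assumption for illustration; only the sub-50 mW sleep figure comes from AMD:

```python
# Race-to-sleep arithmetic: average power as a weighted sum of active and
# sleep states. ACTIVE_POWER_W is an assumed figure, not an AMD number.
ACTIVE_POWER_W = 4.0   # assumed SoC power while actively working
SLEEP_POWER_W = 0.05   # sub-50 mW S0i3 draw quoted by AMD

def average_power(active_fraction):
    """Average SoC power when the chip works for a fraction of the time,
    then drops into S0i3 for the remainder."""
    return active_fraction * ACTIVE_POWER_W + (1 - active_fraction) * SLEEP_POWER_W

# Finishing the same work faster shifts time from 4 W to 0.05 W:
print(average_power(0.10))  # 0.445 W
print(average_power(0.50))  # 2.025 W
```

The point of the extra power planes is to push that active fraction down: the more of the SoC that can be gated off, the closer the idle portions sit to the sleep figure.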

This is also combined with automated voltage/frequency sensors, of which each Excavator core has ten. These sensors take into consideration the instructions being processed, the temperature of the SoC, the quality of power delivery, and the voltage and frequency at that point, in order to relay information about how the system should adjust for the optimal power or performance point.
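A minimal sketch of how such sensor readings might feed a decision to push a core harder. This is an illustrative model, not AMD's actual algorithm; the structure, limits, and margins are all assumptions:

```python
# Illustrative sketch only (not AMD's algorithm): combine per-core sensor
# readings to decide whether a core has headroom at its current V/f point.
from dataclasses import dataclass

@dataclass
class SensorReading:
    temperature_c: float  # die temperature near the sensor (assumed unit)
    voltage_mv: float     # measured supply voltage at the sensor
    vmin_mv: float        # minimum voltage the current instruction mix needs

def has_headroom(readings, temp_limit_c=95.0, margin_mv=30.0):
    """True only if every sensor reports thermal headroom and voltage margin."""
    return all(r.temperature_c < temp_limit_c and
               r.voltage_mv - r.vmin_mv > margin_mv
               for r in readings)

core = [SensorReading(72.0, 1050.0, 1000.0), SensorReading(68.0, 1045.0, 1000.0)]
print(has_headroom(core))  # True: cool enough, and >30 mV above Vmin
```

The design choice worth noting is the `all()`: a single sensor reporting a hot spot or thin voltage margin is enough to veto a boost, which is why having ten sensors per core matters.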

AMD states that this gives it the ability to shift the frequency/power curve still further to the right on a per-module basis, providing another reduction in power or increase in frequency as required.

Next up for discussion is the voltage adaptive operation that was introduced back in Kaveri. I want to mention it here again because when it was first announced, I thought I understood it well enough to write about it. Having since come across another explanation of the feature by David Kanter, the reason for it finally clicked. I’m not going to steal his thunder, and I suggest you read his coverage for the full detail, but the concept is this:

When a processor does work, it draws power. The system has to be in a position to provide that power, and it acts to restabilize the supply while the processor is working. The work being done causes the voltage across the processor to dip, which we classically call voltage droop. As long as the droop does not take the system below the minimum voltage required for operation, all is well. This works if the supply of power is consistent, although that cannot always be guaranteed – the CPU manufacturer does not have control over the quality of the motherboard, the power supply, or the power conversion at hand. This causes ripples in the quality of the power, and the CPU has to be able to cope with them, because a ripple combined with a processor doing work could push the voltage below the threshold.
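A toy model makes the droop arithmetic concrete. Treating the power delivery path as a simple series resistance (every value below is an illustrative assumption; real power delivery networks have frequency-dependent impedance):

```python
# Toy droop model: V_cpu = V_supply - I_load * R_pdn - ripple.
# All numbers are assumptions for illustration only.
V_SUPPLY = 1.20   # volts at the regulator
R_PDN = 0.002     # ohms of effective supply-path resistance
V_MIN = 1.10      # minimum voltage the silicon needs at this frequency

def cpu_voltage(load_current_a, ripple_v=0.0):
    """Voltage actually seen at the processor under a given load and ripple."""
    return V_SUPPLY - load_current_a * R_PDN - ripple_v

# A 40 A burst alone leaves margin, but add 30 mV of supply ripple and
# the same burst dips below the minimum:
print(cpu_voltage(40.0))        # 1.12 V, above V_MIN
print(cpu_voltage(40.0, 0.03))  # 1.09 V, below V_MIN
```

This is exactly the combination the text describes: neither the load step nor the ripple alone is fatal, but together they cross the threshold.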

The easiest way to cope is to run the processor at a naturally higher voltage, so it can withstand a bigger droop. This doesn’t work well in mobile, as more voltage results in a bigger power draw and a worse experience. There are other potential solutions, which Kanter outlines in his piece.

AMD’s way of tackling the problem is to have the processor respond directly. When the voltage drops below a threshold value, the system reduces the frequency and the voltage of the processor by around 5%, causing the work being done to slow down and draw less power. At AMD’s Tech Day, we were told this happens as quickly as three cycles from detection, or in under a nanosecond. When the voltage droop is normalized (i.e. the power delivery is back at a more tolerable level), the frequency is cranked back up and work continues at the normal rate.
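The response can be sketched as a trivial control rule. The ~5% step comes from the article; the threshold value and the loop structure here are assumptions:

```python
# Minimal sketch of adaptive clocking: when the sensed voltage dips below a
# threshold, cut the clock ~5% until the droop normalizes. The 5% figure is
# from AMD; the threshold and this decision rule are illustrative assumptions.
NOMINAL_FREQ_GHZ = 3.5
DROOP_THRESHOLD_V = 1.10

def next_frequency(sensed_voltage_v):
    """Pick the clock for the next interval based on the sensed voltage."""
    if sensed_voltage_v < DROOP_THRESHOLD_V:
        return NOMINAL_FREQ_GHZ * 0.95  # back off ~5% within a few cycles
    return NOMINAL_FREQ_GHZ             # droop normalized: full speed again

print(next_frequency(1.08))  # 3.325 GHz during the droop event
print(next_frequency(1.12))  # 3.5 GHz once the supply recovers
```

In hardware this decision is made in roughly three cycles, far faster than any firmware loop could manage; the sketch only captures the logic, not the speed.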

Obviously the level of the threshold and the size of the frequency drop determine how much time is spent in this lower frequency state. We were told that with the settings used in Carrizo, the CPU hits this state less than 1% of the time, yet it accounts for a sizeable chunk of the overall average power reduction for a 3.5 GHz processor. This may sound odd, but it makes sense when you consider that the top 5% of the frequency range is more costly in terms of power than any other 5%. By removing that extreme power draw, for a minimal performance loss (5% frequency loss for under 1% of the time), it saves enough power to be worthwhile.
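To see why that top 5% is so expensive, assume dynamic power scales roughly as P ∝ f·V² and that reaching the last 5% of frequency requires a noticeably higher voltage. The V/f points below are made up for illustration, not Carrizo's actual curve:

```python
# Worked example of the quadratic voltage penalty: P ∝ f * V^2.
# The two V/f points are illustrative assumptions, not real Carrizo values.
def rel_power(f_ghz, v):
    """Relative dynamic power for a given frequency and voltage."""
    return f_ghz * v * v

p_full = rel_power(3.50, 1.250)   # top of the assumed V/f curve
p_back = rel_power(3.325, 1.150)  # 5% lower clock at an assumed lower voltage

print(round(1 - p_back / p_full, 3))  # 0.196: the last 5% of clock costs ~20% power
```

Because the voltage term is squared, shaving 5% of frequency (and the voltage that goes with it) buys a disproportionate power reduction, which is why spending under 1% of the time in the reduced state still moves the average meaningfully.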

  • albert89 - Monday, June 8, 2015 - link

    Tell me where you can buy one for that price. Because most I see are like this article described, double the price.
  • Prashant Jain - Wednesday, June 3, 2015 - link

    Switchable Graphics is a real deal in the Windows ecosystem; Linux lacks a ton of drivers and isn't expected to deliver more, so AMD has scope in enterprise servers where Linux is thriving. AMD will succeed in the long run if they somehow manage to have CPUs at par with Intel.
  • Penti - Wednesday, June 3, 2015 - link

    Switchable graphics from AMD with Intel CPUs is just so much worse than Optimus.
  • Wolfpup - Wednesday, June 3, 2015 - link

    And Optimus already doesn't work. I'm NEVER buying another PC with it, and am just thankful mine lets you disable it and run on the Nvidia GPU directly.
  • duploxxx - Thursday, June 4, 2015 - link

    well nice to hear that switchable amd graphics sucks, so does optimus. setting manual profiles is ok with optimus, i suppose the same with AMD, anything automatic is NOT.

    oh and every time i dock - undock my explorer, chrome, firefox will crash due to graphics issues... uber optimus.
  • barleyguy - Wednesday, June 3, 2015 - link

    Switchable graphics sucks, bad. My current work laptop is Intel/ATI, and I've also used Intel/NVidia. Neither one works as well as just a single video adapter. I've had issues with windows not refreshing, driver crashes, and just overall wonkiness.

    Luckily Dell lets you disable it completely in the BIOS. That tends to get rid of the issues.
  • RandUser - Thursday, June 4, 2015 - link

    It depends. On my ASUS laptop Optimus works perfectly, no issues.
  • Margalus - Wednesday, June 3, 2015 - link

    1366x768 is perfectly fine on 15.6 inch devices...
  • fokka - Wednesday, June 3, 2015 - link

    if you're bordering on blind, yes.
  • meacupla - Wednesday, June 3, 2015 - link

    Actually, the blind and hard of seeing, benefit greatly from higher DPI with much sharper images.

    I wouldn't be surprised if 1366x768 caused blindness in the first place, however.
