Power Consumption

With Vishera, AMD was in a difficult position: it had to drive performance up without blowing through its 125W TDP. The Piledriver cores were designed to do just that, and Vishera benefitted. Remember that Piledriver was predominantly built to take this architecture into mobile. I went through the details of what makes Piledriver different from its predecessor (Bulldozer), but as far as power consumption is concerned, AMD moved to a different type of flip-flop in Piledriver that increased complexity on the design/timing end but decreased active power considerably. In short, it meant more work for AMD but resulted in a more power efficient chip without moving to a dramatically different architecture or a new process node.

AMD used these power saving gains to bring Piledriver to mobile APUs, a place Bulldozer never went. We saw this with Trinity, which surprisingly enough managed to outperform the previous generation Llano APUs while improving battery life. On the desktop, however, AMD used Piledriver's power savings to drive clock speeds, and thus performance, up without increasing power consumption. Since peak power didn't go up, overall power efficiency actually improved with Vishera over Zambezi. The chart below plots total system power consumption while running both passes of the x264 HD (5.0.1) benchmark:

In the first pass Vishera actually draws a little less power, but once we get to the heavier second encode pass the two curves are mostly indistinguishable (Vishera still drops below Zambezi regularly). Vishera uses its extra frequency and IPC tweaks to complete the task sooner and drop down to idle power levels, saving energy overall. The picture doesn't look as good, though, if we toss Ivy Bridge into the mix. AMD targets Intel's 77W Core i5 3570K as the FX-8350's natural competitor. The 8350 is priced lower and actually outperforms the 3570K in this test, but it draws significantly more power:
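
To make the race-to-idle point concrete, here's a minimal sketch of the energy math: total energy is power integrated over time, so a chip that draws similar power under load can still use less energy by finishing sooner and dropping to idle. The wattages and durations are hypothetical placeholders, not our measured data.

```python
# Minimal sketch: total energy is the integral of power over time, so a
# chip that draws similar (or even slightly higher) power under load can
# still use less energy if it finishes sooner and drops to idle.
# All wattages and durations below are hypothetical, not measured data.

def energy_wh(power_samples_w, interval_s=1.0):
    """Integrate a power trace (watts, one sample per interval) into Wh."""
    return sum(power_samples_w) * interval_s / 3600.0

# Hypothetical traces over the same 300 second window: the faster chip
# finishes its encode in 280s and idles for the remaining 20s.
faster_chip = [195.0] * 280 + [95.0] * 20
slower_chip = [190.0] * 300

print(f"faster: {energy_wh(faster_chip):.2f} Wh")  # ~15.69 Wh
print(f"slower: {energy_wh(slower_chip):.2f} Wh")  # ~15.83 Wh
```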

The platforms aren't entirely comparable, but Intel maintains a huge power advantage over AMD. With the move to 22nm, Intel dropped power consumption below that of the already more power efficient 32nm Sandy Bridge. While Intel drove power consumption lower, AMD kept it constant and drove performance higher. Even if we look at the FX-8320 and toss Sandy Bridge into the mix, the situation doesn't change dramatically:

Sandy Bridge obviously consumes more than Ivy Bridge, but the gap between Vishera and either of the two Intel platforms is significant. As I mentioned earlier, this particular test does complete quicker on Vishera, but the task would have to be much longer in order to really give AMD the overall efficiency advantage.
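
A quick back-of-the-envelope sketch shows how steep that hill is: ignoring idle draw, for total task energy to break even, AMD's runtime advantage would have to match Intel's power advantage. The wattages below are placeholders, not our measured figures.

```python
# Back-of-the-envelope break-even: ignoring idle draw, task energy is
# load power times runtime, so for equal energy we need
#     p_amd * t_amd == p_intel * t_intel
# i.e. the runtime ratio must equal the inverse of the power ratio.
# Wattages below are hypothetical placeholders, not measured figures.

p_amd_w, p_intel_w = 195.0, 130.0      # whole-system power under load
breakeven = p_intel_w / p_amd_w        # required t_amd / t_intel
print(f"AMD must finish in {breakeven:.0%} of Intel's time to break even")
# ~67%: a single-digit-percent performance lead doesn't come close,
# which is why the efficiency win over Zambezi doesn't carry over here.
```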

If we look at average power over the course of the two x264 encode passes, the results back up what we've seen above:

Power Consumption - Load (x264 HD 5.0.1)

As more client PCs move towards smaller form factors, power consumption may become just as important as the single threaded performance gap. For those building in large cases this shouldn't be a problem, but for small form factor systems you'll want to go Ivy Bridge.

Note that idle power consumption can be competitive, but it will obviously vary depending on the motherboard used (the Crosshair V Formula is hardly the lowest power AM3+ board available):

Power Consumption - Idle

Comments

  • CeriseCogburn - Tuesday, October 30, 2012 - link

    more speculation from mr gnu
    This of course caps it all off - the utter amd fanboy blazing in our faces, once again the FANTASY FUTURE is the big amd win :

    " If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market. "

    LOL - why, perhaps you should be consulting for them, or be their next COO or CEO?

    I'm telling you man, that is why, that is why.
  • Siana - Thursday, October 25, 2012 - link

    It looks like the extra 10W in the idle test could be largely or solely due to the motherboard. There is no clear evidence to what extent, or whether at all, the new AMD chip draws more power than Intel at idle.

    A high-end CPU at low utilization (mostly idle time) is in fact a very useful and common case. For example, as a software developer, I spend most of my time reading and writing code (idle) or testing the software (utilization: 15-30% CPU, effectively two cores tops). In between, however, the software needs to be compiled, and that is unproductive time I'd like to keep as short as possible, so I am inclined to choose a high-end CPU. For the GCC compiler on Linux, the new AMD platform beats any i5 and a Sandy Bridge i7, but is a bit behind an Ivy Bridge i7.

    The same goes for, say, someone who does video editing; they will have a lot of low-utilization time too, simply because there's no batch job for their system to run most of the time. The CPU isn't going to be the limiting factor while editing, but when a batch job does run (usually an H.264 export), they may also see an advantage from AMD.

    In fact, across every task I can think of (3D production, image editing, sound and music production, etc.), I just cannot name one with an average CPU utilization of more than 50%, so I think your figure of an 80Wh/day disadvantage for AMD is pretty much unobtainable.
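
    To put rough numbers on that (the wattages below are made-up placeholders, not measurements from the review), the daily energy delta is just the load-power difference times the hours the CPU actually spends loaded:

    ```python
    # Sketch of the Wh/day arithmetic: the extra energy per day depends on
    # the load-power delta and how many hours the CPU actually spends loaded.
    # All wattages are hypothetical placeholders, not measured figures.

    def extra_wh_per_day(delta_load_w, delta_idle_w, loaded_hours, total_hours=24.0):
        idle_hours = total_hours - loaded_hours
        return delta_load_w * loaded_hours + delta_idle_w * idle_hours

    # e.g. 65W extra under load, ~0W extra at idle, mostly-idle workload:
    print(extra_wh_per_day(65.0, 0.0, 1.0))  # 65.0 Wh/day at 1 loaded hour
    print(extra_wh_per_day(65.0, 0.0, 0.5))  # 32.5 Wh/day at 30 loaded minutes
    ```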

    And oh, no one in their right mind runs an internet-facing server on their desktop computer, for a variety of good reasons. So while Linux is easy to use as a server even at home, it ends up a limited-scope, local server, and again, the utilization will be very low. However, you are much less likely to be bothered by the services you're providing, thanks to the sheer number of integer cores. In case you're wondering: to saturate a well-managed Linux server built from up-to-date desktop components, no connection you can get at home will be sufficient, so it makes sense to co-locate your server at a datacenter or rent theirs. Datacenters go to great lengths not to be connected through a single point (which in your case is your ISP), but to have low-latency connections to many Internet nodes, so that the servers can be used efficiently.

    As for people who don't need a high-end system, AMD offers a better on-die graphics accelerator, and at the lower end the power consumption difference isn't going to be big in absolute terms.

    And oh, "downloading files" doesn't count as "complex stuff", it's a very low CPU utilization task, though i don't think this changes much apropos the argument.

    And I don't follow the claim that you need a $125 motherboard for AMD; $60 boards work quite well. You generally get away with cheaper boards for AMD than for Intel, even taking into account the somewhat higher power-handling capacity the board needs.

    The power/thermal advantage of Intel of course extends to cooling noise, and it makes sense to pay extra to keep computer noise down. However, the CPU is rarely the culprit any longer: the GPU of a high-end computer is noise-maker number one, hard disk vibrations number two, and the CPU and its thermal contribution come in only to a small extent.

    Hardly any of the above makes Piledriver the absolute first-choice CPU, but it's still not a bad choice.

    Finally, the desktop market isn't so important; the margins are terrible. The most important market for AMD right now is servers. The big power consumption disadvantage vs. Intel obviously exists there too, and power generally matters in the server market, but with virtualization AMD can avoid a sharp performance drop-off and deploy up to about a third more VMs per CPU package thanks to the higher number of integer cores, which can offset the higher power consumption per package per unit of performance. I think they're onto something there: they have a technology, used on mobile chips now, which allows them to sacrifice top frequency but reduce surface area and power consumption. If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market.
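
    To put toy numbers on the density argument (core counts, VM densities, and wattages below are all made up, not vendor specs):

    ```python
    # Toy model of VM density per watt: more integer cores per package can
    # offset worse performance-per-watt when you provision by VM count.
    # Core counts, VM densities, and wattages are all made-up examples.

    def vms_per_watt(cores, vms_per_core, package_watts):
        return cores * vms_per_core / package_watts

    amd_pkg   = vms_per_watt(cores=16, vms_per_core=1.0, package_watts=140)
    intel_pkg = vms_per_watt(cores=8,  vms_per_core=1.3, package_watts=95)
    print(f"AMD:   {amd_pkg:.3f} VMs per watt")    # ~0.114
    print(f"Intel: {intel_pkg:.3f} VMs per watt")  # ~0.109
    ```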
  • Kjella - Tuesday, October 23, 2012 - link

    Except the "Any CPU is fine" market isn't about $200 processors, Intel or AMD. That market is now south of $50 and making them pennies with Celerons and Atoms competing with AMD A4 series. You're not spending this kind of money on a CPU unless performance matters. Funny that you're dissing the overclockability of the IVB while pushing a processor that burns 200W when overclocked, you honestly want THAT in your rig instead.

    Honestly, while this at least puts them back in the ring, it can't be that great for AMD's finances. They still have the same die size and get to raise the price of their top-end processor from $183 to $199, yay. But I guess they have to do something to try to bring non-APU sales back up; Bulldozer cannot have sold well at all. And I still fear Haswell will knock AMD out of the ring again...
  • Jaybus - Tuesday, October 23, 2012 - link

    I agree. I would think they may do better with the 16-core socket G34 Opterons with four RAM channels, particularly if they can get down to 95W at 2.5GHz. A 2-socket board gives 32 cores with lots of RAM per 2U server chassis. That should work nicely for high-availability virtualized clusters. In this environment it is better to have more cores in the same power envelope than faster per-core performance, because the virtual machines are independent of one another. I think Piledriver can compete much better in this environment than in the non-APU desktop/workstation market.
  • Sufo - Tuesday, October 23, 2012 - link

    "If all you do is benchmark all day long and you have money to burn, blow it on an Intel CPU"

    Uh, I'd happily take one to play games on my "Windoze" machine.

    Idiot.
  • cfaalm - Tuesday, October 23, 2012 - link

    The thing is that people want balanced performance, balanced between single-threaded and multithreaded, that is. Piledriver does a lot better than Bulldozer here, but I think Intel still offers the better balance. As much as I would like to build a new AMD system, I think it will be Intel this time around.
  • lmcd - Tuesday, October 23, 2012 - link

    What class of gaming are you looking at? If you're looking at even midrange gaming, your best bet is an A10 plus a 6670 (runs $60-$70 on average, $90 for low profile). Really a great gaming value option.
  • just4U - Tuesday, October 23, 2012 - link

    lmcd,

    I just did that for our secondary machine and put in a 6850. Works quite well... aside from BIOS issues on a brand new board chipset, that is. Considering prices on the 7750/70, I'd probably opt for one of those at $30 more than any of the 6x series. Given the opportunity, I'd also probably have picked up one of these new CPUs over an A10.
  • CeriseCogburn - Tuesday, October 30, 2012 - link

    LMAO at fanboy system frikk failure.... hahahahha "aside from bios issues" and uhh.. the "crashing" .. and uhh, I'd buy not the 6850, but 7770, and uh... not the A10 but one of these...

    LOL - there is the life of the amd fanboy
  • 'nar - Tuesday, October 23, 2012 - link

    I frequently have two or three high-CPU apps running at a time, so would AMD be better in this case, even though each app individually runs better on a Core i5?

    I shoot for a do-it-all system. I run a video encode, get bored, and start a game. I run malware scans on external drives and back up other drives into compressed images. Perhaps you could run H.264 encodes alongside another benchmark, like Skyrim or the browser bench?

    Oh, typo on page 6, I think "gian" where you meant gain.
