Power Consumption

With Vishera, AMD was in a difficult position: it had to drive performance up without blowing through its 125W TDP. The Piledriver cores were designed to do just that, so Vishera benefitted. Remember that Piledriver was predominantly built to take this new architecture into mobile. I went through the details of what makes Piledriver different from its predecessor (Bulldozer), but as far as power consumption is concerned, AMD moved to a different type of flip-flop in Piledriver that increased complexity on the design/timing end but decreased active power considerably. In short, it made more work for AMD but resulted in a more power efficient chip without moving to a dramatically different architecture or a new process node.

In mobile, AMD used these power savings to bring Piledriver to mobile APUs, a place Bulldozer never went. We saw this with Trinity, which managed to outperform the previous-generation Llano APUs while improving battery life. On the desktop however, AMD used the power savings offered by Piledriver to drive clock speeds up, increasing performance without increasing power consumption. Since peak power didn't go up while performance did, overall power efficiency actually improves with Vishera over Zambezi. The chart below illustrates total system power consumption while running both passes of the x264 HD (5.0.1) benchmark:

In the first pass Vishera actually draws a little less power, but once we get to the heavier second encode pass the two curves are mostly indistinguishable (Vishera still dips below Zambezi regularly). Vishera uses its extra frequency and IPC tweaks to complete the task sooner and drop back down to idle power levels, thus saving energy overall. The picture doesn't look as good if we toss Ivy Bridge into the mix. Intel's 77W Core i5 3570K is targeted by AMD as the FX-8350's natural competitor. The 8350 is priced lower and actually outperforms the 3570K in this test, but it draws significantly more power:
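The race-to-idle tradeoff described above can be sketched with a toy energy calculation. The wattages and durations below are hypothetical illustrations, not the measured figures from these charts:

```python
def total_energy_joules(load_w, load_s, idle_w, window_s):
    """Energy over a fixed observation window: a load phase followed by idle."""
    return load_w * load_s + idle_w * (window_s - load_s)

# Two chips drawing similar peak power; the faster one finishes sooner and
# spends the rest of the window at idle, so it consumes less total energy.
faster = total_energy_joules(load_w=195, load_s=100, idle_w=70, window_s=120)
slower = total_energy_joules(load_w=195, load_s=115, idle_w=70, window_s=120)
print(faster < slower)  # True
```

The same logic explains why peak power alone doesn't determine efficiency: what matters is the integral of power over the whole task, idle tail included.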

The platforms aren't entirely comparable, but Intel maintains a huge power advantage over AMD. With the move to 22nm, Intel dropped power consumption over an already more power efficient Sandy Bridge CPU at 32nm. While Intel drove power consumption lower, AMD kept it constant and drove performance higher. Even if we look at the FX-8320 and toss Sandy Bridge into the mix, the situation doesn't change dramatically:

Sandy Bridge obviously consumes more than Ivy Bridge, but the gap between Vishera and either of the two Intel platforms is significant. As I mentioned earlier, this particular test runs quicker on Vishera, but the test would have to be much longer for that head start to really give AMD the overall efficiency advantage.

If we look at average power over the course of the two x264 encode passes, the results back up what we've seen above:

Power Consumption - Load (x264 HD 5.0.1)
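For readers curious how the load average above is typically derived from a logged power trace, here is a minimal sketch; the sampling interval and readings are hypothetical, not the values behind this chart:

```python
def average_power(samples_watts):
    """Mean of evenly spaced instantaneous wall-power readings (watts)."""
    return sum(samples_watts) / len(samples_watts)

def energy_joules(samples_watts, interval_s):
    """Total energy: average power times total elapsed time."""
    return average_power(samples_watts) * interval_s * len(samples_watts)

trace = [185, 192, 190, 188, 185]  # hypothetical readings, 1 s apart
print(average_power(trace))  # 188.0
```

Average power makes platforms with different run times comparable, while total energy captures who actually finished the encode more cheaply.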

As more client PCs move towards smaller form factors, power consumption may become just as important as the single threaded performance gap. For those building in large cases this shouldn't be a problem, but for small form factor systems you'll want to go with Ivy Bridge.

Note that idle power consumption can be competitive, but will obviously vary depending on the motherboard used (the Crosshair Formula V is hardly the lowest power AM3+ board available):

Power Consumption - Idle

Comments

  • angusfox - Monday, December 31, 2012 - link

    The Intel fanboys simply do not understand market dynamics. If it were not for AMD, Intel processors would be three or four times as expensive. In addition, competition drives innovation. Intel CPUs were mediocre until AMD created its first Athlon 64-bit processor that kicked Intel's P4 butt in every measure of performance. Intel fanboys, you simply don't give credit where it is due. I am the first to admit that Intel's Sandy Bridge CPUs are, for the most part, better designed than competing CPUs from AMD. However, I feel obligated to do my part to keep AMD alive. It benefits the users of both AMD and Intel CPUs and it keeps prices down. Only an idiot would hope for Intel to drive AMD out of business. This is not football, basketball or NASCAR, you dimwits!
  • GustoGuy - Thursday, January 17, 2013 - link

    Exactly. I have been building AMD systems since my Athlon XP 2500+ OC'd to 3200+ back in 2004. My last Intel was a fast Celeron back in 2003. If AMD were to go out of business, Intel would get complacent and jack the prices up. Competition is good for driving innovation. Imagine if Intel had a total monopoly and AMD never existed; I doubt that we would have anything near as good as the processors we have right now. Plus AMD has always offered good performance for the dollar spent. I never buy bleeding edge technology because it costs twice as much for just a small performance advantage. I like to buy AMD and then use the money saved to put a better GPU into the system, since an FX-8350 is not going to bottleneck any modern GPU, and spend the bucks where it counts. I find all the talk about fanbois hilarious. I like to build powerful gaming systems on a budget, and when I am done with them they are sold on eBay and I build another. I wish AMD well, for if they were to fail then Intel would shoot their processor prices up into the stratosphere again. I felt Intel was surprised by AMD back in 2004 when they came out with the 939 dual core chips that pummeled the fastest P4 systems. In fact I had an ASRock Dual SATA 939 with a dual core 939 3800+ that my son was using for his gaming rig until this Christmas, when I built him a new AM3+ system with an ASRock 990FX Extreme4 motherboard. Competition is good and I want to see it continue, so that's why I continue to buy AMD.
  • iceman34572 - Wednesday, January 2, 2013 - link

    I thought I came on this site to read in depth reviews, not to see a bunch of fanboys fighting each other for who has the better processor, using grade school humor to do so. I get sick of it. People, just buy what YOU want; I could not care less what you spend your money on. If you are happy with it, then that is all that matters.
  • CeriseCogburn - Saturday, February 2, 2013 - link


    If you came to this site to read the in-depth review then do so, you idiot.

    Oh wait, instead of doing what you demand everyone else do, you go as low as possible while holding yourself up to the angelic light and claim everyone needs to stop fanboying... well guess what, moron: if you came to read the article, read it, THEN LEAVE, or stop whining in the comments too, being a fanboy of sorts yourself, the one who doesn't FREAKING REALIZE the article itself is a BIG FAT FIGHT BETWEEN INTEL AND AMD, YOU FREAKING IDIOT.

    Have a nice day being a stupid lying fake angel.
  • Ukdude21 - Thursday, August 15, 2013 - link

    Don't listen to him, he is just a silly bitch lol.
  • nissangtr786 - Thursday, January 17, 2013 - link

    Over 2x slower performance per watt than Intel, and 3x slower performance per watt when both are overclocked. It's crazy how far ahead Intel is, and the most worrying thing is that Haswell is where Intel will be going all out to bring better performance per watt and also beat AMD's trump card, integrated graphics; Haswell's will be amazing.

    Yes, AMD costs less for most of their CPUs, but you get what you pay for. AMD is releasing CPUs that Intel had 3-4 years ago, and those had better performance per watt. Also you are saving over 2x the electricity and completing tasks much faster on Intel CPUs.

    Put it this way: if AMD gets by 2016 or 2017 to where Intel will be with Haswell in 2013, they will have pulled off a miracle; they are that far behind. I reckon AMD is so far behind now that they will just target the lower end market with their APUs for gaming.
  • lordxyx - Tuesday, April 2, 2013 - link

    I just bought an FX-8350 with a new mainboard and a 7980 Radeon. I think for an 8 core chip at 4GHz it's a good price. Hope it outperforms my old Core 2 Q6600 @ 3GHz anyhow, or I'll feel like a complete sucker!
  • ToastedJellyBowl - Friday, April 5, 2013 - link

    A lot of people posting comments in this article are nothing but tools. For those of you who can't see beyond benchmarks and who think a slight advantage in a benchmark = blowing the competition out of the water, let me give you a lesson.

    There's a difference between a 30% gap at, say, 30 FPS and a 30% gap at, say, 110 FPS. When both chips are performing above 60 FPS, neither is blowing the other away. At that point, it's simply a stalemate. It's just a shame that Intel fanboys are too arrogant and also too ignorant to admit this. They're so fixated on "OMFG, my chip gets an extra 11.71 FPS on a benchmark than your chip does".

    Everything AMD has out there this side of a Phenom II X4 (or hell, even a low end AMD FX-4100) will run anything on the market maxed out at a solid 60+ FPS, given you are supporting it with a video card that doesn't hold it back. With that said, most people play with v-sync enabled anyway due to massive screen tearing in most games. What does it matter that a Core i7 is pulling 147 FPS and an AMD FX-8350 is pulling 107 FPS when your frame rate is just going to be locked to 60 FPS anyway?

    I know a lot of people like to be future proofed, and the more overhead you have over 60 FPS the more future proof your system is, but being future proof by an extra year != blowing the competition out of the water. Gaming requirements have pretty much hit a brick wall. System requirements have not really gone up much at all in the last 2-3 years. With a Phenom II X4 965 and a GeForce 650 Ti my system runs anything I throw at it at a solid 55-60 FPS on Ultra settings. If I threw a 650 Ti Boost, or even better a 660 Ti or a 680, in my system, everything would run even better. My CPU still never really gets maxed out in most games.

    These days, where the difference lies is how fast the CPU can encode and how fast the CPU can do other things that are not gaming related. That's where Intel is focusing right now, but as far as gaming goes, we've hit a brick wall, and have been behind that brick wall for several years now.

    With that being said, I'm very proud of my AMD Phenom II X4 965 coupled with my GeForce 650 Ti. In many games I play with my friend, this hardware compares well to his Core i7-920 overclocked to over 4.0GHz running GeForce GTX 470s in SLI. In some games, I was slightly below his performance. In other games I was equal, and in a few games my system actually outperformed his. He has since replaced his GeForce GTX 470s with a single GeForce GTX 680, and even against that card, my system does very well in comparison. In DiRT Showdown, we were both over the 50 FPS average mark. I was at about 57, he was at about 70, on average. Now, that may sound like a lot, right?

    Well, then you factor in the pricing. My motherboard, processor and RAM were less than $250. His motherboard alone was more expensive than everything I paid combined. Couple that with another $250 for the CPU, and that's $500. That's double what I paid for everything, just for the motherboard and processor, outside of a PSU, case, monitor, etc. which I already had. The performance difference, however, definitely isn't double.

    I mean, you can either go pay $600+ to build a system (motherboard, CPU, RAM since most people reuse other parts such as optical drive, sound card, network card, hard drive, PSU, etc for many years), or you can pay $250 to build a system that will get slightly less performance on benchmarks, but still be future proof.

    It's your call. I don't know about other people, but I like knowing I'm getting the best bang for the buck, and while Intel definitely may offer slightly better performance in benchmarks, AMD definitely offers the best bang for the buck. How can you turn your nose up at a 4 module, 8 thread CPU for $185 when it costs over $300 to get a decent Intel chip? They're both future proof and will run anything at over 60 FPS for years to come, so why blow the extra $100 on the CPU and an extra $100 on a motherboard? Oh, and good luck finding an Intel motherboard that compares to the AM3+ ASUS M5A97 with a UEFI BIOS for under $200.
  • jmcb - Monday, April 22, 2013 - link

    Most people in the general public will be like me. I don't OC; I tried it but never got into it. I don't even game on my PC, and for what I use my PC for, stock vs stock, Intel is where it's at. Sorry. I do lots of video encoding.

    The general public sees this article and they will probably think the same thing. I also look at power consumption. Again, Intel is where it's at. I had my sights set on an i7 3770K for over a year. I can probably wait 2 more years, and it might still be a better buy vs AMD.
  • IntelBias - Sunday, June 2, 2013 - link

    I never noticed this until just now. I always heard about AnandTech's Intel bias but never noticed it until this article. They purposely set the resolutions lower, knowing that Piledriver fares much better against SB/IB at 1080p and 1440p.
