Power Consumption

Power consumption, like performance, isn't surprising here. I noticed I couldn't get Cool'n'Quiet to properly underclock the Phenom II X4 980 BE at idle, resulting in a constant 3.7GHz operating frequency and thus higher-than-expected idle power numbers. I updated our test platform to the latest public BIOS, but I suspect this is something that'll be addressed in a future update.
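If you want to sanity check whether an idle downclock is actually engaging, polling the reported per-core clocks for a minute at the desktop is usually enough. The review testbed is a Windows platform, so treat the following as a rough sketch only; it assumes a Linux host exposing the standard cpufreq sysfs files.

```python
# Rough sketch, assuming a Linux host with the standard cpufreq sysfs
# interface (the review testbed itself is not this setup): poll the
# reported per-core clocks at idle. A value stuck around 3700 MHz would
# mean the idle downclock (Cool'n'Quiet) isn't engaging.
import glob
import time

def current_clocks_mhz():
    clocks = []
    paths = sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))
    for path in paths:
        with open(path) as f:
            clocks.append(int(f.read().strip()) // 1000)  # kHz -> MHz
    return clocks

if __name__ == "__main__":
    for _ in range(5):
        print(current_clocks_mhz())
        time.sleep(2)
```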

[Chart: Idle Power Consumption]

[Chart: Load Power Consumption]

I also measured power at the ATX12V connector to give you an idea of what actual CPU power consumption is like (excluding the motherboard, PSU losses, etc.):

Processor                           Idle    Load (Cinebench R11.5)
Intel Core i7 2600K @ 4.4GHz        5W      111W
Intel Core i7 2600K (3.4GHz)        5W      86W
AMD Phenom II X4 975 BE (3.6GHz)    14W     96W
AMD Phenom II X4 980 BE (3.7GHz)    35W     104W
AMD Phenom II X6 1100T (3.3GHz)     20W     109W
Intel Core i5 661 (3.33GHz)         4W      33W
Intel Core i7 880 (3.06GHz)         3W      106W
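For a rough sense of how much of that draw is dynamic, the idle-to-load swing can be pulled straight from the numbers above. The sketch below simply recomputes those deltas from the values as printed; note that the 980 BE's inflated 35W idle figure (the Cool'n'Quiet issue mentioned earlier) is why its swing looks smaller than the 975 BE's.

```python
# Sketch using only the figures from the ATX12V table above: the
# load-minus-idle swing gives a rough feel for how much of the draw is
# dynamic (switching) power rather than idle power.
atx12v_watts = {
    # processor: (idle W, load W under Cinebench R11.5)
    "Intel Core i7 2600K @ 4.4GHz": (5, 111),
    "Intel Core i7 2600K (3.4GHz)": (5, 86),
    "AMD Phenom II X4 975 BE (3.6GHz)": (14, 96),
    "AMD Phenom II X4 980 BE (3.7GHz)": (35, 104),
    "AMD Phenom II X6 1100T (3.3GHz)": (20, 109),
    "Intel Core i5 661 (3.33GHz)": (4, 33),
    "Intel Core i7 880 (3.06GHz)": (3, 106),
}

for cpu, (idle, load) in atx12v_watts.items():
    print(f"{cpu}: {load - idle} W idle-to-load swing")
```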
Comments

  • silverblue - Wednesday, May 4, 2011 - link

    Phenom II came to market after Nehalem. Yes, it wasn't exactly meant to compete with it, and was more about sorting out what Phenom did wrong, but the unfortunate truth is that the i7 beat it to market.

    529th - I think the issue would be that you'd be disabling a feature and thus not showing accurate performance; however, that in itself would show the difference Turbo makes in the first place. Doing the same with Thuban would be interesting.

    Zosma would've made more sense than continuing with Deneb; however, AMD quickly shelved that idea.
  • GullLars - Wednesday, May 4, 2011 - link

    Nah, it's just that Intel has a huge advantage in fabs. They also have a clock-for-clock advantage in most workloads, but AMD's engineers are by no means incompetent. Brazos is the only recent great product, though.
    If both Bulldozer and Llano fail spectacularly, I'll start leaning towards your side. In the last few years Intel has alienated me with horrible business ethics and artificial restrictions on processors for market segmentation, with very high prices on the chips that weren't gimped (i.e. had no functional units disabled).
    Hopefully power consumption will be better with GloFo.
  • silverblue - Wednesday, May 4, 2011 - link

    Weren't you banned? No? Oh.

    Intel actually DOES need to make its compilers more vendor-agnostic since the settlement. If a piece of software runs rather poorly on an AMD CPU and much better on an Intel offering, that isn't always down to the Intel CPU being stronger; it might actually be the result of the AMD CPU being steered down an inefficient code path. Of course, it may genuinely be the limit of the processor. If something is compiled with a neutral compiler, I'd be far more inclined to trust the results.

    I doubt Intel is forcing people to use its compiler, or making them develop for Intel-only platforms, but in the interests of fair competition it's only right that the check on the CPU vendor ID be removed, if that is what they're doing. (A minimal sketch of that kind of vendor check follows the comments below.)
  • silverblue - Wednesday, May 4, 2011 - link

    Since you're such an expert, could you perhaps tell us, oh mighty one, which errata require fixing? What is so buggy about an AMD processor?

    If you're going to make such bold statements, it's about time you actually backed them up with FACTS.
  • silverblue - Thursday, May 5, 2011 - link

    Still waiting.
  • silverblue - Wednesday, May 4, 2011 - link

    The most popular CPUs are towards the lower end. Intel doesn't have a quad-core CPU below $100, whereas AMD has several. There's no reason an Athlon II X4 setup shouldn't easily undercut an i3 setup, and occasionally four true cores will be a benefit.
  • wh3resmycar - Wednesday, May 4, 2011 - link

    Why are you guys still sticking to 1680x1050? Don't give me the line about isolating processor performance by using a CPU-bound resolution for game X.

    People who actually game, and who are shopping for quad-core-and-up CPUs, are probably playing at HD resolutions and above.

    At least add graphs run at higher resolutions. Your gaming performance benchmarks don't reflect the real world.
  • stimudent - Sunday, May 8, 2011 - link

    This is great hardware, but it seems that most of the average users I know have left their tower systems and switched to smartphones, iPads, and gaming consoles. These devices are now what gets discussed at family get-togethers, instead of PCs as was the case just a few years ago. Some advanced users are using the latest graphics cards for computational projects instead of gaming - the CPU is no longer seen as the best way to fold. It's the graphics card doing the grunt work while the quad-core processor sits mostly idle.
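On the compiler point raised above: the concern is runtime dispatch keyed off the CPU vendor string rather than the feature flags the CPU actually reports. The sketch below is purely illustrative (it is not Intel's compiler runtime); it assumes a Linux /proc/cpuinfo and just shows the kind of vendor check being discussed.

```python
# Purely illustrative sketch (not Intel's actual dispatcher): pick a code
# path based on the CPU vendor string rather than the feature flags the
# CPU reports. Assumes a Linux /proc/cpuinfo is available.
def cpu_vendor():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("vendor_id"):
                return line.split(":", 1)[1].strip()  # e.g. GenuineIntel / AuthenticAMD
    return "unknown"

def pick_code_path():
    # A vendor-agnostic dispatcher would key off feature flags (SSE2/SSE4/AVX)
    # instead; keying off the vendor string is the behaviour in question.
    return "vendor-optimized path" if cpu_vendor() == "GenuineIntel" else "generic path"

print(cpu_vendor(), "->", pick_code_path())
```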
