Power Consumption

At idle, the Phenom's power consumption is competitive with Intel's quad-core, but under load Intel takes the cake. Power consumption will only get better for Intel with Penryn, and without a doubt we'll see improvements to Phenom's power consumption as yields improve and production ramps up, just as we did with the K8.

Power Consumption - Idle

Power Consumption - Load

124 Comments

  • B166ER - Monday, November 19, 2007 - link

    Had to reply and clarify. The phrase refers to current setups, in which quad-core gaming is not a primary reason to purchase said processors. Alan Wake, Crysis, and others, while able to take advantage of quad-core setups, are not out yet, and I would guess that less than 3% of games out there now are quad-core capable; scaling in such games is probably hardly optimized even if there were more of them. You speak of a future in which these games will be available, but even then an abundance of them will be a long time coming.

    Nonetheless, Anand, that might be your best review in my book to date. It speaks honestly and depicts a seemingly structureless company that has only time on its side to pull itself up from potential disaster. AMD has not shown too many positive strides as a company lately, and mindless spending on what should have been a direct "let 'em have at it" approach only shows what was speculated previously: the company has a vacuum in leadership that needs to be filled by capable hands. And we all know proper administration starts from the very top; Hector's step down will not be mourned by me. Dirk has his cup filled, but his past credentials speak highly of him and show him capable. I can only wish him well.
    Phenom needs to be priced competitively, simple enough, and it needs higher quality yields for overclocking. It's amazing how they manage to stay just one step behind at almost every step currently. I hope the 7-series mobos bring about better competition vs. Intel and NVIDIA boards. We as a community need this to happen.
  • leexgx - Monday, November 19, 2007 - link

    there are about 2-3 games, I think, that use quad cores
  • wingless - Monday, November 19, 2007 - link

    That's such a mixed bag it makes me sick. Phenoms are mediocre at best at almost everything, but somehow magically rape Intel in Crysis. I'll have nightmares. WTF is going on internally in those four cores to make this happen? I hope software manufacturers code well for AMD so they can shine. The pro-Intel software market is huge, and that's where the fight is. Unfortunately it doesn't look good for AMD there either, because programmers hate having to learn new code.
  • defter - Monday, November 19, 2007 - link

    Rape Intel in Crysis??

    Crysis was one of the few benchmarks where the fastest Phenom was faster than the slowest Core 2 Quad. Still, the Phenom's advantage was less than a percent.

    I think it's better to say that Crysis is the benchmark where Phenom doesn't utterly suck (it just sucks a lot).
  • eye smite - Tuesday, November 20, 2007 - link

    You really think those shining numbers are realistic from pre-production sample CPUs? I think you should all wait until sites have full production motherboards and CPUs and can give real data with all their tests; then you can decide it's a steaming sack of buffalo droppings. Until then, you'll just sound like a raucous bunch of squabbling crows.
  • JumpingJack - Monday, November 19, 2007 - link

    Ohhhh, here we go again with the "it's not coded well for AMD" conspiracy theories.
  • wingless - Monday, November 19, 2007 - link

    AMD recently released new software libraries for these processors....
  • JumpingJack - Monday, November 19, 2007 - link

    Yeah, great for the FPU library, which they already compiled into their PGI for the SPEC.org runs, which consequently are slowly getting the non-compliant branding pasted all over them.
  • TSS - Monday, November 19, 2007 - link

    "It turns out that gaming performance is really a mixed bag; there are a couple of benchmarks where AMD really falls behind (e.g. Half Life 2 and Unreal Tournament 3), while in other tests AMD is actually quite competitive (Oblivion & Crysis)."

    The UT series and Half-Life 2 are both very CPU intensive, while Oblivion and especially Crysis are video card killers. It's hard to say, but it sucks in games as well. You'd want to see a difference in Crysis, though; make it scale down to about 640x480, because it's just too demanding on the graphics card to compare at 1024x768. In layman's terms, in HL2 the graphics card is usually picking its nose waiting for the CPU, so a stronger CPU will make a big difference. In Crysis especially it's exactly the other way around, so that's why the scores are closer together.

    Why is this true? In Half-Life 2, there are 50 frames per second between the best and worst of the lineup. In UT3, there are also about 50 frames between the best and worst of the lineup, regardless of clock speed or architecture, and the frame rates are measured in the hundreds even in a game as new and graphics intensive as UT3 (though it should be noted that, as far as I know, the beta demo did NOT ship with the highest resolution textures to keep the file size down). Crysis, on the other hand, shows only about a 9 frames per second difference between a 2.2GHz Phenom and a 2.66GHz Intel proc, and Oblivion manages about 16. Same system, different game: it can only be concluded that the game is much more graphics card intensive, which shouldn't be hard to imagine since Crysis is well known as the graphics card killer of the moment... and Oblivion was that of the last generation (HDR and AA, anybody?).

    AMD is 10-30% slower in every other test, and they are in gaming as well. I believe the difference would've shown up more if they had used an SLI or CrossFire solution, though I understand that's not possible with the chipsets and drivers existing at the moment. (A rough sketch of this CPU-bound vs. GPU-bound reasoning follows below.)
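
A minimal sketch of the bottleneck reasoning in the comment above, assuming ballpark FPS figures: the game names come from the thread, but the numbers, the 20% cutoff, and the function name are illustrative assumptions, not measured data. The idea is that if a faster CPU barely lifts the frame rate, the graphics card is the limit; if it lifts it a lot, the CPU is.

```python
# Rough illustration only: the FPS pairs echo the spreads the comment cites
# (~50fps in HL2/UT3, ~16 in Oblivion, ~9 in Crysis); they are not benchmark data.
benchmarks = {
    "Half-Life 2": (150.0, 200.0),  # (slowest CPU in lineup, fastest CPU in lineup)
    "UT3 beta":    (130.0, 180.0),
    "Oblivion":    (95.0, 111.0),
    "Crysis":      (55.0, 64.0),
}

def cpu_scaling(slow_fps: float, fast_fps: float) -> float:
    """Percent FPS gained by swapping in the faster CPU on the same GPU."""
    return (fast_fps - slow_fps) / slow_fps * 100.0

for game, (slow, fast) in benchmarks.items():
    gain = cpu_scaling(slow, fast)
    # Arbitrary 20% cutoff, purely for illustration.
    verdict = "CPU-bound" if gain > 20.0 else "GPU-bound"
    print(f"{game:12} +{gain:5.1f}% from the faster CPU -> looks {verdict}")
```

Dropping the resolution to 640x480, as the comment suggests, shrinks the GPU's share of each frame, which is why CPU differences would reappear there.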

  • MDme - Monday, November 19, 2007 - link

    Time to upgrade.....to the dark side.
