Power Consumption

Most impressive is AMD's ability to run six 45nm cores at the same power consumption as its four-core predecessors. The Phenom II architecture does reasonably well at idle in general, but without power gating AMD can't match Intel's idle power levels.

Under load Intel also has the clear advantage.

[Chart: Idle Power Consumption]


  • silverblue - Thursday, April 29, 2010 - link

    I agree. If it's a struggle to utilise all six cores at 100%, just add another program to the mix. This may just prove once and for all if a physical Stars core can beat a logical i-core, and thus whether AMD were right to launch Thuban in the first place.
  • Scali - Friday, April 30, 2010 - link

    I'll say a few things to that...
    A physical Stars core actually has to beat TWO logical i-cores. After all, we have 6 Stars cores vs 8 logical i-cores.
    So if we were to say that the 4 physical cores on both are equal (which they're not, because the i-cores have an advantage), that leaves 2 physical cores against 4 logical cores.

    Another thing is that if you have to work hard to set up a multitasking benchmark that shows Thuban in a favourable light, doesn't that already prove the opposite of what you are trying to achieve?

    I mean, how realistic is it for a consumer processor to set up Virtual Box/VMWare benchmarks? Doesn't that belong in the server reviews (where as I recall, AMD's 6-cores couldn't beat Intel's 8 logical cores either in virtualization benchmarks)?
    Virtualization is not something that a consumer processor needs to be particularly good at, I would say. Gaming, video processing, photo editing. Now those are things that consumers/end-users will be doing.
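Scali's 6-physical vs. 8-logical comparison above can be put into rough numbers. A back-of-the-envelope sketch, assuming equal per-core throughput and a hypothetical ~25% gain from the second logical (SMT) thread on each core; both figures are illustrative assumptions, not measurements from the review:

```python
# Back-of-the-envelope throughput comparison (hypothetical numbers).
PHYSICAL_THROUGHPUT = 1.0  # one physical core, arbitrary units
SMT_BONUS = 0.25           # assumed gain from the second logical thread per core

# 6 physical cores, no SMT (Thuban / Phenom II X6)
thuban = 6 * PHYSICAL_THROUGHPUT

# 4 physical cores, each with an SMT sibling (8 logical threads, i7-style)
i7 = 4 * PHYSICAL_THROUGHPUT * (1 + SMT_BONUS)

print(f"Thuban (6C/6T): {thuban:.2f}")  # 6.00
print(f"i7 (4C/8T):     {i7:.2f}")      # 5.00
```

Under these equal-per-core assumptions the six physical cores come out slightly ahead on raw throughput; Scali's point is that the i7's real-world per-core advantage more than closes that gap.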
  • wyvernknight - Thursday, April 29, 2010 - link

    @mapesdhs
    There's no such thing as an AM3 board with DDR2, only an AM2+ board with DDR2 that has AM3 support. The MA770-UD3 you gave as an example is an AM2+ board with AM3 compatibility: "Support for Socket AM3 / AM2+ / AM2 processors". AM3 boards do not have support for AM2+ and AM2 processors.
  • mapesdhs - Thursday, April 29, 2010 - link


    Strange, then, that the spec pages specifically describe the sockets as being AM3.

    Ian.
  • Skyflitter - Thursday, April 29, 2010 - link

    Could someone please tell me the difference between the Phenom II X6 1090T and the 1055T?

    I would like to put one of these new chips into my Gigabyte DDR2 motherboard, but the Gigabyte web site says my board (GA-MA770-UD3) only supports the 1035T and 1055T chips. It is rated at 140 W.

    I am currently running an Athlon 64 X2 6400+ (3.4 GHz) and I do not want to lose too much clock speed by going with the 1055T (2.8 GHz).

    Do all the new Phenom II X6 support DDR2?
  • cutterjohn - Friday, April 30, 2010 - link

    I'm waiting for them to cough up a new arch that delivers MUCH better per-core performance.

    There is just no value proposition with their 6-core CPU that mostly matches a 4-core i7 920, which can be had for a roughly similar price point, i.e. an i7 930 for $199 @ MicroCenter.

    Either way, unless I win the giveaway :D, I'm now planning to wait at least until next year to upgrade the desktop, to see how Sandy Bridge turns out and IF AMD manages to get its new CPU out. I figure I may as well wait for the next sockets: LGA2011 for Intel, and what I'm sure will be a new one for AMD with its new CPU. As an added bonus I'll be skipping the first generation of DX11 hardware, as new architectures supporting new APIs (DX11/OGL4) tend not to be the best optimized or most robust, especially, it seems, in nVidia's case this time. (Although AMD had an easier time of it, as they made few changes from R7XX to R8XX, as is usual for them. AMD needs to really start spending some cash on R&D if they wish to remain relevant.)
  • silverblue - Friday, April 30, 2010 - link

    The true point of the X6 is heavy multi-tasking. I'd love to see a real stress test thrown at these to show what they can do, and thus validate their existence.
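A minimal sketch of the kind of all-core stress test the comment above asks for, using only the Python standard library. The busy-loop workload and iteration count are arbitrary placeholders, not a real benchmark:

```python
# Spawn one busy worker per logical CPU and time a fixed amount of work.
import multiprocessing as mp
import os
import time

def spin(n: int) -> int:
    # Pointless arithmetic to keep one core busy.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    start = time.perf_counter()
    with mp.Pool(cores) as pool:
        results = pool.map(spin, [2_000_000] * cores)
    elapsed = time.perf_counter() - start
    print(f"{cores} workers finished in {elapsed:.2f}s")
```

On a chip like the X6, watching per-core utilization while this runs would show whether all six physical cores are actually being loaded.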
  • pow123 - Wednesday, May 05, 2010 - link

    You would have to be insane to pay $1000 for a chip that may be good for gaming. At $199 with slightly lower performance it's a no-brainer. When I build a system, I don't care if the frame rates etc. are 10 to 15% better. Who cares? The chip is fast and I have no problems playing high-end games. I have no special setup and it does everything that my friend's i7 can do. Good for me, I get more PC for the buck. Go ahead and go broke buying just a motherboard and CPU when I can get a modern motherboard, a CPU, 6 GB of DDR3-1600, a 1 TB HD and a DVD-RW. More for me.
  • spda242 - Sunday, May 02, 2010 - link

    I would really like to have seen a World of Warcraft test with these CPUs like you did with the Intel 6-core.
    It would be interesting to see whether WoW can use all the cores, and how it performs.
  • hajialibaig - Wednesday, May 05, 2010 - link

    Not sure why there is no Power vs. Performance vs. Price comparison of the different processors. The performance metric could be anything you want, such as gaming performance or video encoding.

    Such a comparison would be interesting, since you might pay back the higher initial price via power savings.
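The payback idea in the comment above can be made concrete with a rough calculation. All of the numbers below are made up for illustration; none come from the review:

```python
# Rough payback-period sketch (all inputs are hypothetical assumptions).
price_delta_usd = 100.0  # assumed extra cost of the pricier, more efficient CPU
power_delta_w = 30.0     # assumed average power saving under load, in watts
kwh_price_usd = 0.12     # assumed electricity price per kWh
hours_per_day = 8.0      # assumed daily usage

# Daily savings: watts -> kW, times hours used, times price per kWh.
savings_per_day = power_delta_w / 1000.0 * hours_per_day * kwh_price_usd
payback_days = price_delta_usd / savings_per_day
print(f"Payback: ~{payback_days:.0f} days (~{payback_days / 365:.1f} years)")
```

With these particular made-up numbers the efficiency premium takes nearly a decade to pay off, which is exactly why charting power against price would be informative.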
