The Test

For each benchmark we measured both performance and average power consumption over the course of the run, then reported performance per watt as the performance result divided by the average power draw. Both Cool 'n Quiet and EIST were enabled on all processors.
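The performance-per-watt metric described above is a simple ratio; a minimal sketch follows, where the function name and the sample numbers are illustrative only and are not taken from the article's data.

```python
def perf_per_watt(score: float, avg_watts: float) -> float:
    """Performance per watt = benchmark score / average power draw."""
    if avg_watts <= 0:
        raise ValueError("average power must be positive")
    return score / avg_watts

# Example with invented numbers: a score of 120 at 150 W average draw.
print(perf_per_watt(120.0, 150.0))  # prints 0.8
```

A higher score at the same wattage, or the same score at lower wattage, both raise the ratio, which is why the EE (Energy Efficient) parts can win here without winning the raw benchmarks.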

CPU: Intel Core 2 Duo E6600 (2.40GHz/4MB)
AMD Athlon 64 X2 5000+ (2.6GHz/512KBx2)
AMD Athlon 64 X2 5000+ EE "Brisbane"
AMD Athlon 64 X2 4800+ EE "Brisbane"
AMD Athlon 64 X2 EE 4600+ (2.4GHz/512KBx2)
AMD Athlon 64 X2 EE SFF 3800+ (2.0GHz/512KBx2)
Motherboard: eVGA NVIDIA nForce 680i
ASUS M2N32-SLI Deluxe
Chipset: nForce 680i
nForce 590 SLI
Chipset Drivers: NVIDIA 9.53
Hard Disk: Seagate 7200.9 300GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: NVIDIA GeForce 8800 GTX
Video Drivers: NVIDIA ForceWare 97.44
Resolution: 1600 x 1200
OS: Windows XP Professional SP2

[Chart: Idle Power Consumption]

[Chart: Power Consumption]



Comments

  • OcHungry - Wednesday, January 3, 2007 - link

    I was wondering if there is a way to PM or email you.
    I would like to bring to your attention a few concerns regarding the forum and the moderators that you need to be aware of. Is there an email address, or can I PM you in some way?
    I would appreciate your response.
    You need to know this. (Originally in Persian: "Bayad shoma bedaneed.")
  • SilverMirage - Saturday, December 23, 2006 - link

    I understand that we are looking at games limited by the CPU and not the graphics card, but let's be a little realistic: some of those performance-per-watt figures aren't what an end user is going to see when comparing his options.
  • Wwhat - Saturday, December 23, 2006 - link

    Perhaps a weird thought, but AMD licensed that new Z-RAM process that could theoretically put a huge cache in a small space, with the flaw that it had so-so latency. They later licensed version 2 of the same technology, but so far they haven't used it.
    Now what if this is an experiment with that technology, or a preamble to it? At some point you would expect them to start using the stuff they bought the license for. At the time of the Z-RAM announcement people were projecting use in the very distant future, but that might not be what AMD has in mind, or the performance of current Z-RAM versions might have changed.
    What do you think, any link to it?
  • Tujan - Friday, December 22, 2006 - link

    If you look, you'll see a story here for the first AMD X2s dated June 2005. The processor there was top notch, something unheard of in the processor industry, and it did all that with DDR400 memory.

    What is strange in these processor scenarios: there is Moore's law, and there is the "business quarter" law. The article here, which I am commenting on, is the first to detail the "new" processors, the 65nm parts. The AMD X2s were also "new" when the 2005 articles were published. Today, as of reading this article, the "old" (and then top-notch) parts are no longer being carried forward, or at least no longer keep the same cache sizes. And correct me if I am wrong, but the "old" processors are no longer going to be manufactured at all.

    Well, I have usually figured that with "new and old" technology, once something is 18 months past I can actually afford it. This changes a little with the new Intel setups, since the Intel lineups finally seem to break the cycle I just described.

    For an entire year I saw article after article putting AMD's top-notch parts as the baseline for performance. No user could relate that performance to what they could actually afford. AMD's "top notch" became an "industry standard" which of course nobody could afford. At least if somebody was passing the buck, it wasn't happening with me.

    Anyway, I would just like to say I wish AMD good luck. Yet I am not ashamed to say that I can now put together a system for less than $1,000 with the same parts as that "top notch" industry standard seen for the greater part of 2005.

    Am I better off for thinking that way? 18 months passed, and my dollars are spendable, but the parts are no longer supported? I don't think anybody could consider an AMD setup "low end" if, for example, the 4800+ with 1MB of L2 can be had for $230.

    The inevitable consequence is the final exhaustion of the supply of the component. Yet I could say that I have a "top notch" 2005 version of AMD technology, with 2007 parts! Being on the exhausted end, I don't know who could feel better about this.

    I wish AMD luck. Still, given their record, I should say I do not look forward to supply exhaustion arriving at the same time as discontinuation, considering what one pays for the parts.

  • mino - Thursday, December 21, 2006 - link

    Just my two cents into the fire:

    X2 4200+EE & GF6150 board (MSI K9NGM2-FID)
    $240 (170+75)

    E6300 & G965 board (ASUS P5B-VM)
    $285 (185+100)


    Anything cheaper is K8 vs. Netburst, so Intel is no contender.
    Anything more expensive is K8 vs. C2D clock-for-clock, so AMD is no contender (the 65nm 4800+ is more expensive than the E6400 it matches in performance).

    For decent IGP-free boards the difference is comparable.

    So, for stock performance the choice is simple:
    <$300 for CPU+MB combo go AMD X2
    >$300 for CPU+MB combo go Intel C2D

    for overclocking:
    <$240 for CPU+MB combo go AMD X2
    $240-$285 -> wait and then Intel C2D
    >$285 for CPU+MB combo go Intel C2D

    for power consumption (i.e. PC's for office use):
    AMD X2 3800+ EE to 5000+ EE (anything above or below is a waste of money in this case)

    for single core:
    <$190 for CPU + MB combo -> go AMD A64
    $190-$230 for CPU + MB combo -> go AMD X2 at $240
    >$240 see dual-core recommendations

    That IMO sums up the whole Desktop PC market as of now.
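mino's stock-performance rule of thumb above can be sketched as a one-line decision function. The $300 breakpoint and the platform names come from the comment; the function name and sample prices are my own invention for illustration.

```python
def stock_pick(cpu_mb_combo_price: float) -> str:
    """Suggested platform for a given CPU+motherboard budget,
    per the $300 breakpoint quoted in the comment above."""
    return "AMD X2" if cpu_mb_combo_price < 300 else "Intel C2D"

print(stock_pick(240))  # prints AMD X2
print(stock_pick(350))  # prints Intel C2D
```

The overclocking and single-core recommendations would just add more breakpoints to the same kind of lookup.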
  • mino - Thursday, December 21, 2006 - link

    $190-$230 for CPU + MB combo -> go AMD X2 at $240

    should be:

    $190-$230 for CPU + MB combo -> go AMD X2 3800+ at $200
  • mino - Thursday, December 21, 2006 - link

    The prices are based on Newegg.
  • Shintai - Thursday, December 21, 2006 - link

    Anand, can you test the 65nm K8 at lower resolutions and with a broader selection of games, so we can more truly see the difference?

    If these numbers hold water, then the 65nm K8 is a disaster in terms of gaming performance.
  • mongoosesRawesome - Thursday, December 21, 2006 - link

    I'd be interested in seeing these same performance-per-watt graphs using the 965 chipset. The 680i is a power hog.
  • mino - Thursday, December 21, 2006 - link

    The 590 SLI is the same kind of power hog on the AMD side.

    Actually, 965 vs. RD580 would hurt Intel even more... So, go figure.
