Final Words

Now that the pieces are falling into place we are able to understand a bit more about the implications of AMD's move to 65nm. It's clear that these first 65nm chips, while lower power than their 90nm counterparts, aren't very good even by AMD's standards. With these parts already weighing in at the high end of the voltage spectrum, we hope to see more overclockable, lower power offerings once AMD's 65nm ramp really gets going. It's a constantly evolving process, and if this is the worst we will see, it's not terrible; AMD can only go up from here. It does mean, however, that you shouldn't hold your breath waiting for the right 65nm AMD chip to come along.

Performance and efficiency are still both Intel's fortes thanks to its Core 2 lineup, and honestly the only reason to consider Brisbane is if you currently have a Socket-AM2 motherboard. It is worth mentioning that AMD still has the lowest overall power use with its Athlon 64 X2 EE SFF processor, but in terms of performance per watt efficiency it's not all that great. We would really like to see an EE SFF successor built on AMD's 65nm process, but we have a feeling it will be a little while before we are graced with such a delicate creature.

The step back in performance with Brisbane is truly puzzling; while none of our individual application benchmarks showed a tremendous loss in performance, it's a very unusual move for AMD. The last thing AMD needs to do is take away performance, and based on its current roadmaps the higher latency L2 cache makes no sense at all. Either AMD has some larger L2 cache variants in the works that we're not aware of, or AMD's cache didn't take very kindly to the 65nm shrink. As soon as we get the official word as to why L2 access latencies jumped 66% with Brisbane we'll be sure to report it; until then we can only wonder.
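
For those wondering how a latency difference like that shows up, the usual way to measure it is a pointer-chasing microbenchmark: a chain of dependent loads is walked through a working set sized to fit in the L2, so each iteration pays the full load-to-use latency. The sketch below is a minimal example of the idea, not the tool used for this review; the 256KB working set, 64-byte line size, and iteration count are illustrative assumptions (a jump from 12 to 20 cycles, for example, works out to a 66% increase).

/* Minimal pointer-chasing sketch for estimating average load-to-use
 * latency. Working-set size, line size, and iteration count are
 * illustrative; a real measurement would sweep the working set to
 * isolate the L1/L2/memory regions and pin the thread to one core. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define LINE  64              /* assumed cache line size in bytes */
#define ITERS (1u << 26)      /* dependent loads to average over  */

int main(void)
{
    size_t bytes = 256 * 1024;     /* sized to land inside a 512KB L2 */
    size_t n = bytes / LINE;       /* one chain node per cache line   */
    char *buf = malloc(bytes);
    size_t *order = malloc(n * sizeof *order);

    /* Shuffle the visit order so hardware prefetchers can't hide the latency. */
    for (size_t i = 0; i < n; i++) order[i] = i;
    srand(1);
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = (size_t)rand() % (i + 1);
        size_t t = order[i]; order[i] = order[j]; order[j] = t;
    }

    /* Link the lines into one circular chain of pointers. */
    for (size_t i = 0; i < n; i++) {
        void **from = (void **)(buf + order[i] * LINE);
        *from = buf + order[(i + 1) % n] * LINE;
    }

    /* Chase the chain: every iteration is one dependent load. */
    void **p = (void **)(buf + order[0] * LINE);
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (unsigned i = 0; i < ITERS; i++)
        p = (void **)*p;
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("avg latency: %.2f ns per load (end %p)\n", ns / ITERS, (void *)p);
    free(buf);
    free(order);
    return 0;
}

Sweeping the working set from well under the L1 size up past the L2 size makes the latency plateau of each cache level visible; the Brisbane question is simply where that second plateau sits relative to the 90nm parts.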

We long for the good old days, when a die shrink meant ridiculously overclockable processors, back before a die shrink was coupled with a sneaky decrease in performance. While Brisbane is far from a Prescott, it's not exactly what we were hoping for from AMD's first 65nm Athlon 64 X2. Hopefully they can work out some of the process' kinks in time for the K8L launch.


52 Comments


  • OcHungry - Wednesday, January 3, 2007 - link

    I was wondering if there was a way to PM or email you.
    I would like to bring to your attention a few concerns regarding the forum and the moderators that you need to be aware of. Is there an email or can I PM you in any way?
    I would appreciate your response.
    You should know about this.
    Thank you.
  • SilverMirage - Saturday, December 23, 2006 - link

    I understand that we are looking at limiting games by the CPU and not the graphics card, but let's be a little realistic... some of those performance per watt figures aren't what an end user is going to be seeing when comparing his options.
  • Wwhat - Saturday, December 23, 2006 - link

    Perhaps a weird thought, but AMD licensed that new ZRAM process that could theoretically put a huge cache in a small space, with the flaw of so-so latency; they later licensed version 2 of the same technology, but so far they haven't used it.
    Now what if this is an experiment with that technology, or a preamble to it? At some point you would expect them to start using the stuff they bought licenses for. At the time of the ZRAM announcement people were projecting use in the very distant future, but that might not be what AMD has in mind, or the performance of current versions of ZRAM might have changed.
    What do you think, is there any link to it?
  • Tujan - Friday, December 22, 2006 - link

    If you look, you'll see a story here on AnandTech.com covering the first AMD X2s, dated June 2005. The processor there was top notch, something unheard of in the processor industry, and it managed that with DDR400 memory.

    What is strange in these processor scenarios: there is Moore's law, and there is the "business quarter" law. The article here, which I am commenting on, is the first to detail the "new" processors, the 65nm parts. The AMD X2s were also "new" when the 2005 article was published. Today, as of reading this article, the "old" (and then top notch) parts are no longer being carried forward, or at least no longer keep the same cache sizes. And correct me if I am wrong, but the "old" processors are no longer going to be manufactured.

    Well, I have usually figured that in these cycles of "new and old" technology, if something is 18 months old I can actually afford it. That changes a little with the new Intel setups, since the Intel lineup finally seems to break the cycle I just described.

    For an entire year I saw article after article putting AMD's top parts as the baseline for performance. No user could really connect what performed well with what they could afford, and AMD's "top notch" was the "industry standard" which of course nobody could afford. If somebody was passing savings along, it wasn't happening with me.

    Anyway, I would just like to say I wish AMD good luck. Yet I am not ashamed to say that I can now put together a system for less than $1000 with the same parts as that "top notch" industry standard seen for the better part of 2005.

    Am I better off for thinking that way? 18 months passed, and my dollars are spendable but no longer supported? I don't think anybody could consider an AMD setup a low-end setup if, for example, the 4800+ with 1MB L2 can be had for $230.

    The inevitable consequence is the final exhaustion of the supply of the component. Yet I could say that I have a "top notch" 2005 version of AMD technology, with 2007 parts! Being on the exhausted end, I don't know who could feel better about this.

    I wish AMD luck. Still, given their record, I should say that I do not look forward to supplies being exhausted and products being discontinued at the same time, at least as far as getting a break on what one pays for.

  • mino - Thursday, December 21, 2006 - link

    Just to throw my two cents into the fire:

    X2 4200+EE & GF6150 board (MSI K9NGM2-FID)
    $240 (170+75)

    E6300 & G965 board (ASUS P5B-VM)
    $285 (185+100)

    Conclusion:

    Anything cheaper is K8 vs. NetBurst, so Intel is no contender.
    Anything more expensive is K8 vs. C2D clock for clock, so AMD is no contender (the 65nm 4800+ is more expensive than the E6400 it matches in performance).

    For decent IGP-free boards the difference is comparable.


    So, going for stock performance the choice is simple:
    <$300 for CPU+MB combo go AMD X2
    >$300 for CPU+MB combo go Intel C2D

    for overclocking:
    <$240 for CPU+MB combo go AMD X2
    $240-$285 -> wait and then Intel C2D
    >$285 for CPU+MB combo go Intel C2D

    for power consumption (i.e. PC's for office use):
    AMD X2 3800+EE to 5000+EE (anything above or below is a waste of money in this case)

    for single core:
    <$190 for CPU + MB combo -> go AMD A64
    $190-$230 for CPU + MB combo -> go AMD X2 at $240
    >$240 see dual-core recommendations

    That IMO sums up the whole Desktop PC market as of now.
  • mino - Thursday, December 21, 2006 - link

    $190-$230 for CPU + MB combo -> go AMD X2 at $240

    should be:

    $190-$230 for CPU + MB combo -> go AMD X2 3800+ at $200
  • mino - Thursday, December 21, 2006 - link

    The prices are Newegg based.
  • Shintai - Thursday, December 21, 2006 - link

    Anand, can you test the 65nm K8 at lower resolutions and with a broader selection of games so we can more truly see the difference?

    http://www.firingsquad.com/hardware/amd_athlon_64_...

    If these numbers hold water, then the 65nm K8s are a disaster in terms of gaming performance.
  • mongoosesRawesome - Thursday, December 21, 2006 - link

    I'd be interested in seeing these same performance/watt graphs using the 965 chipset. The 680i is a power hog.
  • mino - Thursday, December 21, 2006 - link

    The 590 SLI is just as much of a power hog for AMD.

    Actually, 965 vs. RD580 would hurt Intel even more... so, go figure.
