Socket-AM2, AM2+ and AM3: Backwards Compatibility

AMD fixed the cache size issue, fixed the power consumption problem, and even delivered higher clock speeds with Phenom II. What I didn’t expect was something more. AMD has always been a customer-focused manufacturer. The problem over the past couple of years has been that its processors simply haven’t been what consumers wanted, but before that, the AMD we know and love designed processors for the applications of the day while keeping platform changes between generations to a minimum.

Phenom II carries AMD’s consumer focused nature to the next level. Today’s Phenom II parts are designed for Socket-AM2+ motherboards. AMD doesn’t qualify any of them for use on Socket-AM2 motherboards, but there’s nothing stopping a motherboard maker from enabling support on a standard AM2 motherboard. You will need a BIOS update.

Next month, AMD will launch the first Socket-AM3 Phenom II processors. The main difference here is that these parts will support DDR3 memory. Oh no, another socket, right? Wrong.

Socket-AM3 Phenom II parts will also work in Socket-AM2+ motherboards; the two are pin-compatible. In an AM2+ board, these upcoming Phenom II processors will run their memory controller in DDR2 mode; in an AM3 board, they'll run it in DDR3 mode. How cool is that?

This unique flexibility is largely due to the work that was done on the DDR2 and DDR3 specs at JEDEC. Neither the number of signaling pins nor the signals themselves change between DDR2 and DDR3 on the memory controller side; the main differences are in routing and termination on the memory socket side. AMD just needed a physical memory interface on Phenom II that could operate at both 1.8V (DDR2) and 1.5V (DDR3) and handle the timings of either memory technology. The potential was there to do this on the first Phenom; it just wasn't ready in time. With the Socket-AM3 Phenom II processors, you'll finally be able to do it.
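
To make the idea concrete, here is a minimal, purely hypothetical sketch of what that dual-mode operation amounts to: one memory controller loading DDR2 or DDR3 operating parameters depending on the socket it's seated in. The type names, the function, and the example data rates below are illustrative assumptions, not AMD's actual firmware or register interface; only the 1.8V/1.5V figures come from the discussion above.

```c
/* Hypothetical sketch: a single memory controller choosing DDR2 or DDR3
 * operating parameters based on the socket it is installed in. Names,
 * structure, and example data rates are illustrative assumptions. */
#include <stdio.h>

typedef enum { SOCKET_AM2_PLUS, SOCKET_AM3 } socket_type;

typedef struct {
    const char *mem_type;  /* memory technology the PHY drives            */
    double      vddio;     /* I/O voltage in volts (1.8V DDR2, 1.5V DDR3) */
    int         data_rate; /* example data rate in MT/s (assumed values)  */
} mem_config;

/* Pick DRAM interface parameters from the socket the CPU is seated in. */
static mem_config select_mem_config(socket_type socket)
{
    if (socket == SOCKET_AM3) {
        /* AM3 board: same signaling pins, driven at DDR3 voltage and timings. */
        return (mem_config){ "DDR3", 1.5, 1333 };
    }
    /* AM2+ board: same physical interface, DDR2 voltage and timings. */
    return (mem_config){ "DDR2", 1.8, 1066 };
}

int main(void)
{
    socket_type boards[] = { SOCKET_AM2_PLUS, SOCKET_AM3 };
    for (int i = 0; i < 2; i++) {
        mem_config cfg = select_mem_config(boards[i]);
        printf("%-4s board -> %s at %.1f V, %d MT/s\n",
               boards[i] == SOCKET_AM3 ? "AM3" : "AM2+",
               cfg.mem_type, cfg.vddio, cfg.data_rate);
    }
    return 0;
}
```

The point of the sketch is simply that nothing about the pin-out changes; only the electrical and timing parameters the controller loads differ between the two sockets.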

While I’m not sure how practically useful the AM3/AM2+ flexibility will be, I’d rather have it than not. Being able to take one CPU and stick it in two different sockets, each with a different memory technology, and have it just work is the most customer-centric move I’ve ever seen either company make. AMD told me that this plan was in the works before the original Phenom ever launched, somewhere in the 2004 timeframe. AMD was active in JEDEC, working to make the DDR2 and DDR3 specs similar enough that this one-CPU, two-socket approach could work.

One of the biggest risks AMD faced when it chose to integrate the memory controller was what would happen if there was a sudden shift in memory technology. With the upcoming Socket-AM3 versions of Phenom II, that risk is completely mitigated by the fact that a single chip can work with either memory technology. It gives OEMs a tremendous amount of flexibility to ship systems with either DDR2 or DDR3 memory depending on which is more cost effective. It also ensures a much smoother transition to DDR3.

The downside for AMD is that because Socket-AM3 Phenom II chips are right around the corner, it makes little sense to buy one of these Socket-AM2+ Phenom II processors - at least not until we know the pricing and availability of the Socket-AM3 versions.

Slower North Bridge Frequency for AM2+, Faster When AM3 Arrives

An extra benefit of the Socket-AM3 Phenom II processors is that their uncore (memory controller + L3 cache) will be clocked at 2.0GHz instead of the 1.8GHz of the two processors launching today. By comparison, the Phenom 9850 and 9950 both have a 2.0GHz uncore clock; AMD had to drop to 1.8GHz in order to launch the Phenom II at 2.8GHz and 3.0GHz today.

As 45nm yields improve, AMD will increase the uncore frequency, but today it's at 1.8GHz, and the AM3 chips will take it to 2.0GHz. The Core i7 runs its uncore at 2.13GHz for the 920 and 940, and at 2.66GHz for the 965.

Comments

  • Proteusza - Thursday, January 8, 2009 - link

    No, I said I hoped it could at least compete with a Core 2 Duo.

    If it's too much to hope that a CPU two years younger, with 758 million transistors, could compete clock for clock with a first gen Core 2 Duo, then AMD has truly fallen to new lows. It has more transistors than i7, and yet it can't compete with a Core 2 Duo, let alone i7. What happened to the sheer brilliance of the A64 days? It could beat the pants off any Pentium 4. Now the best AMD can do is barely acceptable performance at a higher clockspeed than Intel needs, all the while using a larger die than Intel's.

    This keeps them in the game, but it means I won't bother buying one. Why should I?
  • coldpower27 - Thursday, January 8, 2009 - link

    Those days are over. Their success was also contingent on Intel stumbling a bit, which it did with the P4. With Intel firing on all cylinders, AMD being merely acceptable is just about where they're supposed to be.
  • Denithor - Thursday, January 8, 2009 - link

    It wasn't so much a stumble, more like a face-plant into a cactus. Wearing shorts and a t-shirt.

    Intel fell flat with Netburst and refused to give up on it for far too long (Willamette -> Northwood -> Prescott -> Cedar Mill). I mean, the early days of the P4 were horrible - it was outperformed by lower-clocked P3 chips until its clock speed was finally high enough to overcome the architectural differences.

    Into this mix AMD tossed a grenade, the A64 - followed by the X2 on the same architecture. With its IMC and superior architecture, there was no way Netburst could compete. Unfortunately, AMD hasn't really done anything since then to follow through. And even today's PII isn't going to change things dramatically for them; they're still playing second fiddle to Intel's products (which means they're forced into following Intel's lead in the pricing game).
  • JKflipflop98 - Thursday, January 8, 2009 - link

    Damn it feels good to be a gangsta ;)
  • Kob - Thursday, January 8, 2009 - link

    Thanks for the meaningful comparison with such a wide range of processors. However, I wonder why the benchmarks are so heavily tilted toward the graphics/gaming world. I think many in the SOHO world would benefit from test results in other common applications/fields such as VS compilation, AutoCAD manipulation, encryption, simple database indexing and even a chess game.
  • ThePooBurner - Thursday, January 8, 2009 - link

    In the article you compare this to the 4800 series of GPUs. I actually see this as the 3800 series. It works out perfectly. The 2900 came along way late and didn't deliver, used too much power, didn't overclock well, and was just all around a loser of a card. Then the 3800 came along. Basically the same thing, but with a die shrink that allowed it to edge past its predecessor, just enough. It was the first card where they got the mix right. After that came the 4800 with a big boost and even more competition. This is what I now see happening with the CPU line. The Phenom 1 was the 2900, and the Phenom II is the 3800. Getting the mix right and getting ready for the next big swing. But, as you point out, Intel isn't likely to sit back, and we can all agree that they are a much different competitor than Nvidia is.
  • Denithor - Thursday, January 8, 2009 - link

    ...and just like the 3800 series, it falls just short of the target.

    Remember? The 3870 couldn't quite catch the 8800GT and the 3850 couldn't quite match the 9600GT. While they weren't bad cards, they unfortunately also didn't give AMD the muscle to set pricing where it wanted; instead, AMD had to price them in line with nVidia's offerings.

    The same is happening here, with AMD pricing its chips in line with Intel's Q9400/Q9300 processors. And it may have to drop those prices if Intel cuts the Q9550/Q9400 down another peg.
  • Griswold - Friday, January 9, 2009 - link

    Rubbish theory. First of all, these cards were actually available, whereas the 8800GT was in extremely short supply and thus much more expensive for many weeks, even into 2008, because it literally made everything else nVidia had to offer obsolete. I couldn't get one and settled for a 3870 for that reason.

    Secondly, the 9600GT? Do you realize how much later that card came to the game than the 3850? It hit the market near the end of February. That's almost three months after the launch of the 38xx parts.

    The whole comparison is silly.
  • ThePooBurner - Friday, January 9, 2009 - link

    The 3800 line was never meant to beat the 8800 line. It just wasn't in the cards. Its purpose was to get the reins back under control: cut the power, get back to a decent power/performance ratio, and deliver the previous generation's performance in a smaller package to help improve margins. It was a stage setter. From the first time I read about it I knew it was just a setup for something more, something "bigger and better" that was going to come next. And then the 4800 came along and delivered the goods. I get this same feeling reading about the Phenom II. It's setting the stage. Getting about the same performance (a small bump, just like the 3870 over the 2900) in a smaller package, a better power/performance ratio, etc. This is simply stage setting for the next big thing. The next CPU from AMD after this one is going to deliver. I'm sure of it.
  • Kougar - Thursday, January 8, 2009 - link

    If you tried Everest and Sandra, what about CPU-Z's cache latency tool? It's not part of the CPU-Z package anymore, but they still offer it. Link: http://www.cpuid.com/download/latency.zip

    I thought this tool was very accurate, or is this not the case? It even detected the disabled L3 cache on a Northwood that turned out to be a rebadged Gallatin CPU.
