K10: What's in a name?

There's been some confusion over codenames when it comes to what we should call AMD's next-generation micro-architecture. Originally it was referred to by much of the press (and even some within AMD) as K8L, and more recently AMD took the stance that K8L was made up by the press and that K10 is the actual name of its next-generation micro-architecture. Lately we've been calling it Barcelona, as that is the codename attached to the first incarnation of AMD's next-generation micro-architecture, destined for the server market. The desktop versions we've been calling Agena (quad-core), Kuma (dual-core), and Agena FX for the Socket-1207 quad-core version, once again because those are the product specific codenames listed on AMD's roadmaps.

But when we talk about architecture, is Barcelona based on K8L, K10, or is there even a proper name for what we're talking about? To find out we went straight to the source, AMD's CTO Phil Hester, and asked him to settle the matter. According to Hester, K10 was never used internally, despite some AMD representatives using it in reference to Barcelona. By the same measure, K8L does not refer to the micro-architecture behind Barcelona. It sounds like neither K8L nor K10 is correct when referring to AMD's next-generation architecture, so we'll have to continue to use Agena/Kuma/Barcelona in their place.

What happened after K8?

As we're talking about names, there was a project after the K8 that for various reasons wasn't called K9. Undoubtedly there was an internal name, but for now we'll just call it the first planned successor to the K8. The successor to the K8 was originally scrapped, but the question is how far into its development was AMD before the plug was pulled? According to Phil Hester, the project after K8 was in its concept phase when it was canceled - approximately 6 months of time were invested into the project.

So what was the reason for pulling the plug? Apparently the design was massively parallel, built for heavily multithreaded applications. AMD overestimated the pace of the transition to multithreaded software and made significant sacrifices to single threaded performance with this design. Just as the clock speed race sent Intel straight into a power wall, AMD's massively multithreaded design also ran into power consumption issues. The chip would have drawn tremendous power, much of it wasted outside of highly parallel workloads.

The nail in the coffin of AMD's ill-fated project was its support for FB-DIMMs. AMD realized that Fully Buffered DIMM pricing was not going to fall quickly enough in the near term to justify tying its next microprocessor design to the technology, and eventually settled on unbuffered and registered DDR2 instead of FBD.

Without a doubt, AMD made the right decision in scrapping this project, but it sounds like AMD lost about half a year on it. Given that the first K8 was introduced back in 2003, one canceled project doesn't explain why we're here in 2007 with no significant update to the K8's micro-architecture. We couldn't get a straight answer from AMD as to why Barcelona didn't come earlier, but there are a number of possibilities that we have to consider.

Barcelona is AMD's first native quad-core design, which is more complicated than simply sticking two independent dual core die on the same package. AMD committed the cardinal sin in microprocessor design by executing two very complicated transitions at the same time. Not only did AMD build its first native quad-core design with Barcelona, but it also made significant changes to the architecture of each of its cores.

Intel's Mooly Eden, the father of Centrino, once imparted some very important advice to us. He stated plainly that when designing a microprocessor you can change the architecture, or you can change the manufacturing process, but don't do both at the same time. AMD has already started its 65nm transition with its current generation parts, so the comparison isn't totally accurate, but the premise of Mooly's warning still applies: do too much at the same time and you will run into problems, usually resulting in delays.

There's also the idea that, coming off a significant technology lead, many within AMD grew complacent, contributing to a less hungry company as a whole. We're getting the impression that some major changes are happening within AMD, especially given its abysmal Q1 earnings results (losing $611M in a quarter tends to do that to a company). While AMD appeared to be in a state of shock after Intel's Core 2 launch last year, the boat has finally started to turn, and the company we'll see over the next 6 - 12 months should be quite different.



Comments

  • strikeback03 - Friday, May 11, 2007 - link


    I think they're more concerned about selling stuff they have out today, which they aren't doing a great job of. What would happen if they showed a great product right around the corner? Q1 would look like a success compared to what they'd endure.

    This implies that actual performance numbers would make Barcelona more visible. But for it to factor into a buying decision, people have to know Barcelona is coming, and anyone who knows that can probably guess it will be a significant step forward, based on its need to compete with Intel. So either you don't know Barcelona is coming, in which case performance numbers don't matter; or you do know it is coming, in which case the only reason to buy AMD before then is because it's cheap.

    At least they stated that the new processors will be usable in the AM2 motherboards.
  • TA152H - Friday, May 11, 2007 - link

    You are using pretzel logic here.

    If you know Barcelona is a significant step forward, why do you need the results posted beforehand?

    Actually, performance would make Barcelona more visible, and if it were better than expected, you'd kill current sales. You can speculate on performance, but you really don't know. The only place you'd really want people to know beforehand is the server market, because people plan these purchases. And guess what? AMD released those numbers, and they were pretty high.

    It's also completely different to know something is coming out and guess at the performance than to actually see the numbers and be thoroughly disgusted with the current performance as a result. I could live with any of the processors today, but once I see one get crushed by the next generation, I don't want it. It hits you on a visceral level, and after that, it's difficult to go back to it. Put another way, say there is a girl you can go out with today who's fairly attractive and would certainly add to your life. You could wait for one that will be more attractive later on, but you don't really need to since this one is more than adequate. Now say you see this bombshell. Do you think you'd really want to go back to the one that wasn't so attractive?

    We're human, we respond to things on an emotional level even when we know we shouldn't. The head never wins against the heart. I'm not sure that's a bad thing either, life would be so uninteresting were it not so.
  • blppt - Friday, May 11, 2007 - link

    "AMD's reasoning for not disclosing more information today has to do with not wanting to show all of its cards up front, and to give Intel the opportunity to react."

    Come on....I'm sure Intel already has a pretty good idea of what they are up against. I'm sure Intel has access to information on their competitors that the general tech public doesn't.
  • michal1980 - Friday, May 11, 2007 - link

    All they said is that there is new stuff coming. Trust me, if the CPUs they had right now were beating the pants off of Intel, they would post the numbers. I'm not saying give us the frequency the CPU runs on, but if they knew that games run 50% faster, they would at least hint at it.

    Nice things: looks like the new mobo chip runs cool; look at how small the HSFs are on those chips.

    Not nice: how hot are these new CPUs? Look at all those fans, it's like a tornado in the case.

    Also not nice: no DATES? All that means is it's even easier to push things back. Winter 2007 becomes early 2008.
  • Ard - Friday, May 11, 2007 - link

    Excellent article as always, Anand. It's nice to finally get some info on AMD and find out that they're not throwing in the towel just yet. Some performance numbers would've been nice but I guess you can't have everything. I did have to laugh at the slide that said S939 will continue to be supported throughout 2007 though, considering you can't even buy new S939 CPUs.
  • Beenthere - Friday, May 11, 2007 - link

    It's a known fact that Intel has had to try and copy the best features of AMD's products to catch up in performance to AMD. Funny how when Intel was secretive and blackmailing consumers for 30 years that was fine but when AMD doesn't give away all of their upcoming product technical info. for Intel to copy, that's not good -- according to some. With Intel being desperate to generate sales for their non-competitive products over the past 2-3 years, they decided to really manipulate the media - and it's worked. The once secretive Intel is the best friend a hack can find these days. They'll tell a hack anything to get some form of media exposure.

    I find AMD's release of info. just fine. If it were not for AMD all consumers would be paying $1000 for a Pentium 90 CPU today and that would be the fastest CPU you could buy. People tend to forget all that AMD has done for consumers. The world would be a lot worse off than it is if it were not for AMD stepping to the plate to take on the bully from Satan Clara.

    Many in the media are shills and most of the media is manipulated by unscrupulous companies like Intel, Asus, and a long list of others. Promise a hack some "inside info." or an insiders' tour so they can get a scoop, or a prototype piece of hardware that has been massaged for better performance than the production hardware, and the fanboy hacks will write glowing opinions about a company's products and chastise the competition every chance they get.

    Unfortunately what was once a useful service - honest product reviews -- is now a game of shilling for dollars. You literally can't believe anything reported at 99% of websites these days because it's usually slanted based on which way the money flows... It's no secret that Intel and MICROSUCKS are more than willing to lubricate the wheels of the ShillMeisters to get favorable tripe.
  • TA152H - Friday, May 11, 2007 - link


    What are you talking about? Intel invented the microprocessor (4004), invented the instruction set used today (8086) and has been getting copied by AMD for years.

    The Athlon was certainly nothing to copy, you could just as easily say they copied the Pentium III (and did a bad job of it, whereas the Core is much better than the Athlon). What's so unique about the Athlon that could be copied anyway? It's a pretty basic design. It worked OK, I guess, but the performance per watt was always poor until the Pentium 4 came around and redefined just what poor meant.

    x86-64 is straightforward, and you can be sure Microsoft designed most of it. I'm not saying this as anything bad about AMD, because who better to design the instruction set than Microsoft? Intel and Microsoft do enough software to understand what is best, AMD is allergic to software, so I think this is a good thing.

    I agree, only slightly, that these review sites are ass-kissers by nature, because they need good relationships with the makers. I doubt they are getting kick-backs, but if, say, Anand were more blunt with his opinions (he always is about a lousy product, after the company comes out with a good one), he'd get cut off from some information or products from that same company. So they kiss ass, because if they write scalding and honest reviews they lose out and can't function as an information site as well. I don't like it, but can you blame him? In his situation, you'd have to do exactly the same thing: give a review in a delicate way without offending the hand that feeds you, while trying to get your point across anyway with the factual data. Tom Pabst was funny as Hell in his old reviews, he took a devil-may-care attitude, but nowadays even that site has accepted the reality of being on good terms with technology companies whenever possible. In the long run, it's worth it.
  • Viditor - Saturday, May 12, 2007 - link


    Intel invented the microprocessor (4004)

    Actually, most of the work was done at Fairchild Semiconductor...that's where both Gordon Moore (founder of Intel) and Jerry Sanders (founder of AMD) worked together.
    Moore left FS in 1968 to form Intel (along with Bob Noyce) and Sanders left in 1969 to form AMD.
    Intel began as a memory manufacturer, but Busicom contracted them to create a 4-bit CPU chip set architecture that could receive instructions and perform simple functions on data. The CPU became the 4004 microprocessor...Intel bought back the rights from Busicom for US$60,000.
    Interestingly, TI had a system on a chip come out at the same time, but they couldn't get it to work properly so Intel got the money (and the credit).

    What's so unique about the Athlon that could be copied anyway? It's a pretty basic design

    You're kidding, right??
    1. Athlon had vastly superior FP because of its super-pipelined, out-of-order, triple-issue floating point unit (it could operate on more than one floating point instruction at once)
    2. Athlon had the largest level 1 cache in x86 history
    3. When it was first launched, it showed superior performance compared to the reigning champion, Pentium III, in every benchmark
    4. Three generalized CISC to RISC decoders
    5. Nine-issue superscalar RISC core
    Just look at the reviews during release (you might think it's similar to the C2D reviews...)
    http://www.aceshardware.com/Spades/read.php?articl...">Aces Hardware


    x86-64 is straightforward, and you can be sure Microsoft designed most of it.

    That's just silly...while I'm sure MS had plenty of input, there are no chip architects on their staff that I'm aware of (in other words nobody there COULD design it).
    It's like saying that when a Pro driver gives feedback to the engineers on what he wants, he's the one who designed the car...don't think so.
  • TA152H - Sunday, May 13, 2007 - link

    What's your point about the 4004? You're giving commonly known information that in no way changes the fact that Intel invented the first microprocessor. It wasn't for themselves, initially, but it was their product. AMD didn't create it, and they didn't create the other microprocessors either; they were a second source for Intel. Look at their first attempt at their own x86 processor to see how good they were at it, the K5. It was late, slower than expected, and created huge problems for Compaq, which had bet on them. Jerry Sanders was smart enough to buy NexGen after that.

    You are clearly clueless about microprocessors if you think any of those things you mention about the Athlon are in any way anything but basic.

    The largest L1 cache is a big difference??? Why that's a real revolution there!!!! They made a bigger cache! Holy Cow! Intel still hasn't copied that, by the way, so even though it's nothing innovative, it was still never copied.

    The FP unit was NOT the first pipelined one, the Pentium was and the Pentium Pro line was also pipelined, or superpipelined as you misuse the word. Do you know what superpipelined even means? It means lots of stages? Are you implying the Athlon was better in floating point because it had more floating point stages? Are you completely clueless and just throwing around words you heard?

    Wow, they had slightly better decoding than the P6 generation!!!! Wow, that's a real revolutionary change.

    You're totally off on this. They did NOTHING new on it, it was four years later than the Pentium Pro, and barely outperformed it, and in fact was surpassed by the aging P6 architecture when the Coppermine came out. It was much bigger, used much more power, and had relatively poor performance for the size and power dissipation. The main problem with the P6 was the memory bandwidth too, if it had the Athlon's it would have raped it, despite being much smaller. I don't really call that a huge success. Although, it does have to be said the Athlon was capable of higher clock speeds on the same process. Still, it was hardly an unqualified success like the Core 2, which is good by any measure.

    The Core 2 is MUCH faster than the Athlon 64, and isn't a much larger and much more power hungry beast. In fact, it's clearly better in power/performance than the Athlon 64. The Athlon was dreadful in this regard.

    I was talking about the instruction set with regards to Microsoft, which should have been obvious since x86-64 is an instruction set, not an architecture. And yes, they did design most of it, if not all. Ask someone from Microsoft, and even if you don't know one, use some common sense. Microsoft writes software, and compilers, and have to work with the instruction set. They are naturally going to know what works best and what they want, and AMD has absolutely no choice but to do what Microsoft says to do. Microsoft is holding a Royal Flush, AMD has a nine high. Microsoft withholding support for x86-64 would have made it as meaningless as 3D Now! They knew it, AMD knew it, and Microsoft got what they wanted. Anything else is fiction. Again, use common sense.
  • hubajube - Friday, May 11, 2007 - link


    What are you talking about?
    Dude, WTF are YOU talking about? Allergic to software? Is that an industry phrase? YOU have NO idea what AMD did or didn't do in regards to X86-64 so how can you even make a comment on it?
