AMD - The Road Ahead

by Anand Lal Shimpi on May 11, 2007 5:00 AM EST
AMD in Consumer Electronics

The potential of Fusion extends far beyond the PC space and into the embedded space. If you can imagine a very low power, low profile Fusion CPU, you can easily see it being used in not only PCs but consumer electronics devices as well. The benefit is that your CE devices could run the same applications as your PC devices, truly encouraging and enabling convergence and cohabitation between CE and PC devices.

Despite both sides attempting to point out how they are different, AMD and Intel actually have very similar views on where the microprocessor industry is headed. Both companies have stated to us that they have no desire to engage in "core wars"; in other words, we won't see a race to keep adding cores. The explanation is the same one that applied to the GHz race: if you scale exclusively in one direction (clock speed or number of cores), you will eventually run into the same power wall. The true path to performance is a combination of increasing instruction level parallelism, clock speed, and core count in line with the demands of the software you're trying to run.

AMD has been a bit more forthcoming than Intel in this respect, indicating that it doesn't believe there's a clear sweet spot, at least for desktop CPUs. AMD doesn't believe there's enough data to conclude whether 3, 4, 6 or 8 cores is the ideal number for desktop processors. In our testing with Intel's V8 platform, an 8-core platform targeted at the high end desktop, we found it extremely difficult to find high end desktop applications that benefit at all from 8 cores over 4. Our instincts tell us that for mainstream desktops, 3 - 4 general purpose x86 cores appear to be the near term target that makes sense. You could potentially lower the number of cores needed by pairing them with specialized hardware (e.g. an H.264 encode/decode core).
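The diminishing returns we saw on the V8 platform are what Amdahl's law predicts. A minimal sketch; the 60% parallel fraction below is an illustrative assumption, not a measured figure:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of a workload scales with core count."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical desktop workload that is 60% parallelizable:
for n in (1, 2, 4, 8):
    print(f"{n} cores: {amdahl_speedup(0.6, n):.2f}x")
```

With those numbers, doubling from 4 cores to 8 raises the bound from about 1.82x to only about 2.11x, which is consistent with how hard it is to find desktop applications that benefit from 8 cores over 4.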

What's particularly interesting is that many of the same goals Intel has for the future of its x86 processors are in line with what AMD has planned. For the past couple of IDFs Intel has been talking about bringing to market a < 0.5W x86 core that can be used for devices that are somewhere in size and complexity between a cell phone and a UMPC (e.g. the iPhone). Intel has committed to delivering such a core, called Silverthorne, in 2008, based around a new micro-architecture designed for these ultra low power environments.

AMD confirmed that it too envisions ultra low power x86 cores for use in consumer electronics devices, areas where ARM or other specialized cores are commonly used. AMD also recognizes that it can't address this market by simply reducing the clock speed of its current processors, and thus mentioned that it is working on a separate micro-architecture for these ultra low power markets. AMD didn't attach any timeframe or roadmap to its plans, but knowing what we know about Fusion's debut we'd expect a lower power version targeted at UMPC and CE markets to follow.

Why even think about bringing x86 cores to CE devices like digital TVs or smartphones? AMD offered one clear motivation: the software stack that will run on these devices is going to get more complex. Applications on TVs, cell phones and other CE devices will grow complex enough to require faster processors. Combine that with the fact that software developers don't want to target multiple processor architectures when they deliver software for these CE devices, and using x86 as the common platform between CE and PC software creates an environment where the same applications and content can be available across any device. The goal of PC/CE convergence is to give users access to any content, on any device, anywhere; if all the devices you're trying to access content and programs on happen to be x86, the process becomes much easier.

Why is a new core necessary? Although x86 can be applied to virtually any market segment, a given core design is typically only useful across about an order of magnitude of power. For example, AMD's current desktop cores can easily be scaled up or down to hit TDPs in the 10W - 100W range, but they would not be good for hitting something in the sub-1W range. AMD can address the sub-1W market, but doing so will require a different core than the one it uses for the rest of the market. This philosophy is akin to what Intel discovered with Centrino: to succeed in the mobile market, you need a mobile specific design. To succeed in the ultra mobile and handtop markets, you need an ultra mobile/handtop specific processor design as well. Both AMD and Intel realize this, and both companies have now publicly stated that they are doing something about it.

55 Comments

  • Regs - Friday, May 11, 2007 - link

    Being tight lipped does make AMD look bad right now, but it could be even worse for them after Intel has its way with the information alone. I'm not talking about technology or performance, I'm talking about marketing and pure business politics.

    Intel beat AMD to market by a huge margin and I think it would be insane for AMD to go ahead and post numbers and specifications while Intel has more than enough time to make whatever AMD is offering look bad before it hits the shelves or comes into contact with a Dell machine.

  • strikeback03 - Friday, May 11, 2007 - link

    quote:

    Apparently Intel suspects something is going on as well. One look at the current prices of the E6600 C2D should confirm this, as it's currently half the price of what it was a month ago. Unless there is something else I am missing, but the Extreme CPUs still seem to be hovering around ~$1000 USD.


    Intel cut the price of all the C2D processors by one slot in the tree - the Q6600 to the former price of the E6700, the E6700 to the former price of the E6600, the E6600 to the former price of the E6400, etc. Anandtech covered this a month or so ago after AMD cut prices.

    quote:

    After a while this could be a problem for the consumer base, and may resemble something along the lines of how a lot of Linux users view Microsoft, with their 'Monopoly'. In the end, 'we' lose flexibility, and possibly the freedom to choose what software will actually run on our hardware. This is not to say I buy into this belief 100%, but it is a distinct possibility.


    I wonder as well. Will it be relatively easy to mix and match features as needed? Or will the offerings be laid out that most people end up paying for a feature they don't want for each feature they do?
  • yyrkoon - Friday, May 11, 2007 - link

    quote:

    I wonder as well. Will it be relatively easy to mix and match features as needed? Or will the offerings be laid out that most people end up paying for a feature they don't want for each feature they do?


    Yeah, it's hard to take this piece of 'information' without a grain of salt added. On one hand you have the good side, true integrated graphics (not this shitty thing of the past, hopefully . . .), with full bus speed communication and whatnot, but on the other hand you cut out discrete manufacturers like nVidia; in the long run we are not just talking about discrete graphics cards, but also one of the best/competing chipset makers out there.
  • Regs - Friday, May 11, 2007 - link

    The new attitude Anand displays with AMD is more than enough and likely the whole point of the article.

    AMD is changing for a more aggressive stance. Something they should have done years ago.

  • Stablecannon - Friday, May 11, 2007 - link

    quote:

    AMD is changing for a more aggressive stance. Something they should have done years ago.


    Aggressive? I'm sorry, could you refer me to the article that gave you that idea? I must have missed it while I was at work.
  • Regs - Friday, May 11, 2007 - link

    Did you skim?

    There were at least two whole paragraphs. Though I hate to quote so much content, I guess it's needed.

    quote:

    Going into these meetings, in a secluded location away from AMD's campus, we honestly had low expectations. We were quite down on AMD and its ability to compete, and while AMD's situation in the market hasn't changed, by finally talking to the key folks within the company we at least have a better idea of how it plans to compete.



    quote:

    There's also this idea that coming off of a significant technology lead, many within AMD were simply complacent and that contributed to a less hungry company as a whole. We're getting the impression that some major changes are happening within AMD, especially given its abysmal Q1 earnings results (losing $611M in a quarter tends to do that to a company). While AMD appeared to be in a state of shock after Intel's Core 2 launch last year, the boat has finally started to turn and the company that we'll see over the next 6 - 12 months should be quite different.

  • sprockkets - Friday, May 11, 2007 - link

    What is there that is getting anyone excited to upgrade to a new system? We need faster processors and GPUs? Sure, so we can play better games. That's it?

    Now we can do HD content. I would be much more excited about that except it is encumbered to the bone by DRM.

    I just wish we had a competent processor that only needs a heatsink to be cooled.

    quote:

    AMD showed off the same 45nm SRAM test vehicle we saw over a year ago in Dresden, which is a bit bothersome.


    Not sure what you are saying since over a year ago they would have been demoing perhaps 65nm cells, but whatever.

    And as far as Intel reacting, they are already on overdrive with their product releases, FSB bumps, updating the CPU architecture every 2 years instead of 3, new chipsets every 6 months, etc. I guess when you told people we would have 10GHz Pentium 4s and lost your credibility, you need to make up for it somehow.

    Then again, if AMD shows off benchmarks, what good would it do? The desktop variants we can buy are many months away.
  • Viditor - Saturday, May 12, 2007 - link

    quote:

    Not sure what you are saying since over a year ago they would have been demoing perhaps 65nm cells, but whatever

    In April of 2006, AMD demonstrated 45nm SRAM. This was 3 months after Intel did the same...
  • sprockkets - Friday, May 11, 2007 - link

    To reply to myself, perhaps the Fusion project is the best thing coming. If we can have a standard set of instructions for CPU and GPU, we will no longer need video drivers, and perhaps we can have a set that runs at very low power. THAT is what I want.

    Wish they talked more of DTX.
  • TA152H - Friday, May 11, 2007 - link

    I agree with you about only needing a heat sink, I still use Pentium IIIs in most of my machines for exactly that reason. I also prefer slotted processors to the lame socketed ones, but they cost more and are unnecessary so I guess they aren't going to come back. They are so much easier to work with though.

    I wish AMD or Intel would come out with something running around 1.4 GHz that used 10 watts or less. I bought a VIA running at 800 MHz a few years ago, but it is incredibly slow. You're better off with a K6-III+ system; you get better performance and about the same power use. Still, it looks like Intel and AMD are blind to this market, or at least myopic, so it looks like VIA/Centaur is the best hope there. The part I don't get is why they superpipeline something for high clock speed when they are going for low power. It seems to me an upgraded K6-III would be better at something like this, since by comparison the Pentium/Athlon/Core lines offer poor performance for the power compared to the K6 line, considering it's made on old lithography. So does the VIA, and that's what it's designed for. I don't get it. Maybe AMD should bring it back as their ultra-low power design. Actually, maybe they are. On a platform with reasonable memory bandwidth, it could be a real winner.
