
  • itsthejedi - Monday, October 29, 2012 - link

    Anything that can help their position in the marketplace sounds good to me.
  • Marburg U - Monday, October 29, 2012 - link

    What position?
    I'm seriously asking. What do they want to be when they grow up?
    I have no idea, do they?
  • Symmetry81 - Monday, October 29, 2012 - link

    Well, they did want to sell Bulldozers to people who wanted a good performance/price ratio per chip... but it looks like there aren't actually many of that sort of customer anymore, even if the execution hadn't been botched.

    Now they're looking for people who want performance/power and both have highly threaded workloads and don't need all their threads to be memory coherent. Which is actually a real market! Not the biggest, but Facebook will certainly want to talk to them for instance.
  • hrga - Tuesday, October 30, 2012 - link

    ROFL. You're right about that, we should really ask them what they wanna be when they grow up.

    But this time they're back on their long-running plan to play a main role in the server market, which they've pursued ever since they picked up Alpha technology and released their first commercially successful desktop CPU, the K7, which was later renamed to the widely popular Athlon brand.

    Ever since they announced their intention to supplement Intel's oldish IA32 x86 instructions with the now commonly known x86-64 instructions, their ambitions to grab a part of the server market were ridiculized.

    The only problem i see with this long perpetuatiing intention is that AMD deliberately neglects the desktop market as irrelevant, while their competitiveness resembles what they had in the K5 vs. Pentium era, and that doesn't seem to bother them at all.
  • rangerdavid - Friday, November 02, 2012 - link

    "Ridiculized" is an awesome new word, props for that - but "perpetuatiing." should be spelled "perpetuatiiiing."

    Thank you.
  • jjj - Monday, October 29, 2012 - link

    Cost-wise it might be cheaper to go with an ARM core, but AMD's only advantage would be the fabric. A custom core could offer more of an advantage over the other ARM players, and they could have a go at consumer markets too (plus it would be far more interesting).
  • RussianSensation - Monday, October 29, 2012 - link

    "AMD is not designing its own ARM cores at this point or acquiring an ARM architecture license to do so in the future.. The decision to use ARM’s own core design—the Atlas 64-bit core that is a successor to the ARM A15—is apparently driven by AMD’s time-to-market goals."
  • hrga - Tuesday, October 30, 2012 - link

    That EETimes article is really incomprehensibly written ... "merchant chip", "ARM A15" ... I know of "merchant goods", "marketed chip product", and "Cortex-A15", which is the ARM core design. I wonder if Lisa Su wrote the article for them.
  • MySchizoBuddy - Monday, October 29, 2012 - link

    A custom core can come later on, just like Apple did it.
  • Beenthere - Monday, October 29, 2012 - link

    This is a good step forward for AMD in a developing market segment. It will be another revenue stream, which is smart business.
  • Jaybus - Wednesday, October 31, 2012 - link

    The market is still unclear at this point. The Samsung Exynos 4210 has two A9 cores with a TDP of 4.5 W, and at 1.2 GHz it achieves 1,380 MIPS in the 7-Zip LZMA compression benchmark. The two-core i3-2100 at 3.1 GHz with a TDP of 65 W achieves 8,800 MIPS in the same test. But the real competition is the Xeon E3-1220L, which has two cores and a 17 W TDP.

    What I see is that Intel has had a power efficiency issue, whereas the ARM chips have a performance issue. But Intel is clearly getting better with, for example, the Xeon E3-1220L and its 17 W TDP at 2.3 GHz (3.5 GHz turbo). It would take at least 12 of the 1.2 GHz A9 cores to match the performance of the E3-1220L's two cores. Does the new 64-bit ARM core in the AMD chip gain enough in performance to compensate for the gains Intel has made in power efficiency?
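    The core-count estimate above can be sanity-checked with a quick back-of-envelope script (scaling MIPS linearly with clock speed across the two Sandy Bridge parts is a rough illustrative assumption, not measured data):

```python
# Back-of-envelope check of the figures in the comment above (all numbers
# are the commenter's; linear MIPS-with-clock scaling is an assumption).

exynos_mips, exynos_tdp = 1380, 4.5      # 2x Cortex-A9 @ 1.2 GHz, 7-Zip LZMA
i3_mips, i3_tdp, i3_ghz = 8800, 65, 3.1  # Core i3-2100, 2 cores
e3_base_ghz, e3_turbo_ghz = 2.3, 3.5     # Xeon E3-1220L, 2 cores, 17 W

a9_core_mips = exynos_mips / 2  # per-core throughput of one 1.2 GHz A9

# Estimate the E3-1220L by scaling the i3-2100 result with clock speed
e3_base_mips = i3_mips * e3_base_ghz / i3_ghz
e3_turbo_mips = i3_mips * e3_turbo_ghz / i3_ghz

print(f"MIPS/W: Exynos {exynos_mips / exynos_tdp:.0f}, "
      f"i3-2100 {i3_mips / i3_tdp:.0f}")
print(f"A9 cores to match E3-1220L: {e3_base_mips / a9_core_mips:.1f} "
      f"(base), {e3_turbo_mips / a9_core_mips:.1f} (turbo)")
```

    This lands at roughly 9-14 A9 cores depending on whether the Xeon runs at base or turbo clocks, in the same ballpark as the "at least 12" figure, while the Exynos still wins on raw MIPS per watt.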

    The low power server market is really just not that clear. It is a risky move, which is why AMD is not designing its own ARM-based core just yet. But they have lost their competitiveness in the high performance server market, so they are trying to branch out. There could be gains to be made in the ultra-low power server market, such as NAS boxes, media servers, etc.

    Also, I wonder if AMD might be thinking about an ARM based APU SoC for the mobile market, since Apple seems to be at odds with Samsung these days.
  • fancarolina - Monday, October 29, 2012 - link

    I hope they make this chip. This would be ideal for Windows 8 tablets. It would be able to turn off the x86 cores while in Metro and not doing anything heavy. Then when you fire up an x86 application, it can flip them on and you have the power of a full processor. You would get the best of both worlds: ARM efficiency on the go and x86 application support when you want it.
  • KitsuneKnight - Monday, October 29, 2012 - link

    I don't believe AMD is going to be making any ARM+x86 CPUs (at least not based on this announcement). Maybe in the future... but I don't think that'd be an ideal situation. Certainly wouldn't be easy at all to handle.
  • MySchizoBuddy - Monday, October 29, 2012 - link

    How many cores will the Opteron chips actually have?
  • kukreknecmi - Monday, October 29, 2012 - link

    ""I do think that a Jaguar based Opteron would likely be the best route for AMD, but it would also likely require a bit more effort than integrating an ARM core.""

    What does "require a bit more effort" mean here? Aren't Jaguar cores already developed and ready to go? Does Jaguar need special treatment to become a server chip, like the Atom-based ones? From what I know, Jaguar could more or less be produced soon, yet the ARM-based chips are more than a year out.

    So again, why does the Jaguar route need more effort? Or is AMD up to something, and the ARMv8-based chip will be a beast that wipes out Jaguar, so there is no need to insist?
  • mayankleoboy1 - Monday, October 29, 2012 - link

    Would it make sense to make a GPU by gluing together many ARM cores with high-bandwidth interconnects?
  • Ryan Smith - Tuesday, October 30, 2012 - link

    In short, no. A GPU is fundamentally composed of many very simple math units, along with specialty units like texture units and ROPs. Not only would ARM cores be a poor choice for the math units (they're relatively complex) but it would still lack the kind of specialty units you need to flesh out the rest of the GPU.
  • KitsuneKnight - Tuesday, October 30, 2012 - link

    For a GPU? The performance would likely be pretty horrible (or, if you scaled it up enough, ignoring scaling limitations, the power consumption would be horribly massive). While you might be able to get a lot of ARM cores for the same power as laptop/desktop GPUs, they'd likely take up a huge amount of physical space and still not match the performance.

    The ARM cores would have a massive amount of duplicated silicon that wouldn't be used very much. It'd beat the GPU at workloads the GPU sucks at (ones GPUs aren't regularly used for... non-graphics stuff), but it won't necessarily beat a similar x86 chip at those workloads.

    It'll be interesting to see where ARM actually manages to go, but I doubt it's as different from x86 as a lot of people like to assume. Intel has shown an amazing ability to produce an x86 chip that's actually in the same ballpark as ARM chips... we still need to see ARM produce (well, design) a chip with high performance (as in, something that's usable for random laptop/desktop workloads... not just light tablet workloads).
  • nofumble62 - Tuesday, October 30, 2012 - link

    It does not have PCIe gen 3, so why is it called high speed?
  • Jaybus - Wednesday, October 31, 2012 - link

    No. Intel already tried something similar with Larrabee.
  • iwod - Monday, October 29, 2012 - link

    We now know ARM's memory controller is its weakest area. Does the license mean AMD will be using the same controller for server applications? Or would they design something different?

    I think there are an increasing number of roles in the web server market with low CPU usage but high memory usage, such as caching servers. That is a perfect fit for an ARM-type CPU.

    And it is interesting that it was AMD who said they wanted x86 from top to bottom, that x86 would shrink and grow to fit. Then they came up with x64. Intel went the other way, wanting IA-64, aka Itanium, for the high end and the XScale ARM CPU for the low end. It was x64 that forced Intel to rethink and license it; Intel then sold off XScale and has now practically given up on Itanium. Now they have swapped positions: it is Intel who thinks x86 should be used from top to bottom, and AMD has decided to use ARM. How times change.
  • booleanalgebra - Monday, October 29, 2012 - link

    Not sure about AMD, but this does lend credibility to the many smaller ARM server players like AppliedMicro and Calxeda who have been at it for years already.

    AMD was one of the two (main) members of the exclusive x86 club. For them to abandon that exclusivity and go with standard ARM cores is a testament in itself. But giving up that exclusivity also means they need to compete with the likes of AppliedMicro, who have a head start on them. There will also be other ARM server players coming out of the woodwork, so AMD will have to count on better execution, which is not their forte.
  • andrewaggb - Tuesday, October 30, 2012 - link

    AMD isn't competing well in any market right now that I can see, other than bargain laptops and desktops. Their graphics lead (in perf/watt/$) is lost, as NVIDIA has really turned it around.

    By all indications, Haswell's integrated graphics may catch up (at least in benchmarks) to AMD's integrated graphics.

    They're almost hopelessly behind Intel in single-threaded performance; they have no smartphone wins, no Windows 8 tablet wins (that I know of), and almost no ultrabook-alikes.

    I don't know that switching to ARM will help them either. Sure, you can use somebody else's CPU designs, but that's not a competitive advantage. You're likely using the same foundry as everybody else as well. Your graphics are good, but not especially low power, and you've conceded the high end to Intel.

    I don't want AMD to go away, but entering a very competitive market that Texas Instruments just pulled out of seems risky to me. Unless you're sure you can one-up Qualcomm, NVIDIA, and Apple, what's the point? I doubt they'll be one-upping anybody using stock core designs.

    Hopefully I'm wrong.
  • dvinnen - Tuesday, October 30, 2012 - link

    I would feel better about this venture if they were designing their own chips. I can imagine an ARM processor with an AMD memory controller and HyperTransport links built in could be a cool and viable product.

    I'm not seeing that in just rebranding an ARM core as an Opteron.
  • rrohbeck - Tuesday, October 30, 2012 - link

    Once they make their own designs, I can see interesting options:
    An ARM APU with a Radeon GPU for graphics performance and GPGPU (HSA)
    An ARM CPU that combines an ARM instruction decoder with a high performance back end
    An x86 Opteron with a low end GPU and a built-in ARM core as an IPMI BMC
    And my favorite: A CPU that supports both x86 and ARM instructions. "Just" add an ARM instruction decoder to an Opteron.
  • hrga - Tuesday, October 30, 2012 - link

    I like this part about the IPMI BMC, but that doesn't seem viable, as this kind of specific CPU would still require a Serial Port Interface + Super I/O + switching logic, and all of that doesn't work at low CPU voltages of 1.2-1.4 V ... some IPMI substitute could be blueprinted this way.

    My favorite would be a CPU that doesn't "just support both instruction sets" [ARMv8 and x86-64] but something I could use without an OS reboot as a power saving feature. But even crappy new Win8 doesn't support that yet (and probably won't!).
    So at least, in the new desktop socket that will be introduced for HSA (probably with Excavator @ 20nm), they could integrate two "el cheapo" Jaguar cores to do a similar trick, and every OS could handle that without any issue, since both share the same x86-64 architecture.

    The one good market angle for ARMv8 is if they go the same way NVIDIA announced for its Tesla line: integrate a few ARMv8 cores directly onto the GPGPU, which could then boot directly without the need for any traditional x86-64/IA64 CPU.
  • nofumble62 - Tuesday, October 30, 2012 - link

    Dirk got fired because he held on to x86 too long and let Nvidia jump ahead on ARM.

    I guess that has always been the board's intention. So they fired all the people who didn't go along with them.

    Does AMD have a chance with ARM? Yes.
    Do they have enough time? No. They will run out of cash way before the first chip launches.
  • Metaluna - Tuesday, October 30, 2012 - link

    As I see it, Intel is really the last of the companies that can sell CPUs as their defining product and profit center. ARM is a race to the bottom profit-wise. You can differentiate a little bit with custom architectures, but ARM cores are pretty much a commodity. That's not a bad thing necessarily, but it means that CPUs become just a small part of an overall product line. Nobody thinks of Qualcomm, Broadcom, Samsung, Apple, etc. as a processor company, nor is anyone paying a premium for ARM-based processors.

    What I'm saying is, if AMD is going to go this route, it needs to become a much more diversified company than it currently is.
  • melgross - Tuesday, October 30, 2012 - link

    This is AMD after all. So whatever they come out with will be a year late, underpowered, and use too much current.

    This just looks like a desperate move by them, as they know they will always be behind Intel in that space.
  • nukunukoo - Friday, November 09, 2012 - link

    Seriously, will the company survive to see that happen in 2014?
