
  • spidey81 - Thursday, February 02, 2012 - link

    With AMD's focus going in the direction of mobile/AIO or server parts, will the consumer market ever see anything directed at the desktop enthusiast market? I guess I'm still hopeful to see a trickle-down from the server market, or desktop innovation trickling down to the mobile sector as has happened in the past. Maybe it's just time to jump ship to Intel for my next gaming/OC rig.
  • arjuna1 - Thursday, February 02, 2012 - link

    You ninja'd my post, but, exactly the same feeling. But you know what?? If they dare to give me the middle finger, I'll give them the middle finger right back; no problem making my next build Intel/Nvidia.
  • spidey81 - Thursday, February 02, 2012 - link

    I just upgraded from a PII X3 720 to an FX-8120. It's frustrating to know that even with it clocked at 4.5GHz I'm still not going to get the performance I would have with a 2500K. I've never built with anything other than AMD and really don't plan on changing that. However, it's getting increasingly difficult to support them.
  • just4U - Thursday, February 02, 2012 - link

    Hey Spidey... you're not missing too much. I've built several i5/i7 setups and use one every day. But I've also picked up an FX-6100 and it's pretty good too. I don't mind switching back and forth, and while there may be a slowdown in some games... some apps... I don't notice it unless I am actually looking at the numbers. They all seem fast overall.
  • Sabresiberian - Thursday, February 02, 2012 - link

    My experience is different. I have built 2 computers, one on the i7 920 and one on the Phenom II 955. The difference is clear and significant playing World of Warcraft, and any other MMORPG.

    My experience is mirrored by Anandtech and Tomshardware benchmarks.

    Now, if you aren't a person that actually uses all the performance he can get, the Phenom II is fine, but even having been an AMD fan for years, I won't go back and cut my nose off to spite my face, as they say.

    AMD has chosen a different path than I would like for them to have, but I'm not going to fault them for it. I'm disappointed as an enthusiast builder, but I certainly recognize there is a far wider market than CPUs for people like me. However, it also means they no longer are interested in supplying what I want, so we must part ways.

  • Spoelie - Friday, February 03, 2012 - link

    Have been an AMD/ATI loyalist for a long time, and have only built AMD/ATI setups *for my personal use*. But I always only upgraded to a product that was either very competitive (Tbird, A64, X2, PhII early on) or dirt cheap and very overclockable (Tbred, Barton for example) - holding out the times AMD wasn't very competitive (kept my X2 pretty long, skipped PhI).

    The thing is that my 3-year-old DDR2 Deneb @ 3.3GHz has never felt inadequate at all, helped by an SSD and yearly GPU upgrades.

    When the time comes, however, I'll have no qualms switching to Intel's latest and greatest, in the same spirit as Sabresiberian.
  • wumpus - Wednesday, June 27, 2012 - link

    >The difference is clear and significant playing World of Warcraft, and any other MMORPG.

    WoW was released back in 2004. I very much doubt a modern CPU would notice the difference (I used to play Dungeons and Dragons Online (2006) with a 2GHz Sempy, and it ran just fine). Methinks you have different GPUs and that might just make the difference (WoW used to be famous for not stressing the GPU; I doubt they have changed it).

    Still, as someone who has always liked AMD more than Intel, I suspect I will wind up buying more Intel processors in the future (the fact that every single Intel processor I've bought has been deliberately crippled annoys the bejesus out of me).

    Face it, the desktop is "dying" (read becoming a mature tech that doesn't obsolete itself every Thursday). Don't expect every high tech company to want to swoop down and grab a piece of the pie anymore. Intel will have a hard enough time with every "tock" competing with the previous "tick".
  • GotThumbs - Friday, February 03, 2012 - link

    I think one of the key factors you're leaving out is the cost/price difference. I've built all-AMD systems since my first Pentium II build, and I've been quite happy with the systems and the performance I've gotten, while still having some cash left in my pockets. I'm even looking at down-sizing my system to an APU on an ITX board with an SSD. Today's CPUs meet probably 95% of the market's needs. It's only a select few who need hard-core performance on an hourly basis and can justify spending huge amounts of money on a high-powered system. Higher CPU speed is not the only focus in today's market; battery life and user experience are what matter. If you can get the same experience with a lower-speed processor, then why pay more... bragging rights only take you so far.

    I think AMD has matured and is no longer concerned with competing with Intel on having the biggest and baddest CPUs. Most general consumers barely use a third of their systems' capacity.
  • bill4 - Thursday, February 02, 2012 - link

    It's funny how people hate AMD so much they automatically push this "AMD is getting out of the high end!" agenda in posts all over the internet. It's not commentary, it's your hope.

    Nvidia is the only one getting out of the high end, since they don't even have a competitor to Tahiti.

    AMD's Bulldozer was definitely a play at the high end; it's a huge, ambitious chip, it just sucks.

    Get your head out of your ass, Nvidia fanboys. AMD is not going anywhere no matter how hard you wish it.
  • arjuna1 - Thursday, February 02, 2012 - link

    Hey bill, I've been building AMD/ATI since the K6/9800, so why don't you just STFU and learn to read before opening your mouth.
  • spidey81 - Thursday, February 02, 2012 - link

    There's nothing about what I asked that suggested AMD pander to the so-called "high end". But they have been going after the mainstream enthusiast market with their unlocked CPUs. I don't have an unlimited budget, and the most I've spent on a CPU or GPU is $300. I'm just hoping I'll see some better competition in the "value" enthusiast segment.
  • Sabresiberian - Friday, February 03, 2012 - link

    I certainly would have agreed, at the time I built my systems, that the cost/performance value favored AMD's Phenom II. I even recommended it over the 1366-based platform to people building new computers who could not afford to spend the extra money.

    However, I was an early adopter of Nehalem, and paid a premium for what I got. Today isn't yesterday, we have an extensive Sandy Bridge lineup to choose from, and AMD's offerings don't hold up.

    As an example, I direct you to this article:

    The article is gaming-oriented, and many readers don't fit in that category, but even so I think it is of interest to those building or buying a new computer for any purpose. The fact is, AMD isn't just "losing" the performance "war", it is losing the cost-effectiveness war as well. (I don't want to downplay the fact that gaming benchmarks can have little to do with other needs, but if you really check prices and benchmarks for more business-oriented applications, I think you'll find I'm not far off the mark there either, for desktop CPUs.)

    I suggest that far more than 5% of users would benefit from using an Intel offering over an AMD at any price point.

  • wumpus - Wednesday, June 27, 2012 - link

    First, the last Intel CPU I've used was a P3-based Celery. Looks like I'll be going to Intel sooner or later.

    Two exceptions:
    First, those of us with AM3+ sockets will have to decide if a new motherboard is worth it. This changes the value calculation enough to matter for that 5%.

    Second, I suspect Anandtech readers get roped into "supporting" more computers than they use outright. Even on the desktop, Llano-type CPUs will become more and more appropriate for any purchase not based on geek lust.
  • riottime - Thursday, February 02, 2012 - link

    i'll wait until steamroller to upgrade my amd build. i plan to get ivy to upgrade my intel build. i will skip the 7xxx series graphics cards and stick with my 5xxx series for both my amd and intel builds. they're still good. ;)
  • Taft12 - Thursday, February 02, 2012 - link

    "will the consumer market ever see anything directed at the desktop enthusiast market?"

    Yes -- high-end gaming cards.

    Rory Read and Anand made it very clear. The war is over for the high-end desktop. Intel won. But that market barely exists anymore - at least not in volumes that AMD or Intel care about, despite the high margins of i7 and FX CPUs.

    Neither will be investing much R&D there anymore. Look on the bright side, at least your Sandy Bridge system is future proof in the sense that neither company will produce something any better to tempt you to upgrade.
  • SlyNine - Thursday, February 02, 2012 - link

    LOL, you talk as if there ever was a big market for it. Sorry to ruin your whole post by saying nothing has changed.
  • EyelessBlond - Friday, February 03, 2012 - link

    It has, though. I used to spend a decent amount of money on the CPU for my PC, and upgraded every couple of years. Nowadays, when I scrape together a few hundred dollars to upgrade my system I'm looking at a third/fourth/sixth monitor, an SSD, a good mechanical keyboard/gaming mouse, a better GPU. The CPU just isn't that important any more; now that a good quad/hex-core from two years ago can keep my system running strong, I can focus on other priorities.
  • jabber - Friday, February 03, 2012 - link

    Spot on. As soon as dual cores appeared it was game over in terms of CPU power needs for most folks.

    I now go longer and longer between CPU upgrades. I also don't even bother looking at anything higher than $150 for my needs. I have better things to do than run benchmarks all day. I'd rather spend the money, as you say, on a better monitor or something I'll really notice like an SSD.

    I used to buy Opterons/DDR500/Raptors etc., but these days I just don't find that stuff really matters.

    Ultimate performance items contribute very little to the overall average user experience.
  • SlyNine - Saturday, February 04, 2012 - link

    Yeah, but I'm talking about the ultra-high-end $1000 CPU market, when you could buy 90% of that speed for $200. That market was never big.

    I do agree, though. The important segment, mid-range, has dropped off because older CPUs are still fast enough. I have a 2600K now, and the only real reason I upgraded from my i7 860 was because my mobo was giving me minor trouble and it was hard to justify buying a new one. So yeah, times in general are changing; I just don't agree that the ultra high end has changed much.
  • EyelessBlond - Saturday, February 04, 2012 - link

    Oh, but it has. There are so many other things you can be spending your high-end dollar on nowadays: Eyefinity setups, expensive keyboards/mice/HD audio, tablets, home automation, SSD RAID setups, etc. All of these things will bring more to the high-end desktop than that $1000 CPU; all it is good for is CPU-intensive tasks, and if you have a workload that necessitates that, you can purchase time on Amazon EC2 and get things done far faster than on a single-CPU desktop.
  • Sabresiberian - Friday, February 03, 2012 - link

    The market barely exists??

    Won't be spending much on R&D??

    You people just make this stuff up as you go along. The enthusiast market is flourishing. Dozens of companies do billions of dollars worth of business in the world selling components to enthusiast builders.

    Even where the market share for enthusiast builds (somehow separated from "performance" builds in the xbitlabs article) is predicted to decline, it shows a stable number of computers being built in that category because the market overall increases. The research company also offers the opinion that the high-end market will "always" be a good market.

    Is the high-end consumer market the bulk of Intel sales? Of course not, but make no mistake - they DO make good money off of high-end CPUs.

    Would you throw away a billion dollars worth of business in one aspect of what you do because you do 10 billion in another area? Somehow, I don't think that's what Intel has in mind.
  • wumpus - Saturday, April 14, 2012 - link

    I suspect that Intel merely wanted to deny AMD the market. Starving AMD means they won't have the R&D resources needed to attack the high-margin server market.

    This seems to have been Intel's strategy since the first Celeron (no enthusiast would buy a K6 over an overclocked Celery. No corporate customer would be caught dead buying a Celery. Win-win!).
  • Malih - Thursday, February 02, 2012 - link

    Since the low-end server CPUs use the AM3+ socket, we'll have to wait and see how those CPUs perform in a desktop system; hopefully AnandTech will review them for high-end desktop use.
  • Master_Sigma - Friday, February 10, 2012 - link

    "Desktop loyalists" gave AMD the finger a long time ago, when they got drunk off the bullshit Intel benchmarketing Kool-Aid. Enthusiasts decided that completely synthetic marketing aids that have nothing to do with real-world performance were all that mattered, and unfortunately AMD doesn't have the chops to tailor their CPUs around those the way Intel can.

    AMD has finally figured that out and is giving "loyalists" exactly what they want. Have fun with your Intel monopoly. You won.
  • arjuna1 - Thursday, February 02, 2012 - link

    They are just going to leave the desktop market out in the cold??

    S*** people, get ready for sky-high Intel CPU prices, development at a crawl, and working in a locked and limited environment.
  • bill4 - Thursday, February 02, 2012 - link

    You wish.
  • Impulses - Thursday, February 02, 2012 - link

    Prices might go up some, but Intel doesn't really gain much by squeezing a stagnant desktop market that's barely growing... They'd just give AMD an opening to jump back into it, Intel's smarter than that.

    AMD has been largely irrelevant on the desktop since the A64 and prices haven't really gone up for the mid and high end parts... They haven't gone down either but we've been getting new architectures from Intel faster than ever (comparing the last 5 years vs the previous 5-10).

    Besides, Intel still develops new architectures in a top-down fashion, introducing them on the desktop first and then optimizing them for mobile. Until that changes I'm not gonna cry that the sky is falling...
  • wumpus - Saturday, April 14, 2012 - link

    Look at the GPU market. They use fabs a generation behind Intel's, and chips cost almost as much as the last generation for the same performance. Moore's law may allow you to get twice the transistors on tomorrow's chip; just don't expect to afford it.

    You will see sky-high Intel prices and slow growth with or without AMD. Of course, I can only hope it will only be as bad as the last 5 years (GHz holding ... holding .... holding ...).

    Finally, why anyone would expect a public corporation to act like anything but a psychopath is beyond me. Simply assume they will slit your throat for a buck regardless of "what you did for them" and you won't be disappointed. Fanboy all you want, but they couldn't care less about you.
  • Schmich - Thursday, February 02, 2012 - link

    So the reports of AMD leaving Global Foundries were false?

    "Intel is doing something similar with Haswell."
    What a missed opportunity! You could have said "Intel is doing that as well with Haswell" =D
  • bleh0 - Thursday, February 02, 2012 - link

    It just isn't viable for AMD to attempt to compete with Intel in the consumer high-end desktop x86 market. More studies are showing day by day that the average consumer is moving towards a more mobile lifestyle, and AMD is doing what it can to move the company in that direction. Why should AMD waste the resources and manpower on high-end x86?

    Also, Intel has to compete against itself on pricing, and people can just vote with their wallets if the prices get too high.
  • another voice - Thursday, February 02, 2012 - link

    Intel also has to compete with itself on performance.

    Anyone looking at a highish-end CPU (i5 or better) already has a CPU.
    Intel can only charge a lot for its new CPUs if they are significantly faster than whatever the customer currently owns; if it's too much money for a small performance gain, that generation gets skipped and the customer waits for the next one, so Intel sells less.
  • Impulses - Thursday, February 02, 2012 - link

    Yup, even if AMD shifts focus away from the desktop (which they should've done years ago), Intel won't have free rein to squeeze the market; one slip still gives a more agile AMD room to jump back in... At worst we'll see Intel's tick-tock strategy shift to a slower pace, but Intel still derives much of the efficiency of its newer mobile parts from introducing newer, smaller processes, so...
  • chizow - Friday, February 03, 2012 - link

    Thank god someone else gets it. I've been hearing this "we need competition and AMD for cheap CPUs" meme repeated for the last 5 years. In the meantime, AMD still doesn't have a CPU that convincingly beats what Intel was offering then, and yet Intel continues to release newer, faster CPUs every year.

    But yes, it's just as you said: Intel is still competing with itself and needs to provide incentive for users to actually "upgrade" to a faster CPU. It's not like CPUs expire or even "die" after a few years.

    It's really very similar to other markets, like Apple with the iPhone or Samsung with the Galaxy. Or Madden or Modern Warfare. Even without significant competition, people will buy the latest and greatest, but there needs to be enough reason to buy the next iteration.
  • chizow - Friday, February 03, 2012 - link

    It makes you wonder, though, why AMD has no mobile strategy and no interest in even entertaining one.
  • Beenthere - Thursday, February 02, 2012 - link

    WHERE does anyone see AMD say they will not continue to deliver excellent desktop CPUs? WHERE? Show me WHERE you see this written or stated by an AMD exec? WHERE exactly did people come up with this nonsense idea?

    Let me guess when AMD said they were not going to compete directly with Intel everyone concluded that AMD was no longer going to produce high performance desktop CPUs. Well if you did then you thought WRONG. In addition to all of their current products they are also going to offer ULV products for tablets and other devices. These are additional revenue streams not a replacement for desktop CPU sales.

    PLEASE stop whining. Vishera will be out this Fall and there are more desktop CPUs to follow.
  • Risforrocket - Thursday, February 02, 2012 - link

    Well, if AMD will not continue to design and produce high-end CPUs, then I will probably stop using the CPUs they do make in my high-performance desktop. This is, I think, what people are trying to say, and they are right in saying just that.
  • arjuna1 - Thursday, February 02, 2012 - link

    Both the 2012 and 2013 client roadmaps show only Vishera as a performance desktop part. Other than that, 3rd-gen Bulldozer, "Steamroller", comes in 2013 as an APU part; unless its CPU half is a performance champ and its GPU half is high-end 7000-series, AMD will effectively, as Anand stated, be abandoning the high-end race.

    I fully understand that as a company they have no way to catch up with Intel and the focus is placed on other markets, but after all these years of supporting them, one can only feel butt-hurt to see them leaving us in the air.

    As it looks, the AM3+ platform is dead already. I don't know about you, but I don't have bottomless pockets, so I can't afford the luxury of investing in a new platform with its end of the road already in sight.
  • Beenthere - Thursday, February 02, 2012 - link

    There will be more desktop CPUs after Vishera. The AM3+ platform isn't dead because it will run Vishera. After that we'll see. AMD may have a few tricks up their sleeve as they are making Opterons to run on AM3+ sockets... You do recall Opteron 165's were fine OC'ers, right?

    Butt hurt? You must be kidding. I don't know how people reach such absurd conclusions about AMD not continuing with high end desktop CPUs. Yes mainstream will migrate to APUs - as I have said for over a year, but AMD will continue to offer top end desktop CPUs also.
  • arjuna1 - Thursday, February 02, 2012 - link

    Actually, their idea is: their high-end desktop parts are now their low-end server parts. That's what they did with BD, and we all know the end result, don't we??

    If Vishera is an APU, I highly doubt it will run on AM3+, which is fine anyway; I wasn't expecting the AM socket to last forever. But the new trend is obvious: AMD is no longer interested in the high-end desktop market, and I'm not interested in APUs, and won't be for a couple of years still.

    Sabresiberian summed it up pretty well:

    they no longer are interested in supplying what I want, so we must part ways.
  • mak360 - Friday, February 03, 2012 - link

    Dude, you're like a broken record with "we must part ways". Be gone, and don't let the door hit your ass on the way out lol.

    If you're a high-end user, you should be with Intel anyway, so what's your point?
  • A5 - Thursday, February 02, 2012 - link

    The fact that their "enthusiast" desktop CPU will be 32nm through the end of 2013 essentially signals that they are giving up on that market.

    The Opteron 165 came out 6 years ago - the fact that that is the chip you have to point to is pretty telling, no?
  • Beenthere - Thursday, February 02, 2012 - link

    Sorry but your beliefs are incorrect at this time.

    I don't have the Opteron 165. It was a reference to prior Opteron success for desktop use.

    The point was that AMD is making faster Opterons that use AM3+ sockets. Why would they do this when they already have C32 and G34 sockets?

    Look at the roadmap slide carefully with Piledriver, Steamroller and Excavator all bringing ~15% increase each year and fitting into AM3+ sockets. This ain't rocket science.

    28nm will not offer any big gains over 32nm, so that's not even an issue; there are diminishing returns with each step to smaller traces. AMD has said they ain't going to push the trace size; instead they will optimise the cores for better performance.
  • Impulses - Friday, February 03, 2012 - link

    I'll never understand blind brand loyalty, especially for CPUs, where there isn't much else to take into account besides performance per dollar... At least when it comes to other products the brand loyalists have additional arguments to stand on, like build quality, support, reliability, etc.

    I've had one AMD system, an A64 3000+ Winchester, from the only brief period in history where they were on their game and able to out-execute and out-perform Intel. Every other desktop's been Intel-based; they were almost always the smarter purchase. Although if I were gonna replace my current netbook with another sub-$500 system, I'd definitely opt for Brazos right now.

    Shifting focus is a smart move for AMD, who cares if a few enthusiasts get butt hurt and a couple others keep hopelessly calling for high end parts? The mobile market's growing faster, it's already larger than the high end desktop market, and enthusiasts & halo products don't drive sales like they used to.

    Plus, quite frankly, an Intel dominated mobile market is a heck of a lot scarier than an Intel dominated enthusiast market. Just look at what Intel's been pushing lately, ultrabooks are sexy but they're also a tool to drive the price of the average laptop up... Why do you think Atom hasn't seen a significant redesign by now? Except for Brazos, Intel has effectively been driving laptops upmarket while keeping Atom stagnant to prevent anyone else from eating into their profits.

    If Intel dominates the enthusiast market a few of us might suffer a little, but not much because there's little incentive for Intel to suck a shrinking market dry. If Intel dominates mobile, EVERYONE loses.
  • IlllI - Friday, February 03, 2012 - link

    All one has to do is take a look at ARM. They made a massive amount of money last year.
  • mak360 - Friday, February 03, 2012 - link


    I don't think I could have said it any better.
  • DanNeely - Friday, February 03, 2012 - link

    Because the dual/quad-socket C32/G34 parts are inherently more expensive. For single-socket boxes it's an unneeded additional expense.
  • B-Unit1701 - Friday, February 03, 2012 - link

    Are those the same kind of roadmaps that showed Phenom stomping the A64? Or Bulldozer way outclassing Phenom? This ain't rocket science.

    I bleed AMD green, but they haven't delivered on the speed improvements shown on roadmaps in 6+ years, so I have a real hard time buying it today.

    And you're kidding yourself if you think $300 Black Edition CPUs are part of the 'high end' of the market.
  • silverblue - Saturday, February 04, 2012 - link

    Adding 15% extra multithreaded performance each year would make for an excellent chip indeed; however, it's not exactly multithreading where AMD really needs to work its magic, but singlethreading, and let me tell you, adding 15% year on year will POSSIBLY bring them to SB-level singlethreaded IPC by, hmm, 2014? I don't need to remind you that Intel isn't going to sit still over this period.

    Bulldozer may be forward-thinking, but there's no arguing that it's a server CPU designed specifically for heavy, multiple workloads - workloads that a lot of people aren't going to see on the desktop. Still, I'd like to see a thorough benchmark of multiple programs running at the same time and how Bulldozer handles them. Unfortunately, it won't change the fact that the current architecture has very slow cache, requires very fast RAM to perform decently, isn't exactly the most frugal architecture out there, and bottlenecks graphics cards at roughly the same point that Phenom II did.
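    The 2014 guess is simple compound growth. A quick back-of-envelope sketch (the ~70% starting IPC ratio below is an illustrative assumption, not a measured figure):

    ```python
    # How many yearly +15% gains until a CPU starting at some fraction of a
    # rival's singlethreaded IPC catches up, assuming the rival stands still?
    def years_to_catch_up(start_ratio: float, annual_gain: float) -> int:
        years = 0
        ratio = start_ratio
        while ratio < 1.0:
            ratio *= 1.0 + annual_gain  # compound one yearly improvement
            years += 1
        return years

    # Hypothetical: Bulldozer at ~70% of Sandy Bridge singlethreaded IPC
    print(years_to_catch_up(0.70, 0.15))  # -> 3 (i.e. 2011 + 3 = 2014)
    ```

    Of course, Intel's IPC is a moving target, so any real crossover would land later than this standstill model suggests.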
  • Impulses - Thursday, February 02, 2012 - link

    " Thankfully, Rory isn't HPing the company. "

    Is that a new business catchphrase or just Anand's wit?
  • BitJunkie - Friday, February 03, 2012 - link

    It's so funny that tech companies are so far behind other organisations that have "engineering" at their core when it comes to execution. Just because tech is tech doesn't mean these companies can forget everyone else's lessons learned:

    1) Microsoft with Vista: it was only after their code turned into a complete pile of poorly engineered crap, all hacked together by a bunch of mavericks wanting their "I made this" moment, that they went back to the drawing board and actually started engineering their software and processes. This is not just about the process of coding but about the process of designing and engineering and how it is managed. It's about culture.

    2) Back in the tech-bubble/Vista days, everyone was claiming that you could be a project manager without domain experience. Sure, that's true - but not if you want rock-solid, fail-free, optimised execution. You need to apply project management theory and controls to the process, and have someone who can see 10 steps ahead and make quality decisions taking this stuff into account. Not someone with an over-inflated ego and one project cycle under their belt claiming to be god's gift to project management.

    Seems to me as though someone at AMD has sat down and realised that nobody is going to invest in their platform if they can't reliably expect the next iteration of the product to arrive on time and with good performance.

    They are adopting a strategy that allows them to execute in a reliable way, but are they going to sort out the process, the systems and the company culture?
  • haplo602 - Friday, February 03, 2012 - link

    Funny how AMD anticipated some of my changes in hardware preference. I moved to a socket F dual-Opteron board; in two years' time, I'll change it for another dual-Opteron board.

    The old one goes into my living room as an HTPC next to the TV, for the wife (Internet) and kids (low-end gaming).
  • R3MF - Friday, February 03, 2012 - link

    I am disappointed in their desktop roadmap.

    Ditching Sepang/Terramar means no on-die PCIe 3.0 controller, relying instead on an aging chipset stuck with off-die PCIe 2.0. It also means no triple-channel memory in the enthusiast space.

    And not only for 2012, but for much of 2013 as well!
  • R3MF - Friday, February 03, 2012 - link

    Steve from [H] thinks it's AM2, which I believe is the socket that high-end desktop will migrate to from AM3+ next year - so, a good upgrade path.

    But is he right?
    And am I right?
  • BPB - Friday, February 03, 2012 - link

    The simple fact of the matter is that the vast majority of folks today don't need anything near high-end CPUs. If you are gaming at 1920x1200 or lower, middle-of-the-road CPUs and GPUs do the trick fine. From what I've read, most folks are at 1920x1080 or lower, with many at 1680x1050 or lower. My 3-year-old AMD box does fine for almost all current games (1920x1200). It's only a couple that recently came out that have me considering a new video card. Even then I'll probably get a 6800-series from AMD, unless a cheaper 7000-series card comes out soon.
  • Yuriman - Sunday, February 05, 2012 - link

    Very true, but Intel beats AMD in price/performance AND power/performance, and the disparity is only going to grow if AMD keeps adding transistors on old processes. I'm glad AMD is ditching high-end CPUs; there's a lot of R&D money there that would be of great benefit elsewhere.

    It's definitely time for both companies to start building down. I want a netbook that doesn't suck, and since the PC gaming market is so stagnant, any gaming PCs I build will likely be small, quiet, and sip power.
  • rocketbuddha - Friday, February 03, 2012 - link

    Anand, did you get clarity from AMD as to what would be the non-SOI 28nm node to be in use?
    GF or TSMC?

    If Brazos 2.0 is going to be on 40nm, then that means AMD mistakenly bet that GF's 28nm would be ready in 2012. Else what is the use of introducing Brazos 2 on a 40nm node while TSMC's 28nm is already manufacturing Krait and soon TI's OMAP 5?
    So AMD can either trust that GF will get 28nm working in 2013, or use TSMC's 28nm (with which it has some familiarity from manufacturing discrete graphics chips). Last year there were some articles on ExtremeTech by Joel Hruska as to AMD going to TSMC for 28nm non-SOI APU manufacturing.

    If true, the Jaguar series will be 28nm TSMC HKMG, while for Kaveri (Trinity's successor) as well as BD2 and its successors AMD has no choice but to use GF's SOI processes.

    Assuming that Steamroller is BD2 and Piledriver is enhanced BD, it looks like even in 2013 AMD expects the best SHP SOI process available at GlobalFoundries to be 28nm. Man! And Intel is supposed to go to 22nm in the middle of this year. Even in 2013 AMD is not sure GF will have a 20nm SOI node ready. Sucks! Let us hope STMicro comes close enough to providing AMD with a choice of foundries.
  • sapi3n - Friday, February 03, 2012 - link

    I built a Phenom II X6 and AMD 5870 system w/ 16 gigs of RAM, an SSD, and several 1TB drives, which does crunch pretty well in After Effects. But I wasn't able to get an HDMI capture card to work (Blackmagic prefers Intel?) - so I don't have that capability - no Mercury Playback Engine support in Premiere (CUDA only), and no 64-bit FireWire support on the motherboard (the Asus M4A89TD Pro's integrated ports do not work with my camera). Very frustrating to have all these problems that wouldn't exist with an Intel/Nvidia build.
  • jabber - Friday, February 03, 2012 - link

So what you are saying is you didn't do your homework?

Not really the fault of AMD, is it?
  • sapi3n - Saturday, February 04, 2012 - link

There was no way to test Blackmagic without taking the plunge (by the way, their support is horrible; fair warning). Otherwise, I was under the impression that all cards meeting current OpenGL specs would help accelerate Adobe software, which is true, but Premiere's Mercury Playback Engine is CUDA only. If I'd known that, I would have purchased differently. Reply
  • sapi3n - Saturday, February 04, 2012 - link

What about buying a game that has PhysX support, with the features turned off for your $500 AMD card? You're fu$$ed. Reply
  • jabber - Monday, February 06, 2012 - link

    Once again...if you are spending more than $10 you do your homework to check it will do the job you want it for.

    That's just basic common sense.

    I guess you found that microwave oven you bought to clean the dishes didn't work out so well?

I bet if you had looked on certain IT forums you would have found plenty of folks who also didn't do their homework and were also struggling with AMD/Blackmagic.

    Really in this day and age there isn't an excuse for it.
  • wifiwolf - Friday, February 03, 2012 - link

That would be as bad as if I bought an i5 2400 and an Nvidia 560 card
but really wanted to render a lot (which it seems you want to do anyway) and use four monitors at the same time. In that case your current build would be best.

    Let's just say you can't have it all without paying twice as much.
  • Marburg U - Friday, February 03, 2012 - link

Do you really think AMD can compete with designers such as Samsung/Qualcomm/TI/Broadcom in the ARM universe? Preposterous.

For sure they won't pursue the SoC market... they haven't got a single piece of IP that would be of any use in the SoC niche. Nvidia and Intel, for example, have just spent billions on acquisitions of wireless solutions. AMD has no money to spend on anything.

High-end desktop/workstation/server... they will never be able to compete with Intel.

AMD has only two proficiencies: 1) ATI and 2) an x86 license. It's quite clear that the only thing they can do is low-cost APUs for consumer personal computers, and GPUs.
  • BitJunkie - Friday, February 03, 2012 - link

    I think you are missing the point. There is a big difference between the technology IN the product they are delivering and the WAY they end up at that product.

    The problem with most tech companies is that they get stuck on the technology and don't land a reliable way of delivering a product.

    I'm not going to put a single penny in AMD until I can see the following:

    1) They have a tight spec and set of objectives for each product and iteration.
    2) They set up a proper matrix organisation with competent technical and commercial managers and execute each product in a safe and reliable way.
    3) They set up a process of feeding back lessons learned from one project cycle to the next and implement a reliable continuous improvement programme.
    4) They start USING tech as a way of delivering their products, not as a centrepiece.

A silicon fabrication process node is not a product; the product is the design and the logic included in it, which just so happens to be delivered on a process.

    If they get that right, then they will be able to pick their technology and their process and be able to excel in any space.

    Is it really that hard to see?
  • Yahma - Friday, February 03, 2012 - link

    Now that competition in the enthusiast market is effectively dead, Intel has no reason to continue to innovate at the high end, let alone lower prices.

We will go back to the days of the 386/486 when Intel ruled and prices stayed high, while there was little improvement at the top end.

It's already starting to happen. Ivy Bridge is supposedly only a few percent faster on the CPU side than its predecessor, Sandy Bridge. Ivy Bridge is likely to be the last high-end iteration for a long time. Haswell will probably be delayed, cancelled, or neutered. Prices on anything faster than an Intel 2500K will be insane!

That's what no competition brings.
  • Impulses - Friday, February 03, 2012 - link

Umm, IB wasn't supposed to be significantly faster than SB, just more efficient... look up Intel's tick-tock strategy. They're at the same pace they've been at for a few years now. If Intel stops innovating on the high end, they just risk AMD catching up, and they'd cannibalize their mobile sales, into which every new design trickles down... so that's not gonna happen. They could squeeze the desktop market and raise prices while they keep bringing out new designs, but they'd be squeezing a shrinking market dry for short-term gain instead of using it as a proving ground for mobile. Reply
  • seapeople - Saturday, February 04, 2012 - link

    Intel's competitor in the desktop/laptop space hasn't been AMD for a while now, it's been the threat of smartphones/tablets or other small, low power devices. Intel has acknowledged this with their ultrabook push, and the logical course of action is to decrease the power consumption of their top line processors rather than increase performance.

    Think about it: if Ivy Bridge can perform the same as Sandy Bridge while using 25% less power, then the 6-hour battery life on that shiny ultrabook gets closer to 8 hours, OR the battery size is reduced and the ultrabook still has 6 hours of battery life but weighs 1.8 lbs instead of 2.0 lbs. Those are tangible product benefits that will make people more likely to buy Intel in today's world, whereas if Ivy Bridge instead kept the same power envelope and upped performance by 25%, you'd have the same ultrabook that's just 25% faster, and nobody would care because it's already light years faster than an iPad anyway.
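    The battery math here fits in one line: if performance is flat and average power drops by a fraction r, runtime scales by 1/(1 - r). A minimal sketch of that arithmetic (my simplification, assuming the whole platform draw scales with the quoted 25% CPU reduction, which in a real ultrabook it would not):

```python
# Battery-life scaling: same work done, lower average power -> longer runtime.
# Assumption (mine, not the commenter's): total platform power scales with the
# quoted 25% CPU power reduction; in reality the CPU is only one consumer.

def battery_life(hours_before: float, power_reduction: float) -> float:
    """New runtime when average power drops by `power_reduction` (0..1)."""
    return hours_before / (1.0 - power_reduction)

print(battery_life(6.0, 0.25))  # 6 h / 0.75 = 8.0 h, matching the comment
```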

    I predict in 5 years the newest Intel top-end processor will only be 10-20% faster than Sandy Bridge but will use one-third to one-half the power.
  • tipoo - Monday, February 06, 2012 - link

    "I predict in 5 years the newest Intel top-end processor will only be 10-20% faster than Sandy Bridge but will use one-third to one-half the power."

Err, no. I'd bet dollars to donuts their next "tock" will beat that 10-20% already. Just because power consumption is going down doesn't mean they can't improve performance too; we have more than enough proof of that. Five years is a lifetime in the processor industry: five years ago we had Core Solo and Core Duo and were just transitioning away from the Pentiums.
  • name99 - Friday, February 03, 2012 - link

    "AMD's Heterogenous Systems Architecture (HSA) plans to change that. AMD wants to see the creation of a virtual ISA that will be the backbone of a software layer that can schedule application workloads on any combination of underlying CPU/GPU hardware, regardless of the ISA of the hardware."

    Wasn't this program called Java 20 years ago? How did that turn out?
    Oh, right, because it now also targets GPUs, this time it's going to be different.
    Good luck guys, but I can't say I'm optimistic about your chances.
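    The quoted HSA pitch, i.e. a kernel expressed once against a virtual ISA with a runtime layer scheduling it onto whatever CPU/GPU hardware is present, can be caricatured in a few lines. This is purely a toy sketch; every name in it is invented for illustration and none of it is AMD's actual HSA runtime or API:

```python
# Toy sketch of a "virtual ISA + scheduling layer": the kernel is written once,
# and the runtime picks a backend at dispatch time. All names here are invented;
# the "gpu" backend is just a stand-in for a parallel executor.

from typing import Callable, Dict, List

class Runtime:
    def __init__(self) -> None:
        self.backends: Dict[str, Callable] = {}

    def register(self, name: str, executor: Callable) -> None:
        self.backends[name] = executor

    def dispatch(self, kernel: Callable, data: List[float], parallel: bool) -> List[float]:
        # Crude heuristic: data-parallel work goes to the "gpu" backend if one
        # is registered; everything else runs on the "cpu" backend.
        target = "gpu" if parallel and "gpu" in self.backends else "cpu"
        return self.backends[target](kernel, data)

rt = Runtime()
rt.register("cpu", lambda k, d: [k(x) for x in d])  # serial map
rt.register("gpu", lambda k, d: list(map(k, d)))    # stand-in for a parallel map

print(rt.dispatch(lambda x: x * 2, [1.0, 2.0, 3.0], parallel=True))  # [2.0, 4.0, 6.0]
```

    Whether the heuristic lives in a driver, a JIT, or the application is exactly the part that sank earlier attempts at this idea, which is the skepticism name99 is voicing.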
  • arjuna1 - Friday, February 03, 2012 - link

    Ah crap, meant "Kaveri", not Vishera. Reply
  • arjuna1 - Friday, February 03, 2012 - link

Yeah yeah, the butthurt one reporting. You can't understand because you weren't into brand loyalty; I did have an "arbitrary affinity" for AMD products, heck, I still do, but I certainly did get into building systems recently.

    I'm glad you want an APU in your next desktop build, I do not, if I did I would be looking to purchase a laptop or any other mobile device.

I want to emphasize again: I get that it's healthy for AMD to focus on where the money is, but doing so at the expense of their desktop offerings is forgetting about those of us who helped build and sustain AMD up to the Phenom II point.

There is no other projected desktop CPU besides Vishera, and that in itself is a low-end server part being sold as a high-end part; the 3rd-generation BD will go into Kaveri, which is an APU.

If that does not sound like forfeiting the game, I don't know what does. Even Anand acknowledges AMD is out of the high-end desktop CPU race.
  • B-Unit1701 - Friday, February 03, 2012 - link

AMD is not 'forgetting' those who got them here; they are finally ending their torture. For years we've been waiting with huge erections for the unveiling of AMD's newest, only to get let down with sad, sad performance. They are ending the cycle of pain and moving toward what they do well instead. I say good deal: make some money and improve. Reply
  • mhahnheuser - Friday, February 03, 2012 - link

What is missed in all this discussion is that Llano is a far, far better purchase out of the box than Sandy Bridge. Not just better: AMD first, daylight second, and Intel some distance back in the dust. It's only when you add a third-party GPU that SB even gets in the picture.

Why would AMD put all this effort and expense into chasing ultra-low-volume, outright single-core IPC performance and get peanuts for it, when it can get uber-prices for standalone GPUs, which it currently supplies to around 50% of Intel platforms? It's the GPU that provides the uber performance, not the CPU, and a 7970, 50 or 90 will make more than double the money for AMD compared to a Bulldozer CPU that manages to outperform an SB-E by 10 FPS with the same high-end GPU.

They're (AMD) in a pretty good space to pursue more lucrative market segments.
  • frozentundra123456 - Saturday, February 04, 2012 - link

I see no reason at all to purchase Llano on the desktop. Just get an Intel CPU for better performance and add a discrete $50 card if you want good graphics. If I were buying a laptop I would probably get Llano, but on the desktop it is mediocre at everything, and a discrete card is a much better solution. Reply
  • seapeople - Saturday, February 04, 2012 - link

That's the same thing as saying a 10-speed bicycle is a far, far better purchase out of the box than a Porsche 911. It's only once you add a third-party combustible hydrocarbon that the Porsche 911 becomes the better means of transport. Reply
  • mhahnheuser - Friday, February 03, 2012 - link

...and just for good measure: Intel, by chasing the gamer-enthusiast market, is actually doing AMD's work for them by driving applications toward increased GPU performance, which long term has got to be in AMD's interest given their future plans for more GPU utilisation. So theirs is not a dumb strategy: let Intel do most of the heavy lifting in the enthusiast segment. Reply
  • dicobalt - Saturday, February 04, 2012 - link

Intel will be rolling out 14nm in 2014. AMD really needs to get GlobalFoundries to speed up. It's really disappointing to hear AMD isn't interested in competing on process. I can't see AMD being able to compete. Reply
  • rocketbuddha - Saturday, February 04, 2012 - link

Worse is that AMD has no advantages, other than being fabless:
a) Its die size at similar nodes is bigger than the competition's,
b) which reduces yields.
c) There is no longer a tight coupling with the foundries. Going forward I see GF not necessarily accommodating AMD's concerns in the node race.

While I am disappointed, I think Rory Read made the right decision with the cards he has been dealt. AMD should be extra smart....
  • tipoo - Monday, February 06, 2012 - link

    I don't think AMD has that kind of pull at GloFo anymore. They got rid of most of the controlling share, if I recall. Reply
  • Rictorhell - Saturday, February 04, 2012 - link

Wow. I've been a computer fan for years now, and suddenly people care more about cellphones and smartphones than they do about actual computers. When did that happen? How is it that the internet has been popular for years now, yet very little has actually changed? The technology still feels slow to me, in many ways.

    We have Facebook and Twitter and Youtube, yeah, but are these really exciting and amazing advancements in technology? I've read and heard reports that there just aren't many people taking up computer science and computer programming and from what I have seen, I can kind of believe those reports.

    Unless, the majority of people that are really interested in working in the computer field are focusing mainly on videogames or IT.

I don't really have anything against smartphones, except the fact that to really get a good amount of use and enjoyment out of them, you have to sign a contract with Verizon or some other service provider. I love PDAs and I still use one; I just wish there were interest in developing modern PDAs that did not require monthly payments or contracts.

I like the idea of things like the iPod Touch and the Samsung Galaxy Player, but both of those pieces of hardware are basically inferior to their smartphone counterparts, created by the same companies.

What if I want a device with the same quality of screen as the iPhone 4 or the Samsung Galaxy Nexus, but I just don't want to sign a contract? If I am willing to pay for a quality gadget, I would hope that someone out there would be willing to build it, but that doesn't seem to be the case.

I kind of see tablets as a bigger and better version of the PDA, but I need something with a decent amount of storage space and processing power.

The upcoming Windows 8 tablets might be just what I am waiting for, but I am hoping for some actual innovation and creativity and not just a more mobile version of what I already have.
  • honsonic - Sunday, February 05, 2012 - link

Well, it's just sad that this may lead to Intel dominating a certain market and controlling the prices; hopefully future tech development won't be hindered by this. Reply
  • polyzp - Monday, February 06, 2012 - link

We will see awesome competition in the $600-800 range from AMD. 17W Trinity looks like it will be the 17W king: quad core with discrete-class graphics performance in an ultrathin form factor and crazy good battery life. It will be interesting to see if Ivy Bridge ULV graphics even comes close.
  • Onslaught2k3 - Tuesday, February 07, 2012 - link

AMD has until about late 2014 to turn things around before Intel is able to go fabless. Once Intel goes fabless, they have the opportunity either to cut costs or to make even more money by contracting fabrication to TSMC or another semiconductor firm at a cheaper rate. I honestly DO hope AMD makes a comeback with either Excavator or Piledriver. Bulldozer certainly dozed off for AMD.... Reply
  • wumpus - Saturday, April 14, 2012 - link

Highly competitive GPUs (they usually win everything from chipset-integrated to ultra-high end).
Uncompetitive CPUs (with the rare competitive Fusion chips; the others are often only worth the good deals you can get at Microcenter).
Zero-content marketing gibberish. I don't think anyone can compete with AMD's buzzwords per sentence, nor have a chance of competing with them on new "long term plans" that mean absolutely nothing.
