First Tunisia, then Tahoe?

As a slightly off-topic but important sidenote, I thought it would be appropriate to let everyone know how AMD wanted this review to happen, and how certain folks within AMD were champions for the right cause and made it actually happen.

AMD knew it wouldn't be able to trounce Core 2 with Phenom, especially not at 2.3GHz, so it wanted to control the benchmarking that was done on Phenom. For the first time in as long as I can remember, AMD wanted all benchmarking on Phenom to be done at a location in Tahoe, on AMD's dime of course. AMD would fly us out there, we would spend a couple of days with a pre-configured system, and we'd head home to write our stories.

Now I championed this sort of early access to Phenom months ago. I've visited AMD alone three times this year, primarily to talk about Phenom, and each time I left without being able to report so much as a single benchmark to you all (everyone remembers those articles, right?). I tried and tried to get AMD to part with some early Phenom data, because it was losing the confidence of its fan base, and that's a sad thing to see for a company that really took care of this community when we needed it most.

After Tahoe, AMD would eventually sample Phenom parts so we could test in our own labs, but there was no word on exactly when that would be. Chances are you would've seen a handful of numbers here today if we had gone to Tahoe, with a full review of the chip hitting sometime in December.

Needless to say, I wasn't happy. I refused to go to Tahoe.

Don't get me wrong, a free trip to Tahoe is a wonderful thing, but Phenom deserved better. It deserved dedicated testing; it deserved a thorough review, not a quick glance over a couple of days. And I had a feeling that you all would agree. The time for AMD-sanctioned testing expired months ago; if Phenom was launching this week, we were going to have a proper review of it.

These days, AMD seems to be learning a little too much from the ATI way of doing things. If AMD had its way, today's Phenom review would have been done from beautiful Lake Tahoe, on a system that AMD built, running at a frequency that isn't launching. Now there's nothing wrong with allowing us to preview Phenom under closed conditions; after all, Intel does it. But that's simply not acceptable for a review of a product that's four days away from being in stores. You all want to see a thorough review of Phenom, not some half-assed preview, definitely not after waiting this long for it.

An AMD rep, familiar with the Tahoe trip, asked me, somewhat surprised, "What, Intel doesn't work like this?"

Sorry to say, Intel doesn't. Today Intel let us preview the Core 2 Extreme QX9770 processor; do you want to know how they did it? The FedEx guy dropped off a chip. No flights to Tahoe, no hotel rooms, no expenses at all. Don't get me wrong, I felt like an idiot turning down a free trip to Tahoe, but it was for AMD's own good. We've all seen the financials; these aren't times to be wasting money on silly trips around the country. It costs less than $30 to ship a CPU, and that's all we need.

I get the point of Tahoe: it's to control the benchmarking, making sure we wouldn't be comparing a 2.4GHz Phenom to a 3.0GHz Penryn, but honestly folks - would we really do that to begin with? And I get the idea of wining and dining the press, in hopes that better relationships lead to more pleasant reviews - but this isn't a product to toy with. We're here to do our job, and that is to review the product that will carry AMD for the next twelve months; honestly, we can't do that from some lodge somewhere away from our testbeds.

This isn't the first time AMD has heard this from me, and there are many within AMD who feel the same way. The reason you're finding this rant in here today is because I am concerned for the future of the company. Competition is a good thing, we need to keep it around, but AMD needs to learn from its competitors. Intel and NVIDIA don't try things like this; business is always first with them, frivolous pleasures come next.

To AMD: if you want to be Intel, start acting like it.



Comments

  • hoelder - Wednesday, December 12, 2007 - link

    I remember having to save food from my own mouth to buy the first ill-conceived Pentium, or the 486. How Intel set those prices evades me; they maximized their profits with no competition. Yes, I know Intel produces a faster chip, if faster is the right word. However, when it comes down to competition and the markets, AMD is the strategically right choice. Unless, of course, you think the Walmart buy-cheap-toys-from-China idea is your consumer ideology, in which case you should stick with Intel - that really is like Nixon going to China - or you believe consumer choices should keep competition open.
  • mpholland - Friday, November 30, 2007 - link

    Maybe AMD just had to release these early to make a little capital this year. Personally, I think it is good that AMD let people know SOMETHING is close. I am hoping that with a little tweaking, AMD and some motherboard partners can get performance up a little and be competitive with Intel sometime in 2008. I have seen simple driver tweaks work wonders on other hardware; maybe a little more time can help.
  • Clauzii - Thursday, November 22, 2007 - link

    Reading through the benchmarks, where a single core (of a quad) is compared to all four cores running, it looks like an 8-core version of the "Phenom" would scale even better than the four-core one. Barcelona, btw., also shows this behavior: one core is just a little faster than the Opteron core, but the four running in tandem scale very well.

    Because of the not-so-good MHz numbers, it might not take AMD to new glorious heights for now, but when (soon?) 8 cores arrive, AMD MIGHT be able to do better, because of their better core-scaling factor.

    Looks like they HAVE to do something like gluing together two Phenoms to at least do 8 cores before Intel, and before Intel gets TOO fast for AMD's liking and the ability to catch up slips away.

    Unless AMD is already working on a true 8-core design, which would probably scale even better than a glued one. And by incorporating knowledge on multithreading from ATI's GPU design work, they might be able to do something even more serious in the future.

    But for now, Intel is still in the lead.
  • praeses - Thursday, November 22, 2007 - link

    I really wish AMD had never gone down the road of L3 cache for these processors. As the majority of applications still used today in the desktop/workstation market are only going to use one or two cores, the shared cache itself is probably more of a hindrance than a help.

    Personally, I would have liked to see 128KB of L1 and 1MB of L2 for the higher models, and simply 512KB of L2 for the lower models. The tweaks to the individual cores would almost enable them to catch up clock for clock with Intel without this L3 cache latency getting in the way, and that way powering down the individual cores would also power down all the cache they would be using. I realize that routing L2 cache in larger quantities is trickier and consumes more die space than L3, but they should also see significant cost savings on parts produced without L3 and be able to compete better in the $180 or so market.

    Granted, if a single application were single-threaded and the only one taxing the system, taking advantage of all the L3 at once while the other 3 cores were sleeping, it would be at a slight disadvantage, but that's an extreme situation.
  • WorkIsAFullTimeHobby - Thursday, November 22, 2007 - link

    I think the AnandTech power consumption graphs are way off. Phenom power consumption should be compared to Penryn power consumption plus northbridge power consumption. Does anybody see any mention of this fact, and do the graphs properly account for this?

    Phenom effectively has the north-side memory controller bus built in. After looking at the architecture, now I know why Intel is always trying to increase its bus speed: they only have one external bus to feed and communicate between all the CPUs.
  • redzo - Wednesday, November 21, 2007 - link

    After all of this is said, I have only a few words for you:
    - AMD's fate is in the hands of its ability to cut prices even more.
    - Cheaper by 13% is just not enough!!! Unless they cut prices even more, they'll lose more customers!
    - Of course they can trust in their marketing strategy (true quad core), but not for long: it's PERFORMANCE and PRICE/PERFORMANCE that matter, not STYLE or FASHION.
  • jwizmo - Wednesday, November 21, 2007 - link

    One thing to keep in mind here is that the quad-core AMD chip is a 64-bit part, like its predecessor. To perform these tests without giving it a chance in 64-bit mode is not a fair comparison. I would like to see some subset of these same tests with Windows Vista Ultimate 64-bit; the drivers should be out there for that configuration. Let's see how it performs in that mode - I don't think there will be much competition from Intel then. When everything finally goes 64-bit, AMD will have the advantage, since they got in on the ground floor.
  • newuser2 - Wednesday, November 21, 2007 - link

    I was checking the holiday buyer's guide and then noticed that in the tests you used a $335 motherboard and $580 DDR3 memory for Intel (I think I must be wrong here), while for AMD you used a $180 board and $121 DDR2 memory. That would be comparing a $915 platform vs. a $301 one, am I correct? Don't misunderstand me; I just think I didn't understand what you used to test what.
  • strikeback03 - Wednesday, November 21, 2007 - link

    I wondered too, but I guess they figured the newest AMD chipset was most appropriate for the test. Seeing as availability there is shady as well, there aren't many options. Not sure why they didn't choose a cheaper Intel board and DDR2 (since there's no DDR3 for AMD), but at those speeds their tests don't seem to show much extra performance for DDR3 over DDR2 anyway.
