Power Consumption

Most impressive is AMD's ability to run six 45nm cores at the same power consumption as four 45nm cores. The Phenom II architecture in general does reasonably well at idle, but without power gating AMD can't compete with Intel's idle power levels.

Under load Intel also has the clear advantage.

[Chart: Idle Power Consumption]

Comments

  • Scali - Tuesday, April 27, 2010 - link

    I think the same argument goes for the server market...
    AMD cannot compete with Intel on performance, so they have to compete on price, meaning low profits.
    AMD *could* compete with Intel in the server market, but that was before Nehalem. Now Intel has some very strong offerings there as well...
    So it's the same story either way: should AMD really try to build such large high-end CPUs when they know they won't be able to compete on performance anyway? Competing on price is dangerous when your competitor is always a manufacturing node ahead.

    AMD *could* have had the Atom market, but they killed off their Geode line of CPUs just before Atom was introduced and went on to pretty much own the netbook market.
  • realitycheck - Tuesday, April 27, 2010 - link

    Everyone keeps mentioning that AMD needs a new architecture, that it can't remain competitive with the aging Phenom core, and that it's wasting time and money by just throwing more cores at the problem.

    I think you guys are missing the bigger picture. The Phenom II X6 is meant solely as a stop-gap, an effort to remain somewhat competitive and keep revenue flowing while they finish baking Bobcat and Bulldozer, due out next year. AMD knows the Phenom II isn't a show stopper, or even a headlining act; that's why these CPUs were released without any major fanfare. They aren't trying to fool anyone into thinking these chips are something they're not, they're simply trying to hold down the fort.

    I think that once complete, the Bobcat / Bulldozer line will be what puts AMD back on the competitive front. Just look at the code names they chose: the last time AMD got bold and used suggestively strong code names (Tack Hammer / Claw Hammer / Sledge Hammer), the end products (Turion 64 / Athlon 64 / Opteron) lived up to expectations and in fact proved to be game changers. I believe the same will hold true for Bobcat / Bulldozer.

    Another point I'd like to make: over the past 10 years AMD has matured quite a bit, growing from a completely non-threatening, low-end, sub-generic CPU builder, with seemingly no chance of ever amounting to much more in a tightly controlled Intel world, into a global player in the market and a competitive threat to Intel. They've come a long way from the AMD of the '90s. Most people think the whole Barcelona / Phenom fiasco, with the TLB erratum and delayed launch cycles, spelled the end of a competitive AMD. I think that fiasco is exactly what AMD needed; people learn the most from their failures, not from their victories. Just look at Intel: they had all but stagnated through the '90s and the first half of the 2000s. It wasn't until AMD came along, kicked them right square in their out-of-shape and bloated ass, and then outran them for a few years, that they got back in shape, so to speak, and started making competitive and compelling products again. So only time will tell if AMD has what it takes to be a competitor, or is just a fluke...

    As for this review, why did you use such old drivers for the AMD chipset and graphics card? I mean, seriously, the chipset drivers you used were developed a couple of years before the 890FX chipset you were testing was released. How about using 10.3 over 8.11 and 9.12? Or is there something special about the 8.11 and 9.12 drivers? Given the gap between your versions and the current ones, I'd say you're somewhat kneecapping the tests.
  • Scali - Wednesday, April 28, 2010 - link

    "I think you guys are missing the bigger picture, the Phenom II x6 is meant solely as a stop gap, an effort to remain somewhat competitive and keep revenue flowing while they finish baking Bobcat and Bulldozer, due out next year. "

    I don't think we're missing the bigger picture; we're pointing out exactly the same thing:
    The X6 is a stop-gap; what AMD *really* needs to become competitive again is Bobcat/Bulldozer. We're just saying that they need it NOW, rather than next year (and it has to be a success as well, not the fiasco that Phenom was... a new architecture is obviously no guarantee of success).
    AMD has been struggling ever since the Core was introduced in 2007(!). They REALLY need a good new architecture now; it is long overdue.
  • gruffi - Wednesday, April 28, 2010 - link

    I think you are missing a lot. What the hell are you talking about? Intel is throwing more transistors at the problem, not AMD. Compare the cores: Intel needs >50% more transistors per core. Nehalem needs 8 threads to reach full performance; Thuban needs only 6. Intel is just brute force. Already today AMD has a smarter and more efficient design, and the really interesting designs (Bulldozer, Llano, Bobcat) haven't even launched. The only problem for AMD is software-related, not hardware: the widely used compilers under Windows, MSC and ICC, optimize much more for Intel.

    And your knowledge about servers also seems to be outdated. Compare Magny-Cours and Gulftown: the Opteron is more powerful and more power-efficient, and it is still 45 nm. You really want to claim AMD isn't executing well? Then you must be joking, or an Intel flamer.

    The X6 and i7 are not for people who utilize one or two threads most of the time. Thuban is not meant to be a mainstream CPU; that's what Deneb, Propus, and Regor are for. Thuban is for highly threaded workloads, and there an X6 1090T ($295) can outperform an i7 860 ($284) and even an i7 870 ($562). If you ask me, the X6 1090T offers very good value for such scenarios, and the X6 1055T ($199) offers even more.
  • Scali - Thursday, April 29, 2010 - link

    Intel doesn't "need" more transistors; they are simply in a position where they can implement much more cache than AMD without running into latency problems or excessive power consumption.
    This 6-core, 125W TDP processor has trouble keeping up with the Core i7 8xx series, which is only 95W TDP.
    So how is AMD more energy-efficient? Only if you pick the comparison that flatters it. (A rough performance-per-watt framing is sketched below.)
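
    To make the performance-per-watt framing concrete, here is a minimal Python sketch. The TDPs are the ones cited in this thread; the benchmark scores are hypothetical placeholders, not measured results, and TDP is a design ceiling rather than measured draw, so a real comparison needs wall-socket numbers like the ones in the review above.

        # Hedged perf-per-watt comparison: TDPs are from the thread,
        # scores are made-up placeholders for illustration only.
        cpus = {
            "Phenom II X6 1090T (125W)": {"tdp_w": 125, "score": 100},  # placeholder
            "Core i7-860 (95W)":         {"tdp_w":  95, "score": 100},  # placeholder
        }

        for name, c in cpus.items():
            # Higher is better; with equal scores, the lower-TDP part wins.
            print(f"{name}: {c['score'] / c['tdp_w']:.2f} points per watt")

    With equal scores the 95W part comes out ahead, which is the commenter's point: matching performance within a lower power budget is what efficiency means here.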
  • magic box - Sunday, September 4, 2011 - link

    If you want to pay for Intel's advertising, then go ahead and pay more.

    It's like people paying for HP or Dell: a rip-off,

    when we all know the Asus or MSI box in the corner that nobody talks about is so much better.

    Use your head.

    The Apple iPad is an oversized iPod Touch that people love to buy only because of advertising, yet you can buy a Windows touch tablet at half the price.

    Intel has had bad times too; they have sold bad chips with cores that don't work, and so has AMD. They are both the same.

    But if you want to get sucked into Intel's rip-offs, then that's your own fault.

    It will be funny when I have the AMD 8-core, lock four of its cores, and then compare it to any Intel chip. By the time Intel makes anything better, people will not be paying $3000 for the same thing you can get at $400.
  • magic box - Sunday, September 4, 2011 - link

    You talk about an Intel quad-core being better than the AMD 6-core; you really need to look at locking and unlocking cores. Lock 2 of the 6 AMD cores and it will crap all over the Intel. (One software-only way to approximate that kind of test is sketched below.)
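
    As an aside, a rough software-only stand-in for "locking" cores is to pin a workload to a subset of them with CPU affinity. This is a minimal Linux-only Python sketch under stated assumptions: busy_work is a hypothetical placeholder benchmark, and pinning is not identical to disabling cores in the BIOS (shared caches and turbo behavior still differ).

        import os
        import time

        def busy_work(n=10_000_000):
            # Placeholder workload; swap in a real benchmark.
            total = 0
            for i in range(n):
                total += i * i
            return total

        # Restrict this process to cores 0-3, approximating a quad-core
        # on a six-core chip (Linux-only API).
        os.sched_setaffinity(0, {0, 1, 2, 3})

        start = time.perf_counter()
        busy_work()
        print(f"Pinned to cores {sorted(os.sched_getaffinity(0))}: "
              f"{time.perf_counter() - start:.2f}s elapsed")

    Note that busy_work is single-threaded; to see any effect from the number of available cores you would need a multi-threaded benchmark.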
  • strikeback03 - Tuesday, April 27, 2010 - link

    I would guess the hex-core is a lot more useful in servers, and that there is a consumer version of the chip because the bulk of the development is expected to be funded by server sales.
  • JGabriel - Tuesday, April 27, 2010 - link

    It might be better to think about it this way:

    Most office and browser applications already run fast enough, even on slower chips. A 67% performance difference will frequently amount to only a few milliseconds, or less, of execution time.

    As for gaming, casual and moderate gamers will be happy as long as they hit 30 frames per second at 1080p at moderate detail, which mid-range dual cores can handle in most games.

    So the big differentiator is multimedia: very intensive applications where clock speed or extra cores, depending on the app, can significantly reduce runtime. For instance, the hypothetical 67% performance improvement mentioned above turns a 90-minute encoding job into a 54-minute job (the arithmetic is sanity-checked in the sketch below). Or it shortens batch operations in Photoshop. Or it lets you run a scheduled anti-virus scan in the background without impacting your gaming or video performance.

    For many people, possibly most, more cores (or more threads) are more useful than clock speed, because they'll only *notice* the difference when running processor-intensive, heavily threaded apps or when heavily multi-tasking. And that is becoming increasingly common as multimedia becomes a primary home use for PCs.

    If that's not a factor in their typical usage, then users are probably better off with an i3, or - and here comes the extra core advantage again - an Athlon II X3, which performs similarly to the i3, but for $40-$50 less.

    As for profit margins, AMD just posted one of its best quarters ever, so the "more cores for the dollar" strategy, which provides more processing power where users are most likely to actually notice it, seems to be working to some extent. No, AMD won't overtake Intel, but it can be a profitable mainstream niche, and Intel is not likely to challenge them too seriously for the time being, given its current antitrust problems in the US and Europe.

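    As a quick sanity check of the encoding arithmetic above, here is a minimal Python sketch; the 90-minute job and the 67% figure come from the comment, everything else is illustration.

        # A task on a machine that is 67% faster finishes in old_time / 1.67.
        old_minutes = 90.0
        speedup = 1.67  # "67% performance improvement"

        new_minutes = old_minutes / speedup
        print(f"{old_minutes:.0f} min -> {new_minutes:.1f} min")  # ~53.9 min

    That rounds to the 54-minute figure in the comment: a wall-clock saving of roughly 36 minutes per encode, which is exactly the kind of workload where extra cores are noticeable.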
  • strikeback03 - Tuesday, April 27, 2010 - link

    Do AMD's numbers for the quarter include graphics sales? That side should have been very profitable for them, being practically the only game in town and able to sell above MSRP.
