Final Words

If you step back and look at it, the triple-core Phenom story isn't unexpected at all. In applications that benefit from quad-core, triple-core benefits too; in those that don't, we don't see much from the new Phenom X3. In video encoding and 3D rendering tasks triple-core does quite well, but quad-core does even better. Take this train of thought one step further and you come to a very interesting conclusion: the triple-core Phenom is a quick and dirty way for AMD to use Phenom to compete in the dual-core space.

AMD doesn't have the resources to spin a dual-core Phenom die, so what better way of repurposing the quad-core die (especially if one core is defective) than to make a Phenom chip with fewer than four cores? Sure, it's not the most efficient way to manufacture, but AMD doesn't have the luxury of producing a number of different Phenom dies at this point. The triple-core strategy makes perfect sense if you're AMD; the question is: does it make sense if you're an end user?

Let's start with the Phenom X3 8750: it's priced too closely to the X4 9750 to make sense. If you need more than two cores, spend the extra $20 and get a quad-core (or give up 200MHz and get a quad-core X4 9550 at the same price); if you don't need more than two cores, you're looking at the wrong CPU to begin with.

The Phenom X3 8650 performs at about the level of a 2.00GHz - 2.66GHz Core 2 Duo in many applications; the problem is that it needs to compete with a 3.00GHz Core 2 Duo to make economic sense. In many cases the 8650 is competitive, but with higher power consumption it's hard to call it a winner.

The Phenom X3 8450, on the other hand, is a little too slow for most applications; it's often no faster than the Athlon X2 5600+ despite higher IPC and a third core. AMD needs frequency: the X3s should start at 2.4GHz, and then we might be having a very different discussion, but right now the best AMD can muster is to hold on while competing with Intel.

For any sort of 3D rendering (or other application that scales well with four cores), AMD's triple-core CPUs can offer mostly competitive performance with Intel's equivalently priced dual-core CPUs. However, as we showed early on in this article, many applications don't scale well beyond two cores and thus in the rest of our tests AMD is competitive but can't clearly be recommended.
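The scaling pattern described above is just Amdahl's law at work. As a rough illustration (the parallel fractions below are made-up examples, not numbers from our benchmarks):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: the serial fraction of a workload caps scaling
    no matter how many cores you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Mostly parallel workload (think 3D rendering): extra cores keep paying off.
for n in (2, 3, 4):
    print(n, round(amdahl_speedup(0.95, n), 2))

# Half-serial workload: the third and fourth cores add very little.
for n in (2, 3, 4):
    print(n, round(amdahl_speedup(0.50, n), 2))
```

With 95% of the work parallelizable, a third core is worth roughly 43% over two; with only half parallelizable, it's worth about 12%, which is why so many of our tests barely move past dual-core.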

Now if we look at the platform, AMD actually has an advantage. The AMD 780G's integrated graphics are a far better solution for the casual gamer than what Intel offers with its G35, and on top of that, 780G offers full H.264/VC-1/MPEG-2 decode acceleration, making it a far better platform for watching Blu-ray movies. With the format war over and Blu-ray drives unbelievably affordable right now, this is a serious issue for Intel.

If you're building something with integrated graphics for use as a casual gaming box or HTPC, then your best bet is AMD despite the slower CPU. Intel's G45 chipset should resolve the HD movie playback issue by also accelerating H.264/VC-1/MPEG-2, and its faster 3D core should alleviate some of the integrated graphics gaming issues, but the platform isn't due out until later in Q2, so until then there's very little choice.

The balance here is very interesting: Intel has CPU superiority with platform deficiency, and AMD has platform superiority with a serious CPU deficiency. The problem is that, in theory, G45 will fix a major issue with Intel's platforms but what will AMD do for its CPUs?

  • JarredWalton - Wednesday, April 23, 2008 - link

    I've seen nothing to suggest a faster HyperTransport bus would help AMD much. You need to compare at the same CPU speed; if you raise the HT bus to 250 MHz that represents a 25% overclock of the CPU as well, so of course it helps performance a lot. Try comparing:

    Athlon X2 4600+ 2.4GHz
    Run at 200 HTT and 12X CPU vs. 240 HTT and 10X CPU

    Athlon X2 4800+ 2.5GHz
    Run at 200 HTT and 12.5X CPU vs. 250 HTT and 10X CPU
    (Note: the 12.5X multiplier vs. 10X may have an impact - half multipliers may not perform optimally.)

    Athlon X2 5000+ 2.6GHz
    Run at 200 HTT and 13X CPU vs. 260 HTT and 10X CPU

    Now, the one thing you'll also have to account for is memory performance. At default settings (i.e. DDR2-800), you get different true memory speeds. The 12X CPU will end up at a true DDR2-800; the 12.5X will end up at DDR2-714 (CPU/7 yields 357MHz base memory speed); the 13X will result in DDR2-742 (again, CPU/7 yields 371 MHz base memory speed). For the "overclocked HT bus" setups, you'll need to select the same memory dividers to get apples-to-apples comparisons, which depending on motherboard may not be possible.

    Unless you can do all of the above, you cannot actually make any claims that HyperTransport bus speeds are the limiting factor. I imagine you may see a small performance boost from a faster HT bus with everything else staying the same, but I doubt it will be more than ~3% (if that). HT bus only communicates with the Northbridge (chipset), and the amount of traffic going through that link is not all that high. Remember, on Intel that link to the chipset also has to handle memory traffic; not so on AMD platforms.
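    The divider arithmetic works out as follows (a minimal sketch, assuming the K8 rule that the memory divisor is ceil(CPU clock / target memory clock); the function name is illustrative):

```python
import math

def ddr2_effective_speed(cpu_mhz, target_ddr2=800):
    # K8 derives the memory clock from the CPU clock via an integer
    # divisor: divisor = ceil(cpu_clock / target_memory_clock).
    target_mem = target_ddr2 / 2          # DDR2-800 -> 400 MHz base clock
    divisor = math.ceil(cpu_mhz / target_mem)
    base = cpu_mhz / divisor              # true memory base clock in MHz
    return divisor, base, base * 2        # divisor, base MHz, effective DDR2 rating

# 2.4 GHz (12x):   divisor 6 -> true DDR2-800
# 2.5 GHz (12.5x): divisor 7 -> ~357 MHz base -> DDR2-714
# 2.6 GHz (13x):   divisor 7 -> ~371 MHz base -> DDR2-742
```

    Matching memory dividers between the stock and overclocked-HTT configurations is what keeps these true memory speeds (and hence the comparison) apples-to-apples.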
  • ghitz - Wednesday, April 23, 2008 - link

    The e8400 performance/power usage is outstanding and will be great value once the G45 boards trickle in. I can't wait for those G45s!
  • ap90033 - Wednesday, April 23, 2008 - link

    So AMD STILL hasn't caught up. Thanks. Good to know. Not that I'm surprised....
  • natebsi - Wednesday, April 23, 2008 - link

    The bottom line is: AMD's newest CPUs are bested in nearly every single benchmark by an Intel CPU that's been out for, what, a year?

    I have no love/hate relationship with either Intel or AMD, but that's just sad. I predict many more losing quarters for them, though I don't know how many more they can take...
  • Griswold - Thursday, April 24, 2008 - link

    Thanks for that null-posting.
  • najames - Wednesday, April 23, 2008 - link

    As a long-time AMD-only user, I just bought an Intel Q6600 on impulse from Frys.com for only $180. I was looking at a 780G solution and thought, I'll get the Intel quad and a similar Intel-based board for doing video processing work. Oops, I found out the only current Intel G35 mATX board is the ONE from Asus. Huge selection to choose from, huh?

    I'll either sell/return the unopened CPU or buy a P35 board and a graphics card. Or I could deal with a slightly slower AMD 9550 CPU and a better platform instead; tough choice.
  • strikeback03 - Wednesday, April 23, 2008 - link

    I needed parts for a new system for the lab last week, and I went with non-integrated graphics and an add-on card. Integrated graphics would have been fine for the application, but when the board plus card cost less than the ASUS G35 board (and are full-size ATX as well, which is useful), the decision wasn't too hard.
  • Staples - Wednesday, April 23, 2008 - link

    Intel graphics have always been terrible. AMD definitely has the advantage for integrated graphics, and even though their CPUs can not compete, I still find myself considering one just for the graphics options. I am glad that this review points it out, bringing to light that Intel graphics are just not acceptable. Whether Intel will change is a big unknown, probably not.

    I find the added emphasis on power consumption over the last year a great thing. With the price of energy these days, it is something I factor into my purchases. SSE4 and lower power consumption are the reasons I am holding out for a Q9450. Hopefully by the time it actually goes into mass production (hopefully in the next two months), a decent integrated option will be out for the platform.
  • 0roo0roo - Thursday, April 24, 2008 - link

    Terrible? I used Intel 950 integrated graphics with some 1080p content; it decoded just fine with an E2200.
  • derek85 - Thursday, April 24, 2008 - link

    Terrible? Yes, terrible. Besides the lame hardware, they can't even write proper drivers; see how many rendering problems they have in games.
