Media Encoding Performance

We'll start with our DivX test; this is the same benchmark we've been running for years, simply updated to DivX 6.7. The codec was set to Unconstrained quality, with the quality/performance slider at 5 and enhanced multi-threading enabled. The rest of the codec settings remained at their defaults.

DivX 6.7 w/ Xmpeg 5.0.3 - Video Encoding  

Despite the move to four cores and the improvements to the K8 architecture, the Phenom, even at 2.4GHz, is slower than the Core 2 Quad Q6600. Clock for clock, Intel has a 24% performance advantage here.
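A clock-for-clock advantage like this is simply the ratio of encode throughput at matched frequencies. The sketch below illustrates the arithmetic; the frame rates used are hypothetical placeholders, not our measured results.

```python
def clock_for_clock_advantage(fps_a: float, fps_b: float) -> float:
    """Percentage by which chip A outperforms chip B at the same clock speed."""
    return (fps_a / fps_b - 1.0) * 100.0

# Illustrative numbers only: if the Core 2 Quad encoded at 31 fps and the
# Phenom at 25 fps with both chips at the same clock, Intel's lead would be:
advantage = clock_for_clock_advantage(31.0, 25.0)
print(f"{advantage:.0f}%")  # → 24%
```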

AMD did make some progress, however; if we look back at some of our older numbers, the gap at 3.0GHz between dual-core chips was almost 38%.

The situation looks even bleaker once you take into account that the Phenom 9700 will most likely ship when Intel's Q9450 is also available, which extends Intel's lead to over 30%.

AMD has always been much more competitive at encoding using Microsoft's Windows Media Video codec:

Windows Media Encoder 9 - Video Encoding  

Windows Media Encoder performance is virtually identical between the Phenom and Core 2 Quad at the same clock speed. However, once you take price into account, Intel starts to pull ahead; the Q6600 is priced competitively with the Phenom 9600 and manages a 7% performance advantage over the 9600. It's not much, but the Q6600 is also cheaper.

Our final encoding test is an increasingly popular format: x264. We encode the same .avi file from our WME test, but this time using the x264 codec and AutoMKV. We didn't encode audio and left all program settings at their defaults; the only change we made was setting the target file size to 100MB (down from 500MB).
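Target-file-size encoding works by converting the size budget into an average bitrate for the encoder's rate control to aim at. A minimal sketch of that arithmetic, using our 100MB target with a hypothetical clip duration (the source clip's actual length isn't reproduced here):

```python
def target_bitrate_kbps(size_mb: float, duration_s: float,
                        audio_kbps: float = 0.0) -> float:
    """Average video bitrate (kbit/s) needed to hit a target file size."""
    total_kbits = size_mb * 8 * 1024  # MB -> kbit (1 MB = 1024 KB = 8192 kbit)
    return total_kbits / duration_s - audio_kbps

# 100MB target, hypothetical 5-minute clip, no audio track (as in our test):
print(f"{target_bitrate_kbps(100, 300):.0f} kbit/s")  # → 2731 kbit/s
```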

AutoMKV x264 - Video Encoding  

Much like our WME results, clock for clock AMD's Phenom actually equals the performance of the Core 2 Quad. Take price into account and Intel is still the right buy. It's tough to say what will happen when the Phenom 9700 and 9900 eventually launch, because they may be competing against Penryn by then; in this case that would be the Q9450, a more formidable opponent.


  • B166ER - Monday, November 19, 2007 - link

    Had to reply and clarify. The phrase refers to current setups in which quadcore gaming is not a primary reason to purchase said processors. Alan Wake, Crysis, and others, while being able to take advantage of quadcore setups, are not out yet, and I would guess that less than 3% of games out there now are quadcore capable, and scaling of such games is probably hardly optimized even if there were more supply. You speak in a future context in which these games will be available, even still the abundance of them will still be in wait.

    Nonetheless, Anand, that might be your best review in my book to date. It speaks honestly and depicts a very seemingly structureless company that only has time on its side to pull itself up from potential disaster. AMD has not shown too many positive strides as a company lately, and mindless spending on what should have been a direct "let em have at it" approach only shows what was speculated previously: the company has a vacuum in leadership that needs to be filled by capable hands. And we all know proper administration starts from the very top; Hector's stepping down will not be mourned by me. Dirk has his cup filled, but his past credentials speak highly and show him capable. I can only wish well.
    Phenom needs to be priced competitively, simple enough, and it needs higher quality yields for overclocking. It's amazing how they can stay just one step behind at almost every step currently. I hope the 7 series mobos bring about better competition vs Intel and Nvidia boards. We as a community need this to happen.
  • leexgx - Monday, November 19, 2007 - link

    there are about 2-3 games i think that use quad
  • wingless - Monday, November 19, 2007 - link

    That's such a mixed bag it makes me sick. Phenoms are mediocre at best at almost everything but somehow magically rape Intel in Crysis. I'll have nightmares. WTF is going on internally in those four cores to make this happen? I hope software manufacturers code well for AMD so they can shine. The pro-Intel software market is huge and that's where the fight is. Unfortunately it doesn't look good for AMD there either because programmers hate having to learn new code.
  • defter - Monday, November 19, 2007 - link

    Rape Intel in Crysis??

    Crysis was one of the few benchmarks where the fastest Phenom was faster than the slowest Core 2 Quad. Still, the Phenom's advantage was less than a percent.

    I think it's better to say that Crysis is the benchmark where Phenom doesn't utterly suck (it just sucks a lot).
  • eye smite - Tuesday, November 20, 2007 - link

    You really think those shining numbers are realistic from pre-production sample CPUs? I think you should all wait until sites have full production motherboards and CPUs and can give real data with all their tests; then you can decide it's a steaming sack of buffalo droppings. Until then, you'll just sound like a raucous bunch of squabbling crows.
  • JumpingJack - Monday, November 19, 2007 - link

    Ohhhh, here we go again with the 'It's not coded well for AMD' conspiracy theories.
  • wingless - Monday, November 19, 2007 - link

    AMD recently released new software libraries for these processors....
  • JumpingJack - Monday, November 19, 2007 - link

    Yeah, great for the FPU library which they already compiled into their PGI for the SPEC.ORG runs, which consequently are slowly getting the non-compliant branding pasted all over them.
  • TSS - Monday, November 19, 2007 - link

    "It turns out that gaming performance is really a mixed bag; there are a couple of benchmarks where AMD really falls behind (e.g. Half Life 2 and Unreal Tournament 3), while in other tests AMD is actually quite competitive (Oblivion & Crysis)."

    the UT series and Half Life 2 are both very CPU intensive, while Oblivion and especially Crysis are videocard killers. it's hard to say, but it sucks in games as well. you'd wanna see a difference in Crysis though, make it scale down to about 640x480; it's just too demanding on the graphics card to compare at 1024x768. in layman's terms, in HL2 the graphics card is usually picking his nose waiting for the CPU, so a stronger CPU will make a big difference. in Crysis especially it's exactly the other way around, so that's why scores are closer together.

    why is this true? in Half Life 2, there are 50 frames between the best and worst of the lineup. in UT3, there are 50 frames between the best and the worst of the lineup (meaning regardless of clockspeeds or architecture). the frames per second is also measured in the hundreds, even by such a new and graphics-intensive game as UT3 (though it should be noted, as far as i know, the beta demo did NOT ship with the highest resolution textures to keep file size down). now Crysis has about 9 frames per second difference between a 2.2GHz Phenom and a 2.66GHz Intel proc, and Oblivion manages about 16. same system, different game; it can only be concluded that the game is much more graphics card intensive, which shouldn't be hard to imagine since Crysis is well known to be the graphics card killer of the moment.... and Oblivion of the last generation (HDR and AA anybody?).

    AMD's 10-30% slower in every other test, and they are in gaming as well. i believe the difference would've shown more if they used an SLI or CrossFire solution, though i understand that's not possible with the chipsets and drivers existing at the moment.

  • MDme - Monday, November 19, 2007 - link

    Time to upgrade.....to the dark side.
