General Use Performance

We'll start out our tests with the 7-zip benchmark, a CPU-bound, multithreaded integer workload that exercises 7-zip's compression/decompression algorithms with the IO subsystem removed from the equation:

7-zip Benchmark

7-zip is almost the perfect scenario for AMD's Vishera: a heavily threaded integer benchmark. Here the FX-8350 is able to outperform the Core i7 3770K. In fact, all of the Vishera parts are able to outperform their price-competitive Ivy Bridge alternatives. The old Core i7 920 does pretty well here thanks to its 8-thread architecture.

Next up is Mozilla's Kraken JavaScript benchmark. This test includes some forward-looking JavaScript code designed to showcase the performance of future rich web applications on today's software and hardware. We run the test under IE10:

Windows 8 - Mozilla Kraken Javascript Benchmark

If the 7-zip benchmark is the best case scenario for AMD, Mozilla's Kraken test is among the worst. The test is largely dominated by single-threaded performance, and the FX-8350 is significantly slower than a Core i3 3220. Only Intel's old Core i7 920 is slower here, and that's a chip that debuted in 2008.

Although not the best indication of overall system performance, the SYSMark 2012 suite does give us a good look at lighter workloads than we're used to testing.

SYSMark 2012 - Overall

Overall performance according to SYSMark 2012 is within striking distance of Ivy Bridge, at least for the FX-8350. AMD seems to have equalled the performance of last year's 2500K, and is able to deliver almost 90% of the performance of the 3770K. It's not a win by any means, but AMD is inching closer.

SYSMark 2012 - Office Productivity

SYSMark 2012 - Media Creation

SYSMark 2012 - Web Development

SYSMark 2012 - Data/Financial Analysis

SYSMark 2012 - 3D Modeling

SYSMark 2012 - System Management

Par2 File Recovery Performance

Par2 is an application used for reconstructing downloaded archives. It can generate parity data from a given archive and later use it to recover the archive.

Chuchusoft took the source code of par2cmdline 0.4 and parallelized it using Intel’s Threading Building Blocks 2.1. The result is a version of par2cmdline that can spawn multiple threads to repair par2 archives. For this test we took a 708MB archive, corrupted nearly 60MB of it, and used the multithreaded par2cmdline to recover it. The scores reported are the repair and recover time in seconds.
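The recovery principle behind par2 can be illustrated with a much-simplified sketch. Real PAR2 uses Reed-Solomon coding over GF(2^16) and can rebuild many missing blocks from many parity blocks; the single XOR parity block below, which can rebuild exactly one missing block, only shows the basic idea and is not the par2cmdline algorithm:

```python
def make_parity(blocks):
    """XOR all equal-sized data blocks together into one parity block."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def recover(blocks_with_gap, parity):
    """Rebuild the single missing block (marked None) by XORing the
    surviving blocks back out of the parity block."""
    missing = bytearray(parity)
    for block in blocks_with_gap:
        if block is not None:
            for i, b in enumerate(block):
                missing[i] ^= b
    return bytes(missing)
```

Because each output block of the real Reed-Solomon computation is independent, the repair work parallelizes naturally across threads, which is what the Chuchusoft port exploits.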

Par2 - Multi-Threaded par2cmdline 0.4

Crank up the threads and once again you see Vishera do quite well. The FX-8350 outpaces the Core i5 3570, and the FX-4300 falls only slightly behind the Core i3 3220.

Excel Math Performance

Microsoft Excel 2007 SP1 - Monte Carlo Simulation
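For readers unfamiliar with the workload type: a Monte Carlo simulation estimates a quantity by running many independent random trials and aggregating the results. The benchmark's actual financial workbook isn't reproduced here; a minimal, unrelated example of the same technique estimates pi by random sampling:

```python
import random

def monte_carlo_pi(samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points uniformly in the unit square and
    counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples
```

Because each trial is independent, this style of workload scales well with both clock speed and, when the host application parallelizes it, core count.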

241 Comments

  • CeriseCogburn - Tuesday, October 30, 2012 - link

    Funny how the same type of thing could be said in the video card wars, but all those amd fanboys won't say it there !

    Isn't that strange, how the rules change, all for poor little crappy amd the loser, in any and every direction possible, even in opposite directions, so long as it fits the current crap hand up amd needs to "get there" since it's never "beenthere". LOL
    Reply
  • whatthehey - Tuesday, October 23, 2012 - link

    We've heard all of this before, and while much of what you say is true, and ignoring the idiotic "Windoze" comments not to mention the tirade on "evil Intel", Anand sums it up quite clearly:

    Vishera performance isn't terrible but it's not great either. It can beat Intel in a few specific workloads (which very few people will ever run consistently), but in common workloads (lightly threaded) it falls behind by a large margin. All of this would be fine, were it not for the fact that Vishera basically sucks down a lot of power in comparison to Ivy Bridge and Sandy Bridge. Yes, that's right: even at 32nm with Sandy Bridge, Intel beats Vishera hands down.

    If we assume Anand's AMD platform is a bit heavy on power use by 15W (which seems kind as it's probably more like 5-10W extra at most), then we have idle power slightly in Intel's favor but load power favors Intel by 80W. 80W in this case is 80% more power than the Intel platform, which means AMD is basically using a lot more energy just to keep up (and the Sandy Bridge i5-2500K uses about 70W less).

    So go ahead and "save" all that money with your performance-for-dollar champion where you spend $200 on the CPU, $125 on the motherboard (because you still need a good motherboard, not some piece of crap), coming to $325 total for the core platform. Intel i5-3570K goes for $220 most of the time (e.g. Amazon), but you can snag it for just $190 (plus $10 shipping) from MicroCenter right now. As for motherboards, a decent Z77 motherboard will also set you back around $125.

    So if we go with a higher class Intel motherboard, pay Newegg pricing on all parts, and go with a cheaper (lower class) AMD motherboard, we're basically talking $220 for the FX-8350 (price gouging by Newegg), $90 for a mediocre Biostar 970 chipset motherboard, and a total of $310. If we go Intel it's $230 for the i5-3570K, and let's go nuts and get the $150 Gigabyte board, bringing us to $380. You save $70 in that case (which is already seriously biased since we're talking high-end Gigabyte vs. mainstream Biostar).

    Now, let's just go with power use of 60W Intel vs. 70W AMD, and if you never push the CPUs you only would spend about $8.75 extra per year leaving the systems on 24/7. Turn them off most of the day (8 hours per day use) and we're at less than $3 difference in power costs per year. Okay, fine, but why get a $200+ CPU if you're going to be idle and power off 2/3 of the day?

    Let's say you're an enthusiast (which Beenthere obviously tries to be, even with the heavy AMD bias), so you're playing games, downloading files, and doing other complex stuff where your PC is on all the time. Hell, maybe you're even running Linux with a server on the system, so it's both loaded moderately to heavily and powered on 24/7! That's awesome, because now the AMD system uses 80W more power per day, which comes out to $70 in additional power costs per year. Oops. All of your "best performance-for-the-dollar" make believe talk goes out the window.
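Taking the comment's wattage figures at face value, the dollar amounts work out at an assumed electricity rate of $0.10/kWh (the thread never states a rate, so that figure is an assumption here):

```python
def annual_cost_usd(delta_watts: float, hours_per_day: float,
                    usd_per_kwh: float = 0.10) -> float:
    """Extra electricity cost per year for a given extra power draw.
    The default $0.10/kWh rate is an assumed figure, not from the thread."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# 10W extra, on 24/7:  87.6 kWh/year  -> ~$8.76 (the comment's "$8.75")
# 10W extra, 8h/day:   29.2 kWh/year  -> ~$2.92 (the comment's "less than $3")
# 80W extra, on 24/7: 700.8 kWh/year  -> ~$70.08 (the comment's "$70")
```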

    Even the areas where AMD leads (e.g. x264), they do so by a small to moderate margin but use almost twice as much power. x264 is 26% faster on the FX-8350 compared to i5-3570K, but if you keep your system for even two years you could buy the i7-3770K (FX is only 3% faster in that case) and you'd come out ahead in terms of overall cost.

    The only reason to get the AMD platform is if you run a specific workload where AMD is faster (e.g. x264), or if you're going budget and buying the FX-4300 and you don't need performance. Or if you're a bleeding heart liberal with some missing brain cells that thinks that support one gigantic corporation (AMD) makes you a good person while supporting another even more gigantic corporation (Intel) makes you bad. Let's not use products from any of the largest corporations in the world in that case, because every one of them is "evil and law violating" to some extent. Personally, I'm going to continue shopping at Walmart and using Intel CPUs until/unless something clearly better comes along.
    Reply
  • DarkXale - Tuesday, October 23, 2012 - link

    I would also add in the cost of getting a 100W more powerful power supply. (At least)

    The cost of the better cooling (either via better/more fans or better case), And the 'cost' of having a system with a higher noise profile.
    Reply
  • Finally - Tuesday, October 23, 2012 - link

    That talk suffers from the same inability to consider any other viewpoint but that of the hardware fetishist.

    If you are fapping to benchmarks in your free time you are the 1%.
    The other 99% couldn't care less which company produced their CPU, GPU or whatever is working the "magic" inside their PC.
    Reply
  • dananski - Tuesday, October 23, 2012 - link

    I agree with you but stopped reading at "uses 80W more power per day" because you have ruined your trustworthiness with unit fail.
    Reply
  • CeriseCogburn - Tuesday, October 30, 2012 - link

    Hey idiot, he got everything correct except saying 80W more every second of the day, and suddenly, you the brilliant critic, no doubt, discount everything else.
    Well guess what genius - if you can detect an error, and that's all you got, HE IS LARGELY CORRECT, AND EVEN CORRECT ON THE POINT concerning the unit error you criticized.
    So who the gigantic FOOL is that completely ruined their own credibility by being such a moronic freaking idiot parrot, that no one should pay attention to ?
    THAT WOULD BE YOU, DUMB DUMB !

    Here's a news flash for all you skum sucking doofuses : Just because someone gets some minor grammatical or speech perfection issue written improperly, THEY DON'T LOSE A DAMN THING AND CERTAINLY NOT CREDIBILITY WHEN YOU FRIKKIN RETARDS CANNOT PROVE A SINGLE POINT OF THE MANY MADE INCORRECT !

    It really would be nice if you babbling idiots stopped doing it. But you do it because it's stupid, it's irritating, it's incorrect, and you've seen a hundred other jerk offs like yourself pull that crap, and you just cannot resist, because that's all you've got, right ?

    LOL - now you may complain about caps.
    Reply
  • Siana - Thursday, October 25, 2012 - link

    It looks like the extra 10W in the idle test could be largely or solely due to the mainboard. There is no clear evidence to what extent, if at all, the new AMD chip draws more power than Intel at idle.

    A high end CPU and low utilization (mostly idle time) is in fact a very useful and common case. For example, as a software developer, i spend most time reading and writing code (idle), or testing the software (utilization: 15-30% CPU, effectively two cores tops). However, in between, software needs to be compiled, and this is unproductive time which i'd like to keep as short as possible, so i am inclined to choose a high-end CPU. For the GCC compiler on Linux, the new AMD platform beats any i5 and a Sandy Bridge i7, but is a bit behind an Ivy Bridge i7.

    Same with say a person who does video editing, they will have a lot of low-utilization time too just because there's no batch job their system could perform most of the time. The CPU isn't gonna be the limiting factor while editing, but when doing a batch job, it's usually h264 export, they may also have an advantage from AMD.

    In fact every task i can think of, 3D production, image editing, sound and music production, etc, i just cannot think of a task which has average CPU utilization of more than 50%, so i think your figure of 80Wh/day disadvantage for AMD is pretty much unobtainable.

    And oh, no one in their right mind runs an internet-facing server as their desktop computer, for a variety of good reasons, so while Linux is easy to use as a server even at home, it ends up a limited-scope, local server, and again, the utilization will be very low. However, you are much less likely to be bothered by the services you're providing due to the sheer number of integer cores. In case you're wondering, in order to saturate a well-managed server running Linux based on up to date desktop components, no connection you can get at home will be sufficient, so it makes sense to co-locate your server at a datacenter or rent theirs. Datacenters go to great lengths to not be connected to a single point, which in your case is your ISP, but to have low-latency connections to many Internet nodes, in order to enable the servers to be used efficiently.

    As for people who don't need a high end system, AMD offers better on-die graphics accelerator and at the lower end, the power consumption difference isn't gonna be big in absolute terms.

    And oh, "downloading files" doesn't count as "complex stuff", it's a very low CPU utilization task, though i don't think this changes much apropos the argument.

    And i don't follow that you need a $125 mainboard for AMD; $60 boards work quite well. You generally get away with cheaper boards for AMD than for Intel, obviously, even when taking into account the somewhat higher power-handling capacity the board needs.

    The power/thermal advantage of course extends to cooling noise, and it makes sense to pay extra to keep the computer noise down. However, the CPU is just so rarely the culprit any longer, with GPU of a high-end computer being the noise-maker number one, vibrations induced by harddisk number two, and only to small extent the CPU and its thermal contribution.

    Hardly anything of the above makes Piledriver the absolute first-choice CPU, however it's not a bad choice still.

    Finally, the desktop market isn't so important, the margins are terrible. The most important bit for now for AMD is the server market. Obviously the big disadvantage vs. Intel with power consumption is there, and is generally important in server market, however with virtualization, AMD can avoid sharp performance drop-off and allow to deploy up to about 1/3rd more VMs per CPU package because of higher number of integer cores, which can offset higher power consumption per package per unit of performance. I think they're onto something there, they have a technology they use on mobile chips now which allows them to sacrifice top frequency but reduce surface area and power consumption. If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market.
    Reply
  • CeriseCogburn - Tuesday, October 30, 2012 - link

    more speculation from mr gnu
    This of course caps it all off - the utter amd fanboy blazing in our faces, once again the FANTASY FUTURE is the big amd win :

    " If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market. "

    LOL - Why perhaps you should be consulting or their next COO or CEO ?

    I'm telling you man, that is why, that is why.
    Reply
  • Kjella - Tuesday, October 23, 2012 - link

    Except the "Any CPU is fine" market isn't about $200 processors, Intel or AMD. That market is now south of $50, making them pennies with Celerons and Atoms competing with AMD's A4 series. You're not spending this kind of money on a CPU unless performance matters. Funny that you're dissing the overclockability of IVB while pushing a processor that burns 200W when overclocked; you honestly want THAT in your rig instead?

    Honestly, while this at least puts them back in the ring, it can't be that great for AMD's finances. They still have the same die size and get to raise the price of their top-level processor from $183 to $199, yay. But I guess they have to do something to try bringing non-APU sales back up; Bulldozer cannot have sold well at all. And I still fear Haswell will knock AMD out of the ring again...
    Reply
