Last year's launch of AMD's FX processors was honestly disappointing. The Bulldozer CPU cores bundled into each Zambezi chip were hardly power efficient, and in many areas they couldn't significantly outperform AMD's previous generation platform. Looking beyond the direct AMD comparison, the situation was even worse. In our conclusion to last year's FX-8150 review I wrote the following:

"Single threaded performance is my biggest concern, and compared to Sandy Bridge there's a good 40-50% advantage the i5 2500K enjoys over the FX-8150. My hope is that future derivatives of the FX processor (perhaps based on Piledriver) will boast much more aggressive Turbo Core frequencies, which would do wonders at eating into that advantage."

The performance advantage Intel enjoyed at the time was beyond what could be erased in a single generation. To make matters worse, before AMD could rev Bulldozer, Intel had already begun shipping Ivy Bridge, a part that not only increased performance but decreased power consumption as well. It's been a rough road for AMD these past few years, but you have to give credit where it's due: we haven't seen AMD execute this consistently in quite a while. As promised, we've now had multiple generations of each platform ship from AMD. Brazos got a mild update, Llano paved the way for Trinity, which is now shipping, and around a year after Zambezi's launch we have Vishera: the Piledriver-based AMD FX successor.

At a high level, Vishera swaps out Zambezi's Bulldozer cores and replaces them with Piledriver. This is the same CPU core used in Trinity, but it's optimized for a very different purpose here. While Trinity had to play nicely in a laptop, Vishera is strictly a high-end desktop/workstation part. There's no on-die GPU, for starters, and both clock speeds and TDPs are up compared to Trinity.

CPU Specification Comparison

CPU                           Process  Cores  Transistors  Die Size
AMD Vishera (8C)              32nm     8      1.2B         315mm2
AMD Zambezi (8C)              32nm     8      1.2B         315mm2
Intel Ivy Bridge (4C)         22nm     4      1.4B         160mm2
Intel Sandy Bridge E (6C)     32nm     6      2.27B        435mm2
Intel Sandy Bridge E (4C)     32nm     4      1.27B        294mm2
Intel Sandy Bridge (4C)       32nm     4      1.16B        216mm2
Intel Lynnfield (4C)          45nm     4      774M         296mm2
Intel Sandy Bridge (2C, GT1)  32nm     2      504M         131mm2
Intel Sandy Bridge (2C, GT2)  32nm     2      624M         149mm2

Vishera is still built on the same 32nm GlobalFoundries SOI process as Zambezi, which means there isn't much room for additional architectural complexity without ballooning die area, and not a whole lot of hope for significantly reducing power consumption. Now a fabless semiconductor company, AMD is at GF's mercy when it comes to moving process technology forward; it simply has to make 32nm work for now. Piledriver is a light evolution of Bulldozer, so die area doesn't grow substantially over the previous generation, and cache sizes remain unchanged as well. These chips are obviously much larger than Intel's 22nm Ivy Bridge parts, but Intel has a full node advantage that enables that.
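The node gap is easy to quantify from the spec table above. As a quick sketch (using nothing beyond the table's own numbers), transistor density works out as follows:

```python
# Transistor density (transistors per mm^2) from the spec table above.
chips = {
    "AMD Vishera (32nm)":      (1.2e9, 315),  # (transistors, die size in mm^2)
    "Intel Ivy Bridge (22nm)": (1.4e9, 160),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # millions of transistors per mm^2
    print(f"{name}: {density:.2f}M transistors/mm^2")
```

Ivy Bridge packs roughly 2.3x as many transistors into each square millimeter, which is exactly the kind of lead a full process node buys.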

Piledriver is a bit more power efficient than Bulldozer, which enables AMD to drive Vishera's frequency up while remaining in the same thermal envelope as Zambezi. The new lineup is in the table below:

CPU Specification Comparison

Processor    Codename  Cores  Base Clock  Max Turbo  L2/L3 Cache  TDP   Price
AMD FX-8350  Vishera   8      4.0GHz      4.2GHz     8MB/8MB      125W  $199
AMD FX-8150  Zambezi   8      3.6GHz      4.2GHz     8MB/8MB      125W  $183
AMD FX-8320  Vishera   8      3.5GHz      4.0GHz     8MB/8MB      125W  $169
AMD FX-8120  Zambezi   8      3.1GHz      4.0GHz     8MB/8MB      125W  $153
AMD FX-6300  Vishera   6      3.5GHz      4.1GHz     6MB/8MB      95W   $132
AMD FX-6100  Zambezi   6      3.3GHz      3.9GHz     6MB/8MB      95W   $112
AMD FX-4300  Vishera   4      3.8GHz      4.0GHz     4MB/4MB      95W   $122
AMD FX-4100  Zambezi   4      3.6GHz      3.8GHz     4MB/4MB      95W   $101

The table above says it all. TDPs haven't changed, cache sizes haven't changed, and neither have core counts. Across the board Vishera ships at higher base frequencies than the equivalent Zambezi part, though the 8-core parts keep the same max turbo frequency. The 6- and 4-core versions get boosts to both base and turbo clocks, again without increasing TDP. In our Trinity notebook review I characterized the new CPU core as a tuned Bulldozer; the table above supports that characterization.
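Since the generational gains are mostly frequency, it's worth putting numbers on them. A quick sketch of the per-tier base-clock uplift, using only the clocks from the table above:

```python
# Base-clock uplift per tier: (Vishera GHz, Zambezi GHz) from the table above.
tiers = {
    "FX-8350 vs FX-8150": (4.0, 3.6),
    "FX-8320 vs FX-8120": (3.5, 3.1),
    "FX-6300 vs FX-6100": (3.5, 3.3),
    "FX-4300 vs FX-4100": (3.8, 3.6),
}

for tier, (new, old) in tiers.items():
    uplift = (new / old - 1) * 100
    print(f"{tier}: +{uplift:.1f}% base clock")
```

The uplift ranges from about 5.6% to 12.9%, all delivered within unchanged TDPs; any performance gain beyond that has to come from Piledriver's core tweaks.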

It's also important to note that AMD's pricing this time around is far more sensible. While the FX-8150 debuted at $245, the FX-8350 drops that to $199, putting it around $40 less than the Core i5-3570K. The chart below shows where AMD expects all of these CPUs to do battle:

AMD's targets are similar to what they were last time: Intel's Core i5 and below. All of the FX processors remain unlocked and ship fully featured with hardware AES acceleration enabled. Most Socket-AM3+ motherboards on the market today should support the new parts with nothing more than a BIOS update. In fact, I used the same ASUS Crosshair V Formula motherboard I used last year (with a much newer BIOS) for today's review.

The Test

For more comparisons be sure to check out our performance database: Bench.

Motherboard:         ASUS Maximus V Gene (Intel Z77)
                     ASUS Crosshair V Formula (AMD 990FX)
Hard Disk:           Intel X25-M SSD (80GB)
                     Crucial RealSSD C300
                     OCZ Agility 3 (240GB)
                     Samsung SSD 830 (512GB)
Memory:              4 x 4GB G.Skill Ripjaws X DDR3-1600 9-9-9-20
Video Card:          ATI Radeon HD 5870 (Windows 7)
                     NVIDIA GeForce GTX 680 (Windows 8)
Desktop Resolution:  1920 x 1200
OS:                  Windows 7 x64 / Windows 8 Pro x64

General Performance
241 Comments

  • CeriseCogburn - Tuesday, October 30, 2012 - link

    Funny how the same type of thing could be said in the video card wars, but all those amd fanboys won't say it there !

    Isn't that strange, how the rules change, all for poor little crappy amd the loser, in any and every direction possible, even in opposite directions, so long as it fits the current crap hand up amd needs to "get there" since it's never "beenthere". LOL
    Reply
  • whatthehey - Tuesday, October 23, 2012 - link

    We've heard all of this before, and while much of what you say is true, and ignoring the idiotic "Windoze" comments not to mention the tirade on "evil Intel", Anand sums it up quite clearly:

    Vishera performance isn't terrible but it's not great either. It can beat Intel in a few specific workloads (which very few people will ever run consistently), but in common workloads (lightly threaded) it falls behind by a large margin. All of this would be fine, were it not for the fact that Vishera basically sucks down a lot of power in comparison to Ivy Bridge and Sandy Bridge. Yes, that's right: even at 32nm with Sandy Bridge, Intel beats Vishera hands down.

    If we assume Anand's AMD platform is a bit heavy on power use by 15W (which seems kind as it's probably more like 5-10W extra at most), then we have idle power slightly in Intel's favor but load power favors Intel by 80W. 80W in this case is 80% more power than the Intel platform, which means AMD is basically using a lot more energy just to keep up (and the Sandy Bridge i5-2500K uses about 70W less).

    So go ahead and "save" all that money with your performance-for-dollar champion where you spend $200 on the CPU, $125 on the motherboard (because you still need a good motherboard, not some piece of crap), coming to $325 total for the core platform. Intel i5-3570K goes for $220 most of the time (e.g. Amazon), but you can snag it for just $190 (plus $10 shipping) from MicroCenter right now. As for motherboards, a decent Z77 motherboard will also set you back around $125.

    So if we go with a higher class Intel motherboard, pay Newegg pricing on all parts, and go with a cheaper (lower class) AMD motherboard, we're basically talking $220 for the FX-8350 (price gouging by Newegg), $90 for a mediocre Biostar 970 chipset motherboard, and a total of $310. If we go Intel it's $230 for the i5-3570K, and let's go nuts and get the $150 Gigabyte board, bringing us to $380. You save $70 in that case (which is already seriously biased since we're talking high-end Gigabyte vs. mainstream Biostar).

    Now, let's just go with power use of 60W Intel vs. 70W AMD, and if you never push the CPUs you only would spend about $8.75 extra per year leaving the systems on 24/7. Turn them off most of the day (8 hours per day use) and we're at less than $3 difference in power costs per year. Okay, fine, but why get a $200+ CPU if you're going to be idle and power off 2/3 of the day?

    Let's say you're an enthusiast (which Beenthere obviously tries to be, even with the heavy AMD bias), so you're playing games, downloading files, and doing other complex stuff where your PC is on all the time. Hell, maybe you're even running Linux with a server on the system, so it's both loaded moderately to heavily and powered on 24/7! That's awesome, because now the AMD system uses 80W more power per day, which comes out to $70 in additional power costs per year. Oops. All of your "best performance-for-the-dollar" make believe talk goes out the window.
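    For what it's worth, the dollar figures in this comment line up if you assume an electricity rate of roughly $0.10/kWh (a rate the comment never actually states); a quick Python check:

```python
# Yearly electricity cost of a constant extra power draw.
# Assumes ~$0.10/kWh -- the comment above never states a rate.
def annual_cost(extra_watts, hours_per_day, rate_per_kwh=0.10):
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(annual_cost(10, 24))  # 10W idle delta, on 24/7: ~8.76 (the "$8.75 per year" figure)
print(annual_cost(10, 8))   # 10W delta, 8h/day: ~2.92 (the "less than $3" figure)
print(annual_cost(80, 24))  # 80W load delta, on 24/7: ~70.08 (the "$70" figure)
```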

Even in the areas where AMD leads (e.g. x264), it does so by a small to moderate margin while using almost twice as much power. x264 is 26% faster on the FX-8350 than on the i5-3570K, but if you keep your system for even two years you could buy the i7-3770K (the FX is only 3% faster in that case) and come out ahead in terms of overall cost.

The only reason to get the AMD platform is if you run a specific workload where AMD is faster (e.g. x264), or if you're going budget and buying the FX-4300 and you don't need performance. Or if you're a bleeding heart liberal with some missing brain cells that thinks that supporting one gigantic corporation (AMD) makes you a good person while supporting another even more gigantic corporation (Intel) makes you bad. Let's not use products from any of the largest corporations in the world in that case, because every one of them is "evil and law violating" to some extent. Personally, I'm going to continue shopping at Walmart and using Intel CPUs until/unless something clearly better comes along.
    Reply
  • DarkXale - Tuesday, October 23, 2012 - link

I would also add in the cost of a power supply that's at least 100W beefier, the cost of better cooling (either via better/more fans or a better case), and the 'cost' of having a system with a higher noise profile.
    Reply
  • Finally - Tuesday, October 23, 2012 - link

    That talk suffers from the same inability to consider any other viewpoint but that of the hardware fetishist.

    If you are fapping to benchmarks in your free time you are the 1%.
    The other 99% couldn't care less which company produced their CPU, GPU or whatever is working the "magic" inside their PC.
    Reply
  • dananski - Tuesday, October 23, 2012 - link

I agree with you but stopped reading at "uses 80W more power per day" because you have ruined your trustworthiness with unit fail. Reply
  • CeriseCogburn - Tuesday, October 30, 2012 - link

    Hey idiot, he got everything correct except saying 80W more every second of the day, and suddenly, you the brilliant critic, no doubt, discount everything else.
    Well guess what genius - if you can detect an error, and that's all you got, HE IS LARGELY CORRECT, AND EVEN CORRECT ON THE POINT concerning the unit error you criticized.
    So who the gigantic FOOL is that completely ruined their own credibility by being such a moronic freaking idiot parrot, that no one should pay attention to ?
    THAT WOULD BE YOU, DUMB DUMB !

    Here's a news flash for all you skum sucking doofuses : Just because someone gets some minor grammatical or speech perfection issue written improperly, THEY DON'T LOSE A DAMN THING AND CERTAINLY NOT CREDIBILITY WHEN YOU FRIKKIN RETARDS CANNOT PROVE A SINGLE POINT OF THE MANY MADE INCORRECT !

It really would be nice if you babbling idiots stopped doing it. But you do it because it's stupid, it's irritating, it's incorrect, and you've seen a hundred other jerk offs like yourself pull that crap, and you just cannot resist, because that's all you've got, right ?

    LOL - now you may complain about caps.
    Reply
  • Siana - Thursday, October 25, 2012 - link

It looks like the extra 10W in the idle test could be largely or solely due to the mainboard. There is no clear evidence of whether, or to what extent, the new AMD part draws more power than Intel at idle.

A high-end CPU at low utilization (mostly idle time) is in fact a very useful and common case. For example, as a software developer, I spend most of my time reading and writing code (idle) or testing the software (utilization: 15-30% CPU, effectively two cores tops). In between, however, software needs to be compiled, and this is unproductive time which I'd like to keep as short as possible, so I am inclined to choose a high-end CPU. For the GCC compiler on Linux, the new AMD platform beats any i5 and a Sandy Bridge i7, but is a bit behind an Ivy Bridge i7.

The same goes for a person who does video editing: they will have a lot of low-utilization time too, simply because there's no batch job their system could perform most of the time. The CPU isn't going to be the limiting factor while editing, but when a batch job does run, usually h264 export, they may also see an advantage from AMD.

In fact, of every task I can think of (3D production, image editing, sound and music production, etc.), I just cannot name one with an average CPU utilization of more than 50%, so I think your figure of an 80Wh/day disadvantage for AMD is pretty much unobtainable.

And oh, no one in their right mind runs an internet-facing server on their desktop computer, for a variety of good reasons. While Linux is easy to use as a server even at home, it ends up a limited-scope, local server, and again the utilization will be very low. However, you are much less likely to be bothered by the services you're providing thanks to the sheer number of integer cores. In case you're wondering: to saturate a well-managed Linux server built from up-to-date desktop components, no connection you can get at home will be sufficient, so it makes sense to co-locate your server at a datacenter or rent theirs. Datacenters go to great lengths not to be connected to a single point (which in your case is your ISP) but to have low-latency connections to many Internet nodes, so the servers can be used efficiently.

    As for people who don't need a high end system, AMD offers better on-die graphics accelerator and at the lower end, the power consumption difference isn't gonna be big in absolute terms.

And oh, "downloading files" doesn't count as "complex stuff"; it's a very low CPU utilization task, though I don't think this changes much apropos the argument.

And I don't follow that you need a $125 mainboard for AMD; $60 boards work quite well. You generally get away with cheaper boards for AMD than for Intel, even taking into account the somewhat higher power-handling capacity the board needs.

Intel's power/thermal advantage of course extends to cooling noise, and it makes sense to pay extra to keep computer noise down. However, the CPU is rarely the culprit any longer: the GPU of a high-end computer is noise-maker number one, vibrations induced by the hard disk number two, and the CPU and its thermal contribution only a small factor.

Hardly anything of the above makes Piledriver the absolute first-choice CPU, but it's not a bad choice either.

Finally, the desktop market isn't so important; the margins are terrible. The most important market for AMD right now is servers. Obviously the big power-consumption disadvantage vs. Intel is there too, and power generally matters in the server market, but with virtualization AMD can avoid a sharp performance drop-off and deploy up to about a third more VMs per CPU package thanks to the higher number of integer cores, which can offset the higher power consumption per package per unit of performance. I think they're onto something there: they have a technology they use on mobile chips now which allows them to sacrifice top frequency but reduce surface area and power consumption. If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market.
    Reply
  • CeriseCogburn - Tuesday, October 30, 2012 - link

    more speculation from mr gnu
    This of course caps it all off - the utter amd fanboy blazing in our faces, once again the FANTASY FUTURE is the big amd win :

    " If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market. "

    LOL - Why perhaps you should be consulting or their next COO or CEO ?

    I'm telling you man, that is why, that is why.
    Reply
  • Kjella - Tuesday, October 23, 2012 - link

Except the "Any CPU is fine" market isn't about $200 processors, Intel or AMD. That market is now south of $50, making pennies with Celerons and Atoms competing with AMD's A4 series. You're not spending this kind of money on a CPU unless performance matters. Funny that you're dissing the overclockability of IVB while pushing a processor that burns 200W when overclocked; you honestly want THAT in your rig instead.

Honestly, while this at least puts them back in the ring, it can't be that great for AMD's finances. They still have the same die size and only get to raise the price of their top-level processor from $183 to $199, yay. But I guess they have to do something to bring non-APU sales back up; Bulldozer cannot have sold well at all. And I still fear Haswell will knock AMD out of the ring again...
    Reply
