Gaming Performance

AMD clearly states in its reviewer's guide that CPU bound gaming performance isn't going to be a strong point of the FX architecture, likely due to its poor single threaded performance. However, it is useful to look at both CPU and GPU bound scenarios to paint an accurate picture of how well a CPU handles game workloads, as well as what sort of performance you can expect in present day titles.

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance.

Civilization V—1680 x 1050—DX11 High Quality

While we're GPU bound in the full render score, AMD's platform appears to have a bit of an advantage here. We've seen this in the past, where one platform will hold an advantage over another in a GPU bound scenario, and it's always tough to explain. Within each family, however, there is no advantage to a faster CPU; everything is just GPU bound.

Civilization V—1680 x 1050—DX11 High Quality

Looking at the no render score, the CPU standings are pretty much as we'd expect. The FX-8150 is thankfully a bit faster than its predecessors, but it still falls behind Sandy Bridge.

Crysis: Warhead

Crysis Warhead Assault Benchmark—1680 x 1050 Mainstream DX10 64-bit

In CPU bound environments in Crysis Warhead, the FX-8150 is actually slower than the old Phenom II. Sandy Bridge continues to be far ahead.

Dawn of War II

Dawn of War II—1680 x 1050—Ultra Settings

We see similar results under Dawn of War II. Lightly threaded performance is simply not a strength of AMD's FX series, and as a result even the old Phenom II X6 pulls ahead.

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of CPU bound and GPU bound performance. First, the CPU bound settings:

DiRT 3—Aspen Benchmark—1024 x 768 Low Quality

The FX-8150 doesn't do so well here, again falling behind the Phenom IIs. Under more real world GPU bound settings however, Bulldozer looks just fine:

DiRT 3—Aspen Benchmark—1920 x 1200 High Quality

Dragon Age

Dragon Age Origins—1680 x 1050—Max Settings (no AA/Vsync)

Dragon Age is another CPU bound title; here the FX-8150 falls behind once again.

Metro 2033

Metro 2033 is pretty rough even at lower resolutions, but with more of a GPU bottleneck the FX-8150 equals the performance of the 2500K:

Metro 2033 Frontline Benchmark—1024 x 768—DX11 High Quality

Metro 2033 Frontline Benchmark—1920 x 1200—DX11 High Quality

Rage vt_benchmark

While id's long awaited Rage doesn't exactly have the best benchmarking abilities, there is one unique aspect of the game that we can test: MegaTexture. MegaTexture works by dynamically pulling texture data from disk and constructing texture tiles for the engine to use, a major component in allowing id's developers to uniquely texture the game world. However, because of the heavy use of unique textures (id says the original game assets are over 1TB), id needed to get creative in compressing the game's textures to make them fit within the roughly 20GB the game was allotted.

The result is that Rage doesn't store textures in a GPU-usable format such as DXTC/S3TC, instead storing them in an even more compressed format (JPEG XR), as S3TC maxes out at a 6:1 compression ratio. As a consequence, whenever you load a texture, Rage needs to transcode it from its storage codec to S3TC on the fly. This is a constant process throughout the entire game, and the transcoding is a significant burden on the CPU.
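That 6:1 ceiling falls straight out of the block arithmetic. A minimal sketch (assuming DXT1, the opaque-RGB flavor of S3TC): each 4x4 block of 24-bit RGB texels (48 bytes) is stored as two 16-bit endpoint colors plus 32 two-bit palette indices (8 bytes), which works out to exactly 6:1.

```python
# DXT1 (S3TC) block arithmetic: why the ratio tops out at 6:1 for RGB data.
texels_per_block = 4 * 4                              # S3TC compresses 4x4 blocks
uncompressed_bytes = texels_per_block * 3             # 24-bit RGB -> 48 bytes
endpoint_bytes = 2 * 2                                # two 16-bit endpoint colors
index_bytes = (texels_per_block * 2) // 8             # 2-bit index per texel -> 4 bytes
compressed_bytes = endpoint_bytes + index_bytes       # 8 bytes per block
ratio = uncompressed_bytes / compressed_bytes         # 48 / 8 = 6.0
```

A format like JPEG XR has no such fixed block budget, which is how id gets well past 6:1 at the cost of a CPU-side transcode on load.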

The benchmark: vt_benchmark flushes the transcoded texture cache and then times how long it takes to transcode all the textures needed for the current scene, from 1 thread to X threads. Thus when you run vt_benchmark 8, for example, it will benchmark from 1 to 8 threads (the default appears to depend on the CPU you have). Since transcoding is done by the CPU, this is a pure CPU benchmark. I present the best case transcode time at the maximum number of concurrent threads each CPU can handle:
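The shape of that harness is easy to picture. The sketch below mimics vt_benchmark in Python (this is not id's actual code): it runs the same fixed pile of per-tile transcode work at every thread count from 1 to N and records wall-clock time for each pass. The transcode_texture stand-in is hypothetical busy work, and note that CPython's GIL prevents pure-Python CPU work from scaling across threads, so this illustrates the measurement structure rather than the speedups a native transcoder would see.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def transcode_texture(tile_id):
    """Hypothetical stand-in for transcoding one texture tile from its
    storage codec to S3TC; pure CPU busy work with a stable result."""
    acc = 0
    for i in range(20_000):
        acc = (acc + tile_id * i) % 65521
    return acc

def vt_benchmark(max_threads, num_tiles=64):
    """Mimic 'vt_benchmark N': time the identical workload at every
    worker count from 1 to max_threads and return {threads: seconds}."""
    times = {}
    for threads in range(1, max_threads + 1):
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=threads) as pool:
            # Consume the iterator so all tiles are actually transcoded.
            list(pool.map(transcode_texture, range(num_tiles)))
        times[threads] = time.perf_counter() - start
    return times

results = vt_benchmark(4)
best_threads = min(results, key=results.get)  # thread count with best time
```

The reported score is then simply the minimum time across thread counts, which is why CPUs with more hardware threads get more chances to post a better number.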

Rage vt_benchmark—1920 x 1200

The FX-8150 does very well here, but so does the Phenom II X6 1100T. Both are faster than Intel's 2500K, but not quite as good as the 2600K. If you want to see how performance scales with thread count, check out the chart below:

Starcraft 2

Starcraft 2

Starcraft 2 has traditionally done very well on Intel architectures and Bulldozer is no exception to that rule.

World of Warcraft

World of Warcraft

430 Comments

  • Iketh - Thursday, October 13, 2011 - link

    I play FSX and Starcraft 2... both require copious processing power.

    And I just built an i7-2600K system with a radeon 6870 and blu ray writer for.... $960
  • paultaylor - Thursday, October 13, 2011 - link

    While the benchmarks are very revealing of the "ahead of its time" nature of Bulldozer, I think AMD should've kicked off by focusing on server applications instead of desktop ones.

    Considering what I've seen so far I think some additional benchmarks on threading/scaling would come in handy – it would actually show the true nature of BD as, right now, it’s behaving like a quad-core processor (due to the shared nature of its architecture, I presume) in most cases, rather than an octacore. Charting that out might be very revealing. The situation now looks like Intel's 2nd (3rd?) generation hyperthreading quad-cores provide more efficient multithreading than 8 physical cores on an AMD FX.

    Don’t get me wrong, we’ve heard from the beginning that BD will be optimised for server roles, but then we’re outside the feedback loop. Shouldn’t someone inside AMD be minding the store and making sure the lower shelves are also stocked with something we want?

    A longer pipeline and the old “we’ll make it up in MHz” line reek of Netburst, unfortunately, and we all know how that ended. Looking at the transistor count, it’s got almost twice as many as Gulftown, with a 27% bigger die size for the entire CPU… which will mean poorer yields and higher costs for AMD, not to mention that either the fabbing process is really being tweaked or the speed bumps will not come at all, as the TDP is already high-ish. Ironically it reminds me of Fermi. Speaking of which… BD may become the punchline of many jokes like “What do you get when you cross a Pentium 4 and a Fermi?”

    On the other hand, it seems AMD has managed one small miracle: their roadmaps will become more predictable (a good thing from a business perspective), and that will exert a positive influence on system integrators. Planning products ahead of the game, in particular in this 12-month cycle, might do some good for AMD, if they survive the overall skepticism that BD is currently "enjoying".

    Other than that, another fine unbiased article.
  • rickcain2320 - Thursday, October 13, 2011 - link

    Bulldozer/Zambezi seems to look more like a server CPU repackaged as a consumer grade one. Excellent in heavily threaded apps, not so hot in single threads.

    One CPU that is promised but isn't here is the FX-4170. I would have liked to see some benchmarks on it.
  • gvaley - Thursday, October 13, 2011 - link

    We all get that. The problem is, with this power consumption, it can't make it into the server space either.
  • kevith - Thursday, October 13, 2011 - link

    Having waited so long for this, it's a bit disappointing when I compare price/performance.

    I went from a C2D E7300 a couple of years ago to an Athlon II X2 250, and the performance difference made me regret it right away.

    And now, I have to change my MB and memory to DDR3 no matter what I choose, Intel or AMD. So I've looked forward to this release.

    And it makes my choice very easy: I'll go back to Intel, no more AMD for me on the CPU side. And Ivy Bridge is coming, and will definitely smoke AMD.

    Which is sad, it would have been nice with some competition.
  • eccl1213 - Thursday, October 13, 2011 - link

    Earlier this week most sites reported that the FX and BD-based Opteron 4200 and 6200 were both being released on Oct 12th.

    But I haven't found a single review site with Interlagos benchmarks.

    Have those parts been further delayed? We know revenue shipment happened a while back but I'm not seeing any mention of them in the wild yet.
  • xtreme762 - Thursday, October 13, 2011 - link

    I haven't bought an Intel chip since 1997. But with this BS bulldozer launch, that is now going to change! amd should be ashamed of themselves. I for one will now sell all of my amd stock and purchase Intel. I will probably only end up with a few shares, but at this point, I cannot see supporting liars and fakes. And I will NEVER buy an amd product again, not a video card, cpu, mobo, not nothing! What a disappointment amd is.....
    All the amd crap I have will be tossed in the trash. I'm not even going to bother trying to sell it. WooHoo amd made a world record OC with a cpu not worth its weight in dog poo!
  • connor4312 - Thursday, October 13, 2011 - link

    Very interesting review. I'd be interested to see Bulldozer's benchmarks when it's overclocked, which, if I am correct, is higher than any Intel CPU can go. AMD seems to have made a turnaround in this aspect - Intel CPUs were historically more overclock-able.
  • Suntan - Thursday, October 13, 2011 - link

    As always, a very detailed review. But what about the capability of the "value" chips? Namely, is it worth it to spend around $100 to replace an Athlon X4 with an FX-4100?

    There are a number of us that picked up the X4 a couple years back for its low-cost ability to encode and do general NLE editing of video. Is it worthwhile to replace that chip with the FX-4100 in our AM3+ mobos? And what kind of improvements will there be?

    As you rightly stated, a lot of us are attracted to AMD for their bang-for-buck. Just because the industry as a whole wants to bump up prices endlessly, there are still a lot of us that like to see good comparisons of the performance of CPUs available for around 1 Benjamin.

    -Suntan
  • Pipperox - Thursday, October 13, 2011 - link

    Frankly, it seems to me that the disappointment of AMD fans is quite excessive.
    Worst CPU ever?
    What was then Barcelona, which couldn't compete with Core 2?

    Bulldozer, setting aside old single threaded applications, slots in between a Core i5 2500 and a Core i7 2600K.

    Which other AMD CPU outperforms a Core i7 2600K in any single benchmark?

    A higher clocked Thuban with 2 extra cores would have been hotter and more expensive to produce.

    Setting aside AMD's stupid marketing, the AMD FX-8150 is a very efficient QUAD core.
    The performance per core is almost as good as Sandy Bridge, in properly threaded applications.

    Then they came up with the marketing stunt of calling it an 8 core... it's not; in fact it doesn't have 8 COMPLETE cores. In terms of processing units, an 8 core Bulldozer is very close to a Sandy Bridge QUAD core.

    The only reason why Bulldozer's die is so large is the enormous amount of cache, which I'm sure makes sense only in the server environment, while the low latency / high bandwidth cache of Sandy Bridge is much more efficient for common applications.

    I think with Bulldozer AMD has put down a good foundation for the future: today, on the desktop, there is no reason not to buy a Sandy Bridge (however I'm expecting Bulldozer's street price to drop pretty quickly).

    However IF AMD is able to execute the next releases at the planned pace (+10-15% IPC in 2012 and going forward every year) THEN they'll be back in the game.
