Gaming Performance

AMD clearly states in its reviewer's guide that CPU bound gaming performance isn't going to be a strong point of the FX architecture, likely due to its poor single-threaded performance. However, it is useful to look at both CPU and GPU bound scenarios to paint an accurate picture of how well a CPU handles game workloads, as well as the sort of performance you can expect in present-day titles.

Civilization V

Civ V's LateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance.

Civilization V—1680 x 1050—DX11 High Quality

While we're GPU bound in the full render score, AMD's platform appears to have a bit of an advantage here. We've seen this in the past where one platform will hold an advantage over another in a GPU bound scenario, and it's always tough to explain. Within each family, however, there's no advantage to a faster CPU; everything is simply GPU bound.

Civilization V—1680 x 1050—DX11 High Quality

Looking at the no render score, the CPU standings are pretty much as we'd expect. The FX-8150 is thankfully a bit faster than its predecessors, but it still falls behind Sandy Bridge.

Crysis: Warhead

Crysis Warhead Assault Benchmark—1680 x 1050 Mainstream DX10 64-bit

In CPU bound environments in Crysis Warhead, the FX-8150 is actually slower than the old Phenom II. Sandy Bridge continues to be far ahead.

Dawn of War II

Dawn of War II—1680 x 1050—Ultra Settings

We see similar results under Dawn of War II. Lightly threaded performance is simply not a strength of AMD's FX series, and as a result even the old Phenom II X6 pulls ahead.

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of both CPU bound and GPU bound performance. First, the CPU bound settings:

DiRT 3—Aspen Benchmark—1024 x 768 Low Quality

The FX-8150 doesn't do so well here, again falling behind the Phenom IIs. Under more realistic GPU bound settings, however, Bulldozer looks just fine:

DiRT 3—Aspen Benchmark—1920 x 1200 High Quality

Dragon Age

Dragon Age Origins—1680 x 1050—Max Settings (no AA/Vsync)

Dragon Age is another CPU bound title; here the FX-8150 falls behind once again.

Metro 2033

Metro 2033 is pretty rough even at lower resolutions, but with more of a GPU bottleneck the FX-8150 equals the performance of the 2500K:

Metro 2033 Frontline Benchmark—1024 x 768—DX11 High Quality

Metro 2033 Frontline Benchmark—1920 x 1200—DX11 High Quality

Rage vt_benchmark

While id's long awaited Rage doesn't exactly lend itself to benchmarking, there is one unique aspect of the game that we can test: MegaTexture. MegaTexture works by dynamically streaming texture data from disk and constructing texture tiles for the engine to use, a key component in allowing id's developers to uniquely texture the game world. However, because of the heavy use of unique textures (id says the original game assets total over 1TB), id needed to get creative in compressing the game's textures to make them fit within the roughly 20GB the game was allotted (a reduction of roughly 50:1).

The result is that Rage doesn't store textures in a GPU-usable format such as DXTC/S3TC, instead storing them in an even more compressed format (JPEG XR), as S3TC maxes out at a 6:1 compression ratio. As a consequence, whenever a texture is loaded, Rage needs to transcode it from its storage codec to S3TC on the fly. This happens constantly throughout the game, and the transcoding places a significant burden on the CPU.
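To give a feel for the kind of work the transcoder's output format involves, here is a minimal sketch (our own illustration, not id's code) of packing one 4x4 RGB tile into a 64-bit DXT1/S3TC block, the layout GPUs can sample directly. The endpoint selection is deliberately naive (per-channel min/max); real encoders search much harder, but the block format is the same:

```python
import struct

def rgb565(r, g, b):
    """Pack 8-bit RGB into the 16-bit 5:6:5 format DXT1 endpoints use."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def encode_dxt1_block(pixels):
    """Compress a 4x4 tile (16 (r,g,b) tuples) into one 8-byte DXT1 block."""
    lo = tuple(min(p[c] for p in pixels) for c in range(3))
    hi = tuple(max(p[c] for p in pixels) for c in range(3))
    c0, c1 = rgb565(*hi), rgb565(*lo)
    if c0 == c1:                      # flat tile: every index selects c0
        return struct.pack('<HHI', c0, c1, 0)
    # 4-entry palette: the two endpoints plus two interpolated colors
    palette = [hi, lo,
               tuple((2 * hi[c] + lo[c]) // 3 for c in range(3)),
               tuple((hi[c] + 2 * lo[c]) // 3 for c in range(3))]
    indices = 0
    for i, p in enumerate(pixels):    # 2 bits per texel, LSB-first
        best = min(range(4),
                   key=lambda j: sum((p[c] - palette[j][c]) ** 2
                                     for c in range(3)))
        indices |= best << (2 * i)
    return struct.pack('<HHI', c0, c1, indices)

# 16 RGB texels (48 bytes) always become one 8-byte block
tile = [(255, 0, 0)] * 8 + [(0, 0, 255)] * 8
block = encode_dxt1_block(tile)
```

The fixed 48-to-8-byte mapping is where the 6:1 ceiling mentioned above comes from: no matter how compressible the content, each 24-bit RGB tile occupies exactly 8 bytes.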

The Benchmark: vt_benchmark flushes the transcoded texture cache and then times how long it takes to transcode all the textures needed for the current scene, from 1 thread to X threads. Thus when you run vt_benchmark 8, for example, it will benchmark from 1 to 8 threads (the default appears to depend on the CPU you have). Since transcoding is done by the CPU this is a pure CPU benchmark. I present the best case transcode time at the maximum number of concurrent threads each CPU can handle:

Rage vt_benchmark—1920 x 1200
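The sweep vt_benchmark performs can be approximated in a few lines: run an identical CPU-bound workload at every worker count from 1 to N and record the wall time for each. This is an illustrative harness, not id's tool; a zlib round-trip stands in for the JPEG XR to S3TC transcode, and it scales across threads in CPython only because zlib releases the GIL on large buffers:

```python
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

def transcode(blob):
    """Stand-in for one texture transcode: decode the stored form,
    then re-encode it (zlib here; Rage uses JPEG XR -> S3TC)."""
    return zlib.compress(zlib.decompress(blob), 9)

def sweep(max_threads, tiles):
    """Time the same workload at every worker count from 1 to
    max_threads, mirroring what `vt_benchmark N` reports."""
    results = {}
    for n in range(1, max_threads + 1):
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=n) as pool:
            list(pool.map(transcode, tiles))
        results[n] = time.perf_counter() - start
    return results

tiles = [zlib.compress(bytes(range(256)) * 2048) for _ in range(32)]
for n, t in sorted(sweep(4, tiles).items()):
    print(f"{n} thread(s): {t * 1000:.1f} ms")
```

The best-case number each CPU posts in the chart corresponds to the fastest entry in such a sweep, which normally lands at the CPU's maximum concurrent thread count.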

The FX-8150 does very well here, but so does the Phenom II X6 1100T. Both are faster than Intel's 2500K, but not quite as good as the 2600K. If you want to see how performance scales with thread count, check out the chart below:

Starcraft 2

Starcraft 2

Starcraft 2 has traditionally done very well on Intel architectures and Bulldozer is no exception to that rule.

World of Warcraft

World of Warcraft

430 Comments


  • TekDemon - Wednesday, October 12, 2011 - link

    Yeah I paid $179 for my i5 2500K and it hums along at 4.8GHz (can hit 5GHz+ but I wanted to keep the voltages reasonable). Clock for clock Bulldozer is slower, since it's only competitive when the higher-clocked part is compared to a stock 2500K.
  • jleach1 - Friday, October 21, 2011 - link

    Their cores offer, what, 75% of the speed of a normal core?

    The fact is, this supposed "8" core processor performs worse than AMD's own 6 core processor. There's no way we can get away with calling it an 8 OR a 6 core.

    For all intents and purposes, it's a quad core.
  • estarkey7 - Wednesday, October 12, 2011 - link

    You took the words right out of my mouth! I am a big AMD fanboy, and I was waiting with bated breath to jump on the Bulldozer bandwagon for my next rig (and I probably still will). But this is ridiculous! I'm a computer engineer, and where the hell were the simulations, AMD? Seems like you could have halved the L3 and kept the extra FP resources in, and been better off than you are now.

    Also, don't bitch about Windows 7 not recognizing Bulldozer's architecture. You knew that 18 months ago, so you should have been writing a patch so that would have been a non-issue.

    The absolutely, positively only reason I will buy an FX-8150 is that my current desktop is a dual core Athlon running at 2.2GHz. So to me, the performance increase over my current desktop would be massive. But on second thought, if I've stuck with such a slow system this long, I might wait another 3-5 months for Piledriver.
  • Taft12 - Wednesday, October 12, 2011 - link

    <i>The power consumption is absolutely through the roof -- unacceptable for 32nm, really!</i>

    Uhh, you did see the bar graph for idle power usage, right? And keep in mind this is an 8-core CPU compared to 4- and 6-core competitors.

    Like you, I'm also very interested in the 4- and 6-core Bulldozers. Anand let us down by only reviewing the flagship Llano. Hopefully he doesn't do the same with Bulldozer.
  • Tom Womack - Wednesday, October 12, 2011 - link

    Yes, the idle power is significantly worse than either of the Sandy Bridge platforms he's comparing it to.
  • JasperJanssen - Wednesday, October 12, 2011 - link

    What Anand reviews is mostly down to what AMD will let him have -- even sites the size of Anandtech don't simply get to call and order parts from a catalogue for review samples.
  • Taft12 - Wednesday, October 12, 2011 - link

    AMD doesn't have much control over "review samples" that can be purchased at retail, as the A4-3300 et al. have been for weeks now
  • enterco - Wednesday, October 12, 2011 - link

    I read that at 1920x1200/1080 gaming performance depends much more on the GPU. Anyway, I'm happy with my i5-2500K ;-), Bulldozer does not seem worth the wait.
  • ninjaquick - Wednesday, October 12, 2011 - link

    Blame shitty game developers.
  • AssBall - Wednesday, October 12, 2011 - link

    Kinda what I was thinking. When they're all developing games for a 6 year old 3 core PowerPC system with 512MB RAM (Xbox) instead of a computer, it's no bloody wonder.
