Gaming Performance

AMD states plainly in its reviewer's guide that CPU-bound gaming performance isn't going to be a strong point of the FX architecture, likely due to its poor single-threaded performance. However, it is useful to look at both CPU- and GPU-bound scenarios to paint an accurate picture of how well a CPU handles game workloads, as well as what sort of performance you can expect in present-day titles.

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance.

Civilization V—1680 x 1050—DX11 High Quality

While we're GPU bound in the full render score, AMD's platform appears to hold a slight advantage here. We've seen this in the past, where one platform holds an edge over another in a GPU-bound scenario, and it's always tough to explain. Within each family, however, a faster CPU offers no advantage; everything is simply GPU bound.

Civilization V—1680 x 1050—DX11 High Quality

Looking at the no render score, the CPU standings are pretty much as we'd expect. The FX-8150 is thankfully a bit faster than its predecessors, but it still falls behind Sandy Bridge.

Crysis: Warhead

Crysis Warhead Assault Benchmark—1680 x 1050 Mainstream DX10 64-bit

In CPU bound environments in Crysis Warhead, the FX-8150 is actually slower than the old Phenom II. Sandy Bridge continues to be far ahead.

Dawn of War II

Dawn of War II—1680 x 1050—Ultra Settings

We see similar results under Dawn of War II. Lightly threaded performance is simply not a strength of AMD's FX series, and as a result even the old Phenom II X6 pulls ahead.

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of both CPU-bound and GPU-bound performance. First, the CPU-bound settings:

DiRT 3—Aspen Benchmark—1024 x 768 Low Quality

The FX-8150 doesn't do so well here, again falling behind the Phenom IIs. Under more real-world, GPU-bound settings, however, Bulldozer looks just fine:

DiRT 3—Aspen Benchmark—1920 x 1200 High Quality

Dragon Age

Dragon Age Origins—1680 x 1050—Max Settings (no AA/Vsync)

Dragon Age is another CPU-bound title, and here the FX-8150 falls behind once again.

Metro 2033

Metro 2033 is pretty rough even at lower resolutions, but with more of a GPU bottleneck the FX-8150 equals the performance of the 2500K:

Metro 2033 Frontline Benchmark—1024 x 768—DX11 High Quality

Metro 2033 Frontline Benchmark—1920 x 1200—DX11 High Quality

Rage vt_benchmark

While id's long awaited Rage doesn't exactly lend itself to benchmarking, there is one unique aspect of the game that we can test: MegaTexture. MegaTexture works by dynamically pulling texture data from disk and constructing texture tiles for the engine to use, which is what allows id's artists to uniquely texture the game world. However, because of the heavy use of unique textures (id says the original game assets total over 1TB), id needed to get creative in compressing the game's textures to fit them within the roughly 20GB the game was allotted.

The result is that Rage doesn't store textures in a GPU-usable format such as DXTC/S3TC; instead it stores them in an even more heavily compressed format (JPEG XR), since S3TC maxes out at a 6:1 compression ratio. As a consequence, whenever Rage loads a texture it must transcode that texture from its storage codec to S3TC on the fly. This happens constantly throughout the game, and the transcoding places a significant burden on the CPU.
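The transcode-on-load idea can be sketched in a few lines. This is a minimal illustrative model, not id's actual code: textures live on disk in a high-compression storage codec and are converted to a GPU block format the first time a scene needs them, with a cache so each tile is only transcoded once. All names here (`TextureCache`, `toy_transcode`, the tile IDs) are hypothetical.

```python
# Minimal sketch (assumption: not id's code) of transcode-on-load with a
# cache. The "transcode" here is a toy stand-in for a JPEG XR -> S3TC
# conversion; only the caching behavior is the point.
class TextureCache:
    def __init__(self, transcode_fn):
        self.transcode = transcode_fn  # storage-codec bytes -> GPU-codec bytes
        self.cache = {}

    def get(self, tile_id, stored_bytes):
        # Transcode on first use; later lookups hit the cache.
        if tile_id not in self.cache:
            self.cache[tile_id] = self.transcode(stored_bytes)
        return self.cache[tile_id]

    def flush(self):
        # vt_benchmark flushes this cache so every tile must be redone.
        self.cache.clear()


calls = {"n": 0}

def toy_transcode(data):
    # Placeholder for the real, CPU-heavy decode + S3TC encode.
    calls["n"] += 1
    return data[::-1]

cache = TextureCache(toy_transcode)
tile = cache.get("rock_01", b"jpegxr-bytes")        # miss: transcodes
tile_again = cache.get("rock_01", b"jpegxr-bytes")  # hit: no work done
```

Flushing the cache and re-requesting the same tiles forces the transcode to run again, which is exactly what makes the benchmark below repeatable.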

The benchmark: vt_benchmark flushes the transcoded texture cache and then times how long it takes to transcode all of the textures needed for the current scene, using anywhere from 1 to X threads. Thus when you run vt_benchmark 8, for example, it will benchmark with 1 through 8 threads (the default appears to depend on the CPU you have). Since transcoding is done entirely by the CPU, this is a pure CPU benchmark. I present the best-case transcode time at the maximum number of concurrent threads each CPU can handle:
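The measurement loop described above can be sketched as follows: run the same batch of work at 1..N worker threads, time each run, and report the best (shortest) wall-clock time. Everything here (function names, the toy workload) is illustrative, not id's actual benchmark code.

```python
# Sketch of a vt_benchmark-style scaling run: time a fixed batch of
# "transcode" work at every thread count from 1..max_threads and keep
# the best result. Workload and names are hypothetical stand-ins.
import time
from concurrent.futures import ThreadPoolExecutor

def transcode_tile(tile):
    # Stand-in for one CPU-heavy storage-codec -> S3TC tile transcode.
    return sum(b ^ 0x5A for b in tile)

def run_at_thread_count(tiles, n_threads):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(transcode_tile, tiles))
    return time.perf_counter() - start

def scale_benchmark(tiles, max_threads):
    # Like `vt_benchmark 8`: measure 1 through max_threads and
    # return {thread_count: elapsed_seconds}.
    return {n: run_at_thread_count(tiles, n) for n in range(1, max_threads + 1)}

tiles = [bytes(range(256)) for _ in range(64)]
results = scale_benchmark(tiles, 4)
best = min(results, key=results.get)  # thread count with the best time
```

One caveat on the sketch: CPython's GIL prevents pure-Python work from actually scaling across threads, so this only demonstrates the harness shape; a real engine does the transcode in native threads, where more cores genuinely shorten the best-case time.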

Rage vt_benchmark—1920 x 1200

The FX-8150 does very well here, but so does the Phenom II X6 1100T. Both are faster than Intel's 2500K, but not quite as good as the 2600K. If you want to see how performance scales with thread count, check out the chart below:

Starcraft 2

Starcraft 2

Starcraft 2 has traditionally done very well on Intel architectures and Bulldozer is no exception to that rule.

World of Warcraft

World of Warcraft


  • Saxie81 - Wednesday, October 12, 2011 - link

    Ouch.... Not looking good. :S

    Thanks for the reply, again great review!!
  • velis - Wednesday, October 12, 2011 - link

    Ignoring the power consumption it seems to me that @4.6GHz it should start being quite competitive.
    So can we expect base clocks to rise once significant volume of these chips starts getting out and GloFo refines the process?
    I also must admit I didn't expect 2 bn transistors. All the time AMD was bragging about how much they saved and then we get this behemoth. No wonder they have process issues. Such big chips always do.
  • cfaalm - Wednesday, October 12, 2011 - link

    Well it is an 8-core, not a 4-core. 2x 995M (Sandy Bridge 4C) is almost 2B, though I am sure the multiply isn't exactly correct. A lot of it depends on the L3/L2 amounts. The savings seem to be minimal.

    I am still confused about why they so deliberately chose to go with a relatively low single thread performance. My main application is multithreaded, but since it's such a mixed bag overall I am pretty unsure if this will be my next CPU, unless I get to see convincing Cubase 6 benchies. For an FX moniker it needs to perform better than this anyway.

    I'll throw in a lyric from The Fixx
    "It doesn't mean much now, it's built for the future."
  • TekDemon - Wednesday, October 12, 2011 - link

    Wow, no wonder they say you need water cooling or better to go 5Ghz+.
  • enterco - Wednesday, October 12, 2011 - link

    AMD should send a developer team to CryTek to help them release a patch able to use more cores :)
  • medi01 - Wednesday, October 12, 2011 - link

    Uhm, what about other numbers?
  • IlllI - Wednesday, October 12, 2011 - link

    this might be the final nail in the coffin. We might have to wait longer for it to be competitive? People have literally been waiting for -years- for amd to catch up.
    probably by the time Piledriver (or whatever it'll be called) comes out, IB will be out (and amd will be even further behind intel)

    btw I think tomshardware tested it with windows 8 and it was still a turd.

    I seriously hope you can get some answers/reasons why amd released such a woeful product. Maybe this was why dirk was fired? All I know is after 7+ years of amd, my next processor will be intel
  • Ushio01 - Wednesday, October 12, 2011 - link

    Desktop CPUs are halo parts and as such are irrelevant. It's the server and OEM laptop CPUs where AMD needs to perform, and AMD's server share just keeps dropping.
  • lyeoh - Wednesday, October 12, 2011 - link

    Thing is I wouldn't want to use them in my servers: http://us.generation-nt.com/answer/patch-x86-amd-c...

    FWIW when the Athlon64s first came out, we bought a bunch of them, they were not bad, but there were clock issues - the TSCs weren't synchronized. So had to set idle=poll (and thus using more watts).

    You can blame the OS developers, but most people buy new hardware to run existing operating systems and programs on, not future unreleased ones.

    It sure is looking bad for them. I won't be buying AMD CPUs but I hope the fanboys keep them alive ;).
  • OCedHrt - Wednesday, October 12, 2011 - link

    "Other than the 8150, only the quad-core FX processors are able to exceed the 3.3GHz clock speed of the Phenom II X6 1100T."

    The 6 core FX is also clocked higher?
