Gaming Benchmarks

F1 2013

First up is F1 2013 by Codemasters. I am a big Formula 1 fan in my spare time, and nothing makes me happier than carving up the field in a Caterham, waving to the Red Bulls as I drive by (because I play on easy and take shortcuts). F1 2013 uses the EGO Engine, and like other Codemasters games it ends up being very playable on older hardware. To beef up the benchmark a bit, we devised the following scenario for the benchmark mode: one lap of Spa-Francorchamps in the heavy wet. The benchmark follows Jenson Button in the McLaren, who starts on the grid in 22nd place, with the field made up of 11 Williams cars, 5 Marussias and 5 Caterhams in that order. This puts the emphasis on the CPU to handle the AI in the wet, and allows for a good amount of overtaking during the automated benchmark. We test at 1920x1080 on Ultra graphical settings.

Discrete SLI, Average FPS, F1 2013

The FX-9590 roughly matches the i3-4360, indicating that more cores, more frequency and more PCIe lanes are not always an advantage. The i3-4360 runs the cards at PCIe 3.0 x8/x8, compared to PCIe 2.0 x16/x16 for the FX-9590, which should give the two platforms roughly equal bandwidth.
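As a rough sanity check, the equal-bandwidth claim can be sketched from the standard PCIe line rates and encoding overheads (back-of-the-envelope figures, not measured values):

```python
# Per-direction PCIe bandwidth estimate from line rate and encoding
# overhead: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, while
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding.
RATES = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # gen -> (GT/s, efficiency)

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable bandwidth in GB/s for one direction."""
    gt_per_s, efficiency = RATES[gen]
    return gt_per_s * efficiency / 8 * lanes  # /8 converts bits to bytes

print(pcie_bandwidth_gbs(2, 16))  # FX-9590: PCIe 2.0 x16, ~8.0 GB/s
print(pcie_bandwidth_gbs(3, 8))   # i3-4360: PCIe 3.0 x8,  ~7.9 GB/s
```

Both configurations work out to about 8 GB/s per card in each direction, which is why the Gen 3 x8/x8 and Gen 2 x16/x16 arrangements end up nearly interchangeable here.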

Bioshock Infinite

Bioshock Infinite was Zero Punctuation’s Game of the Year for 2013, uses the Unreal Engine 3, and is designed to scale with both cores and graphical prowess. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Discrete SLI, Average FPS, Bioshock Infinite

Again, the FX-9590 trades blows with the Haswell i3.

Tomb Raider

The next benchmark in our test is Tomb Raider. Tomb Raider is an AMD optimized game, lauded for its use of TressFX to create dynamic hair and increase in-game immersion. Tomb Raider uses a modified version of the Crystal Engine, and enjoys raw horsepower. We test the benchmark using the Adrenaline benchmark tool at the Xtreme (1920x1080, Maximum) performance setting, noting down the average and minimum frame rates.

Discrete SLI, Average FPS, Tomb Raider

Tomb Raider has historically been CPU agnostic, with all the latest CPUs performing similarly.

Sleeping Dogs

Sleeping Dogs is a benchmarking wet dream – a highly complex benchmark that can bring the toughest setup down to single-digit frame rates at high resolutions. Having an extreme SSAO setting can do that, but at the right settings Sleeping Dogs is highly playable and enjoyable. We run the basic benchmark program laid out in the Adrenaline benchmark tool at the Xtreme (1920x1080, Maximum) performance setting, noting down the average and minimum frame rates.

Discrete SLI, Average FPS, Sleeping Dogs

The FX-9590 loses 4-6 FPS on average to the latest Intel cohort, which is not bad considering the release date difference.

Company of Heroes 2

Company of Heroes 2 can also bring a top-end GPU to its knees, even at very basic benchmark settings. Getting an average of 30 FPS on a normal GPU is a challenge, let alone a minimum frame rate of 30 FPS. For this benchmark I use modified versions of Ryan's batch files at 1920x1080 on High. COH2 is a little odd in that it does not scale to multiple GPUs with the drivers we use.

Discrete SLI, Average FPS, Company of Heroes 2

Company of Heroes 2 is usually relatively CPU agnostic except for the older dual core CPUs, but the FX-9590 gets a win here.

Battlefield 4

The EA/DICE series that has taken countless hours of my life away is back for another iteration, using the Frostbite 3 engine. AMD is also piling its resources into BF4 with the new Mantle API for developers, designed to cut the time required for the CPU to dispatch commands to the graphical sub-system. For our test we use the in-game benchmarking tools and record the frame times for the first ~70 seconds of the Tashgar single-player mission, an on-rails sequence of object and texture generation and rendering. We test at 1920x1080 at Ultra settings.
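Since this test records frame times rather than an on-screen FPS counter, the average FPS is derived as frames rendered over total elapsed time. A minimal sketch (the function name is ours, not part of any logging tool):

```python
def average_fps(frame_times_ms):
    """Average FPS = number of frames / total elapsed seconds.

    Note this is not the mean of per-frame instantaneous FPS values,
    which would over-weight the fastest frames in the run.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# 50 frames taking 20 ms each is exactly one second of footage:
print(average_fps([20.0] * 50))  # -> 50.0
```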

Discrete SLI, Average FPS, Battlefield 4

Similar to Sleeping Dogs, the FX-9590 does not lose much considering the release date difference of the architectures.


146 Comments


  • just4U - Monday, August 11, 2014 - link

    I think Coolermaster makes it.. Not bad, not great.. You'd be better served getting the variant without liquid cooling (I think..) and then deciding on your own what you need.
  • Natfly - Tuesday, August 12, 2014 - link

    Garbage....you can't polish a turd.
  • The_Riddick - Wednesday, August 13, 2014 - link

    These processors really need to be running at below Intel wattage in order to be competitive. Even if they tried to sell me one of these CPUs for $10 I wouldn't buy one. 220W and it performs worse than an i5, no thanks.
  • TiGr1982 - Wednesday, August 13, 2014 - link

    Well, this same FX Piledriver certainly can run below Intel wattage (say, around 70 W for the CPU itself), but only at no more than 2.5-3.0 GHz frequency - like Opterons 6300 do.
    Then, it won't make a lot of sense on the desktop either :)
  • eanazag - Friday, August 15, 2014 - link

    AMD misunderstood me.

    Before this product was initially released as an OEM part I had posted on an AMD article that I would be interested in a 200 W APU, not a CPU. I wouldn't mind an APU that could clock the GPU and CPU outrageously. I have a 300 W video card plus an Intel CPU drawing 105 W, even though the CPU is supposed to be 95 W. So a 200 W APU that comes close to both of those is a cost savings if it will clock down while idle. Plus space and heat savings.

    Evidently they opted to do it with this CPU instead. This is not totally bad, but between the pricing, performance, chipset features, and efficiency versus the 8350 and the Intel parts it is really tough to justify. I saw numbers in there for workloads where the Haswell i3 outperforms the 9590. I bought the Phenom 9600 with the errata and the CPU was fine, but I am no longer willing to go that far in loyalty to AMD on the CPU side. They would do better to just port the 8350/9590 silicon to their 28 nm process. So what if it takes a clock regression, as it will also get a TDP drop. They really need to do a better job updating their chipsets; this is less forgivable than their CPU line. That old, crappy 9590 would look better with a new chipset (PCIe 3 at least).
  • 0ldman79 - Monday, August 18, 2014 - link

    I've been running AMD in my desktops as a primary since the K6-2 (and K6-3+ mobile in a desktop if anyone remembers that gem).

    That being said, I have to call the current generation AMD's version of the Pentium 4. I have an FX6300 in my main gaming PC. While it does well, it just isn't up to par with Intel's offerings.

    I've been keeping an eye out for the next version of the AM3+ performance line and found that I've pretty much got it.

    Uh... what?

    AMD's flagship performance socket, AM3, has pretty much been dropped completely with all focus towards hot dual cores (seriously, that is what they are) with some rather nice integrated graphics. While I've sold several of these to my business customers I'm seriously considering jumping to Intel for my rigs.

    The biggest reason is I've always had an upgrade path with AMD. It was always easy to keep building a new AMD as I'd have a couple of generations of CPU available to a platform and some of my parts from the previous system would cross. It was rarely ever a 100% replacement, more a long term evolution.

    My next system will likely require a new motherboard to replace what is to me a fairly new board. With AMD effectively dropping the AM3 line just after I got onboard, I've got a sour taste in my mouth.

    Those Core i7 are looking better. AMD has done this to themselves.

    They took their IT customers, those that tell everyone else what to buy, told them about this awesome new CPU on the AM3 platform, the ultimate of the Bulldozer line, walked them out blindfolded for the big reveal, then walked away. We're standing there in a field of nothing with a blindfold on looking like jackasses.

    That is what I think of AMD's current roadmap.
  • Cryio - Tuesday, August 19, 2014 - link

    The absolutely BEST game for CPU benchmarking remains Crysis 3.

    I don't know why they use Tomb Raider and Company of Heroes which both are CPU agnostic. Not to mention that F1 series just hates AMD CPUs for whatever reason.

    Games that really use the CPU: Crysis 3, Hitman Absolution, Assassin's Creed IV (I think). GRID 2, or really any mainline DIRT games. Hell, even Watch Dogs.

    If any of those games that know how to properly use more than 2-4 cores were tested then this AMD beast would wipe the floor with those i3s.
  • Budburnicus - Wednesday, January 14, 2015 - link

    LMAO!

    Just laughing at "This AMD "beast" would wipe the floor with those i3s." - You used beast to describe AMD's flagship product, and then talked about proper software coding allowing it to wipe the floor - wiiith i3s! LOL! just LOL!

    I really am not sure if you were being sarcastic or actually meant that, but I hope it was sarcasm!
  • nctritech - Monday, October 20, 2014 - link

    I just got an FX-9590 and an ASUS M5A99FX PRO R2.0 motherboard to go with it. I closely examined available Intel options and chose this chip. Most of the comments here put down this chip and AMD because Intel has higher-performing options and most of those comments are completely missing one vital factor: PRICE. I got this combo for $355 tax + shipping. Even if you go with the previous generation of Intel's flagship CPU, the i7-3770K, Newegg has them TODAY for $330. Hmm, that's almost as much as I paid for the FX with a brand new motherboard! Same story for the i7-4770K at $335.

    I walked away during a CPU sale special paying $220 total for the FX-9590 chip. It's faster in video compression benchmarks than EVERY desktop Intel chip EXCEPT the X-series i7 chips. It runs with or near the 3770K and 4770K in almost every other benchmark, possibly excluding games.

    For those of you jeering at "efficiency" and praising how much faster Intel's Haswell chips can be, I wish you the best...but I'll be able to get an SSD, better RAM, or a nicer graphics card because I have $100 extra in my pocket, all while enjoying roughly the same performance as the Intel chips you've formed a cult around. Best of all, there's no LGA socket with extremely fragile pins to void my warranty; you know, when you return a mobo and they refuse to honor your return because "user-caused CPU socket pin damage" even though it was sent back because a nearby defective power component visibly burned up. Plus, did you know that CPUs only use their TDP worth of heat when you're taxing them to the maximum constantly? Who knew?!

    You can have your lower performance-per-currency-unit chips and theoretical efficiency, I'll take the best overall deal, thanks!
  • Jinx50 - Sunday, November 2, 2014 - link

    I agree enthusiasts are rarely concerned with power consumption. I quote a user I encountered who stated "My Haswell is more energy efficient" – meanwhile they have a picture of their rig with 3 Titans running in 3-way SLI for their avatar. Pure derp, and grasping desperately for the one and only straw they have in regards to downplaying the FX lineup. It's obviously neither price nor performance, nor multitasking for that matter.

    I've just become accustomed to tuning them out like annoying kids at the pub.
