Gaming Benchmarks

F1 2013

First up is F1 2013 by Codemasters. I am a big Formula 1 fan in my spare time, and nothing makes me happier than carving up the field in a Caterham, waving to the Red Bulls as I drive by (because I play on easy and take shortcuts). F1 2013 uses the EGO Engine and, like other Codemasters games, ends up very playable even on old hardware. In order to beef up the benchmark a bit, we devised the following scenario for the benchmark mode: one lap of Spa-Francorchamps in the heavy wet. The benchmark follows Jenson Button in the McLaren, who starts on the grid in 22nd place, with the rest of the field made up of 11 Williams cars, 5 Marussias and 5 Caterhams in that order. This puts the emphasis on the CPU to handle the AI in the wet, and allows for a good amount of overtaking during the automated benchmark. We test at 1920x1080 on Ultra graphical settings.

Discrete SLI, Average FPS, F1 2013

The FX-9590 seems to match the i3-4360, indicating that more cores, a higher frequency and more PCIe lanes are not always a winning combination. The i3-4360 runs its cards at PCIe 3.0 x8/x8, compared to PCIe 2.0 x16/x16 for the FX-9590, which should put the two roughly equal in bandwidth.
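As a rough sanity check on that bandwidth claim, here is a back-of-the-envelope sketch using the nominal PCIe transfer rates and encoding overheads (not measured figures): PCIe 2.0 runs at 5.0 GT/s with 8b/10b encoding, while PCIe 3.0 runs at 8.0 GT/s with the more efficient 128b/130b encoding.

```python
def lane_bandwidth_mb(gen):
    """Approximate usable per-lane bandwidth in MB/s, per direction."""
    rates = {
        2: 5.0e9 * (8 / 10),     # PCIe 2.0: 5.0 GT/s, 8b/10b encoding
        3: 8.0e9 * (128 / 130),  # PCIe 3.0: 8.0 GT/s, 128b/130b encoding
    }
    return rates[gen] / 8 / 1e6  # bits/s -> bytes/s -> MB/s

# Per-card bandwidth in each SLI configuration
pcie2_x16 = 16 * lane_bandwidth_mb(2)  # FX-9590 board: PCIe 2.0 x16 per card
pcie3_x8 = 8 * lane_bandwidth_mb(3)    # i3-4360: PCIe 3.0 x8 per card

print(f"PCIe 2.0 x16: {pcie2_x16:.0f} MB/s")  # ~8000 MB/s
print(f"PCIe 3.0 x8:  {pcie3_x8:.0f} MB/s")   # ~7877 MB/s
```

So x8 lanes of PCIe 3.0 come within about 2% of x16 lanes of PCIe 2.0, which is why the two setups should be effectively interchangeable for bandwidth.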

Bioshock Infinite

Bioshock Infinite was Zero Punctuation’s Game of the Year for 2013, uses the Unreal Engine 3, and is designed to scale with both cores and graphical prowess. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Discrete SLI, Average FPS, Bioshock Infinite

Again, the FX-9590 trades blows with the Haswell i3.

Tomb Raider

The next benchmark in our test is Tomb Raider, an AMD-optimized game lauded for its use of TressFX to create dynamic hair and increase in-game immersion. Tomb Raider uses a modified version of the Crystal Engine, and enjoys raw horsepower. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Discrete SLI, Average FPS, Tomb Raider

Tomb Raider has historically been CPU agnostic, with all the latest CPUs performing similarly.

Sleeping Dogs

Sleeping Dogs is a benchmarking wet dream – a highly complex benchmark that can bring the toughest setup down to single-digit frame rates at high resolutions. Having an extreme SSAO setting can do that, but at the right settings Sleeping Dogs is highly playable and enjoyable. We run the basic benchmark program laid out in the Adrenaline benchmark tool with the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Discrete SLI, Average FPS, Sleeping Dogs

The FX-9590 loses 4-6 FPS on average to the latest Intel cohort, which is not bad considering the release date difference.

Company of Heroes 2

Company of Heroes 2 can also bring a top-end GPU to its knees, even at very basic benchmark settings. Getting an average of 30 FPS from a normal GPU is a challenge, let alone a minimum frame rate of 30 FPS. For this benchmark I use modified versions of Ryan's batch files at 1920x1080 on High. COH2 is a little odd in that it does not scale with more GPUs with the drivers we use.

Discrete SLI, Average FPS, Company of Heroes 2

Company of Heroes 2 is usually relatively CPU agnostic except for the older dual core CPUs, but the FX-9590 gets a win here.

Battlefield 4

The EA/DICE series that has taken countless hours of my life away is back for another iteration, using the Frostbite 3 engine. AMD is also piling its resources into BF4 with the new Mantle API for developers, designed to cut the time required for the CPU to dispatch commands to the graphical sub-system. For our test we use the in-game benchmarking tools and record the frame time for the first ~70 seconds of the Tashgar single player mission, which is an on-rails generation and rendering of objects and textures. We test at 1920x1080 at Ultra settings.

Discrete SLI, Average FPS, Battlefield 4

Similar to Sleeping Dogs, the FX-9590 does not lose much considering the release date difference of the architectures.

Comments

  • Budburnicus - Wednesday, January 14, 2015 - link

    Umm you KNOW that even stock clocked that the i7-3770K is better and faster in EVERY way, than the 9590 OCed to the max, right?

    You also know that while you saved, literally, a couple bucks on BUYING your hardware - you are going to spend, comparatively, hundreds of dollars more running it for even 2 years!

    Oh then there is the effect that the HUGE power draw has on components - mobos, PSUs, video cards, RAM - because ALL of it gets affected by the insane heat - and certainly ALL of it will be affected if you short your PSU!

    And an i7-3770K at stock frequencies outperforming this POS FX 9590 - is NOT synthetic! That is real world PROVEN speed! NTM, you can EASILY hit 4.7 GHz on any SandyBridge chip - which will not only yield MUCH better performance, but will suck less power and be more reliable as well! And you aren't even going to have to spend that much to buy a good Z77 board and an i5/i7 2500k/2600k (ASRock Z77 Extreme 4 runs right around $100 right now, and is FAR from any budget board, and in fact has more features than ANY FX board that could run this 220W POS!)

    So pat yourself on the back, you saved a few bucks on hardware! BUT you completely sacrificed ALL performance, and ANY reasonable upgradability! Also, you will end up paying FAR more than you saved in power costs! (I am sure your power company will thank you for choosing an AMD space heater for a CPU!)

    Well on that note, you may save a few bucks on heating, given you live somewhere that gets colder than 50 degrees (Fahrenheit) at some point in the year anyway.
  • Budburnicus - Wednesday, January 14, 2015 - link

    LMAO! Cult - you are such an AMD fanboy NOOB! It is not a cult when PERFORMANCE and EFFICIENCY are the deciding factors!

    And again, you saved a few bucks when buying your already outdated POS AMD space heater - just wait til that power bill comes!

    And if you seriously cannot afford an i5/i7 K processor, even a Sandy or Ivy Bridge, and a Z77 or newer chipset, what the hell good is faster ram and an SSD going to do you? None, unless your workload is entirely composed of highly multi threaded compression/encoding etc.

    Have fun with your 220W space heater that has to compete against i3 CPUS! LMFAO!
  • Budburnicus - Saturday, January 10, 2015 - link

    Just another instance of AMD CHUGGING power and crapping performance! Combine this FLOP of a chip (220 W TDP - sweet Jesus! At STOCK) with an R9 290X (300 W TDP, again holy sh*t!) and you are up to over 500 watts TDP!

    Compare this to an Intel i7-5930K (140 W TDP) paired with a GTX 970 (148 W TDP) and you are ONLY at 288!

    Not ONLY that, but the Intel is faster BY FAR at stock speeds, as well as the GTX 970, which - while costing only $30 more - provides ~10%-25% higher frame rates in basically EVERY benchmark!

    I just feel sad for AMD anymore... I am guessing they are too busy with owning ATI and TRYING to compete with SandyBridge and newer Intel products (My i7-2600K beats this FX chip at STOCK speeds in most benchmarks - and basically ALL gaming benchmarks! And that is not even mentioning that the 2600K easily hits 4.4 GHz on any decent mobo/chipset! Then there is the fact that the 2600K is THREE years old, only 95 W TDP - which will NEVER go above 125 even at the 4.7 GHz/102.3 Bclock OC I run!)

    AMD should have realized this Chip's Architecture was DOA with the first PileDriver CPUs falling FAR behind the Phenom 2 1100Ts! And even now the 1100Ts generally have better gaming performance!

    The REAL question is, WHY? Why have they not dropped this design and brought us a new one? I mean they could try it with limited releases to test it at VERY least, but I hear no word whatsoever about AMD being anywhere close to a completely new chip design!

    I was a staunch "AMD Fan-boy" back in the Pentium 4/Athlon XP days! They WERE far better! Also back then ATI could actually compete in gaming!

    Now? AMD is only good for budget gaming builds - parts like this FX chip are just about pointless - apart from people who already own a good socket AM3+ mobo. But buying this chip for a brand new build? That would be a HORRIBLE idea! Only the biggest fans of AMD would waste such money and power..

    And AMD VideoCards - yes they have better compute performance - so yeah, if you are still GPU mining new Crypto's (Like VertCoin's Lyra2RE Algorithm) - buy an AMD GPU, but 2 or 3 if that is your goal! But even then, apparently the HD 7000 series are STILL the best miners, as they do not consume INSANE amounts of power and do NOT run at a "SAFE 95 deg C" (AMD's quote on R9 290X operating temp!) So, whereas Nvidia and Intel move forward with less power consumption, cooler temps, and better performance where it matters, it REALLY seems AMD is taking steps backwards!
  • SviatA - Friday, October 30, 2015 - link

    So this motherboard basically suits those who will overclock the processor, graphics and RAM.
    Honestly, I don't know why anyone would purchase an AM3+-based motherboard when we only have to wait eight months for the AMD Zen processors, which are (at least on paper) much better than FX. So, I am thinking about a new motherboard and a new processor. Since Doom 4 will come out next year, I will have to get something better than my current configuration, which is based on the ASRock 970 R2.0 (BTW, this is a pretty good MB, have bought it here - http://hardware.nl/asrock/970-extreme3r20.html almost two years ago and happy with it)
  • Beljim - Saturday, January 9, 2016 - link

    Do not buy from Asrock. You will be on your own.

    I bought an Extreme9 just before Christmas 2015. Went to RAID my 2 Crucial 512GB SSDs and the computer would not see them. Called Asrock tech support and they told me certain makes of SSD are not compatible with Extreme9 boards. I explained that the SSDs were much older and the Asrock board should be backwards compatible. He gave me a short list of drives that were compatible and said I'd have to buy new drives. Asrock took no responsibility and was in no way helpful.
  • paradonym - Wednesday, February 17, 2016 - link

    Where's the described M.2 slot? Ctrl+F'ing the manual for M.2 doesn't turn up any mention of it.
