To start, we want to thank the many manufacturers who have donated kit for our test beds in order to make this review, along with many others, possible.

Thank you to OCZ for providing us with 1250W Gold Power Supplies.
Thank you to G.Skill for providing us with memory kits.
Thank you to Corsair for providing us with an AX1200i PSU and 16GB 2400C10 memory.
Thank you to ASUS for providing us with the AMD GPUs and some IO Testing kit.
Thank you to ECS for providing us with the NVIDIA GPUs.
Thank you to Corsair for providing us with the Corsair H80i CLC.
Thank you to Rosewill for providing us with the 500W Platinum Power Supply for mITX testing, the BlackHawk Ultra, the 1600W Hercules PSU for extreme dual CPU + quad GPU testing, and RK-9100 keyboards.

Also, many thanks go to the manufacturers who over the years have provided review samples that contribute to this review.  For this Intel update we would particularly like to thank Gigabyte for loaning the Haswell and Nehalem CPUs!

Testing Methodology

In order to keep the testing fair, we put strict rules in place for each of these setups.  For every new chipset, the SSD was formatted and a fresh installation of the OS was applied.  The motherboard's chipset drivers were installed, followed by the NVIDIA and then the AMD drivers.  The games were preinstalled on a second partition, but relinked to ensure they worked properly. The games were then tested as follows:

Metro 2033: Benchmark Mode, two runs of four scenes of Frontline at 1440p, max settings.  First run of four is discarded, average of second run is taken (minus outliers).
Dirt3: Benchmark Mode, four runs of the first scene with 8 cars at 1440p, max settings.  Average is taken.
Civilization V: One five minute run of the benchmark mode accessible at the command line, at 1440p and max settings.  The benchmark reports total frames rendered in each 60-second interval; the average is taken.
Sleeping Dogs: Using the Adrenaline benchmark software, four scenes at 1440p at Ultra settings.  Average is taken.
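The per-game averaging described above (discard the warm-up run, then average the remaining results minus outliers) can be sketched as follows.  This is purely illustrative: the 1.5-standard-deviation cutoff and the sample numbers are our own assumptions, not the actual criterion used in testing.

```python
import statistics

def average_minus_outliers(fps_runs, tolerance=1.5):
    """Average a set of per-run FPS results, excluding outliers.

    Here an outlier is any run more than `tolerance` standard
    deviations away from the median of the set (an assumed cutoff,
    chosen purely for illustration).
    """
    if len(fps_runs) < 3:
        return statistics.mean(fps_runs)
    med = statistics.median(fps_runs)
    sd = statistics.stdev(fps_runs)
    kept = [r for r in fps_runs if abs(r - med) <= tolerance * sd]
    return statistics.mean(kept)

# Metro 2033 style: the first set of four scenes is a warm-up and is
# discarded; only the second set is averaged, minus outliers.
warm_up_run = [31.2, 33.8, 34.1, 33.9]   # discarded
second_run = [34.0, 33.7, 34.2, 21.5]    # 21.5: a stutter outlier
print(round(average_minus_outliers(second_run), 1))  # -> 34.0
```

Comparing against the median rather than the mean keeps a single bad run (a background task kicking in, a driver hiccup) from dragging the reference point toward itself.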

If the platform was reused for the next CPU (e.g. the Maximus V Formula when moving from the FX-8150 to the FX-8350), no reinstall was needed.  If the platform changed for the next test, a full reinstall and setup took place.

How to Read This Review

Due to the large number of different variables in our review, it is hard to accurately label each data point with all the information about that setup.  Simply stating the CPU model is also insufficient, as the same CPU could sit in two different motherboards with different GPU lane allocations.  There is also the memory to consider, as well as whether a motherboard enables MCT at stock.  Here is a set of labels correlating to the configurations you will see in this review:

CPU[+] [CP] (PCIe version – lane allocation to GPUs [PLX])

e.g. A10-5800K (2 – x16/x16): A10-5800K with two GPUs in PCIe 2.0 mode

- First is the name of the CPU, followed by an optional + identifier for MCT-enabled motherboards.
- CP indicates we are dealing with a Bulldozer-derived CPU using the Core Parking updates.
- Inside the round brackets is the PCIe version of the lanes in use, along with the lane allocation to each GPU.
- The final flag indicates whether a PLX chip is involved in lane allocation.

This is one of the more complex configurations:

i7-3770K+ (3 – x8/x8/x8/x8 PLX)

This means an i7-3770K (with MCT enabled) powering four GPUs in PCIe 3.0 via a PLX chip.
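As a rough illustration of how the notation decomposes, a label like this can be parsed mechanically.  This sketch is our own: the regular expression and the field names are assumptions for illustration, not anything used in the review itself.

```python
import re

# Hypothetical parser for the label scheme described above:
#   CPU[+] [CP] (PCIe version - lane allocation [PLX])
LABEL_RE = re.compile(
    r"^(?P<cpu>\S+?)(?P<mct>\+)?"       # CPU name, optional '+' for MCT
    r"(?:\s+(?P<cp>CP))?"               # optional Core Parking flag
    r"\s+\((?P<pcie>\d)\s*[-–]\s*"      # PCIe version (hyphen or en dash)
    r"(?P<lanes>x\d+(?:/x\d+)*)"        # lane allocation per GPU
    r"(?:\s+(?P<plx>PLX))?\)$"          # optional PLX flag
)

def parse_label(label):
    m = LABEL_RE.match(label)
    if not m:
        raise ValueError(f"unrecognised label: {label!r}")
    return {
        "cpu": m.group("cpu"),
        "mct": m.group("mct") is not None,
        "core_parking": m.group("cp") is not None,
        "pcie_version": int(m.group("pcie")),
        "lanes": m.group("lanes").split("/"),
        "plx": m.group("plx") is not None,
    }

print(parse_label("i7-3770K+ (3 - x8/x8/x8/x8 PLX)"))
```

Running this on the example above yields the CPU name, MCT flag, PCIe 3.0, four x8 lane allocations, and the PLX flag as separate fields.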

Common Configuration Points

All the system setups below have the following configuration points in common:

- A fresh install of Windows 7 Ultimate 64-bit
- Either an Intel Stock CPU Cooler, a Corsair H80i CLC or Thermalright TRUE Copper
- OCZ 1250W Gold ZX Series PSU or Corsair AX1200i PSU for single-processor (SP) systems
- Rosewill 1600W Hercules for dual-processor (DP) systems
- Up to 4x ASUS AMD HD 7970 GPUs, using Catalyst 13.1
- Up to 2x ECS NVIDIA GTX 580 GPUs, using GeForce WHQL 310.90
- SSD Boot Drives, OCZ Vertex 3 128 GB
- LG GH22NS50 Optical Drives
- Open Test Beds, either a DimasTech V2.5 EasyHard or a CoolerMaster Test Lab

CPU and Motherboard Configurations

Those listed as ‘Part 2’ are new for this update.

Part 1 A6-3650 + Gigabyte A75-UD4H + 16GB DDR3-1866 8-10-10
Part 1 A8-3850 + ASRock A75 Extreme6 + 16GB DDR3-1866 8-10-10
Part 1 A8-5600K + Gigabyte F2A85-UP4 + 16GB DDR3-2133 9-10-10
Part 1 A10-5800K + Gigabyte F2A85-UP4 + 16GB DDR3-2133 9-10-10
Part 1 X2-555 BE + ASUS Crosshair V Formula + 16GB DDR3-1600 8-8-8
Part 1 X4-960T + ASUS Crosshair V Formula + 16GB DDR3-1600 8-8-8
Part 1 X6-1100T + ASUS Crosshair V Formula + 16GB DDR3-1600 8-8-8
Part 1 FX-8150 + ASUS Crosshair V Formula + 16GB DDR3-2133 10-12-11
Part 1 FX-8350 + ASUS Crosshair V Formula + 16GB DDR3-2133 9-11-10
Part 1 FX-8150 + ASUS Crosshair V Formula + 16GB DDR3-2133 10-12-11 + CP
Part 1 FX-8350 + ASUS Crosshair V Formula + 16GB DDR3-2133 9-11-10 + CP
Part 1 E6400 + MSI i975X Platinum + 4GB DDR2-666 5-6-6
Part 1 E6700 + ASUS P965 Commando + 4GB DDR2-666 4-5-5
Part 1 Xeon X5690 + EVGA SR-2 + 6GB DDR3-1333 6-7-7
Part 1 2x Xeon X5690 + EVGA SR-2
Part 1 Celeron G465 + ASUS Maximus V Formula + 16GB DDR3-2133 9-11-11
Part 1 i5-2500K + ASUS Maximus V Formula + 16GB DDR3-2133 9-11-11
Part 1 i7-2600K + ASUS Maximus V Formula + 16GB DDR3-2133 9-11-11
Part 1 i3-3225 + ASUS Maximus V Formula + 16GB DDR3-2400 10-12-12
Part 1 i7-3770K + Gigabyte Z77X-UP7 + 16GB DDR3-2133 9-11-11
Part 1 i7-3770K + ASUS Maximus V Formula + 16GB DDR3-2400 9-11-11
Part 1 i7-3930K + ASUS Rampage IV Extreme + 16GB DDR3-2133 10-12-12
Part 1 i7-3960X + ASRock X79 Professional + 16GB DDR3-2133 10-12-12
Part 1b E6400 + ASUS P965 Commando + 4GB DDR2-666 4-5-5
Part 1b E6550 + ASUS P965 Commando + 4GB DDR2-666 5-6-6
Part 1b Q9400 + ASUS P965 Commando + 4GB DDR2-666 5-6-6
Part 1b i7-4770K + Gigabyte Z87X-UD3H + 16GB DDR3-2400 10-12-12
Part 1b i7-4770K + ASUS Z87-Pro + 16GB DDR3-2400 10-12-12
Part 1b i7-4770K + MSI Z87A-GD65 Gaming + 16GB DDR3-2400 10-12-12
Part 2 A6-5200 + ASRock IMB-A180-H + 8GB DDR3-1333 9-9-10
Part 2 Fusion E-350 + Zotac Fusion-A-E + 8GB DDR3-1066 7-7-7
Part 2 i7-4770K + MSI Z87 XPower + 16GB DDR3-2400 10-12-12
Part 2 4x E5-4650L + SuperMicro + 128GB DDR3-1600 11-11-11
Part 2 2x E5-2690 + Gigabyte GA-7PESH1 + 32GB DDR3-1600 11-11-11
Part 2 Celeron 847 + ECS NM70-I2 + 8GB DDR3-1333 9-9-9
Part 2 i7-920 + Gigabyte X58-UD9 + 6GB DDR3-1866 7-8-7
Part 2 i7-950 + Gigabyte X58-UD9 + 6GB DDR3-1866 7-8-7
Part 2 i7-990X + Gigabyte X58-UD9 + 6GB DDR3-1866 7-8-7
Part 2 i7-920 + ASRock X58 Extreme3 + 6GB DDR3-1866 7-8-7
Part 2 i7-950 + ASRock X58 Extreme3 + 6GB DDR3-1866 7-8-7
Part 2 i7-990X + ASRock X58 Extreme3 + 6GB DDR3-1866 7-8-7
Part 2 i5-4430 + Gigabyte Z87X-UD3H + DDR3-2400 10-12-12
Part 2 i7-4670K + Gigabyte Z87X-UD3H + DDR3-2400 10-12-12
Part 2 Xeon E3-1280 v3 + Gigabyte Z87X-UD3H + DDR3-2400 10-12-12
Part 2 Xeon E3-1285 v3 + Gigabyte Z87X-UD3H + DDR3-2400 10-12-12
Part 2 Via L2007 + ECS VX900-I + 8GB DDR3-1066 7-7-7

Our first port of call with all our testing is CPU throughput analysis, using our regular motherboard review benchmarks.

137 Comments


  • BrightCandle - Thursday, October 3, 2013

    So again we see tests with games that are known not to scale with more CPU cores. There are games, however, that show clear benefits; your site simply doesn't test them. It's not universally true that more cores or HT make a difference, but maybe it would be a good idea to focus on those games we know do benefit, like Metro: Last Light, Hitman: Absolution, Medal of Honor: Warfighter, and some areas of Crysis 3.

    The problem here is that it's those games that support more multithreading, so to give a true impression you need to test a much wider and more modern set of games. To do otherwise is pretty misleading.
  • althaz - Thursday, October 3, 2013

    To test only those games would be more misleading, as the vast majority of games are barely multithreaded at all.
  • erple2 - Thursday, October 3, 2013

    Honestly, if the stats for single GPUs weren't all at about the same level, this would be an issue. It isn't until you get to multiple GPUs that you start to see some differentiation. But that level begins to become very expensive very quickly. I'd posit that if you're already into multiple high-end video cards, the price difference between dual and quad core is relatively insignificant anyway, so the point is moot.
  • Pheesh - Thursday, October 3, 2013

    appreciate the review, but it seems like the choice of games and settings makes the results primarily reflect a GPU-constrained situation (1440p max settings for a CPU test?). It would be nice to see some of the newer engines which utilize more cores, as most people will be buying a CPU for titles in the future. I'm personally more interested in the delta between the CPUs when in CPU-bound situations. Early benchmarks of next-gen engines have shown larger differences between 8 threads vs 4 threads.
  • cbrownx88 - Friday, October 4, 2013

    amen
  • TheJian - Sunday, October 6, 2013

    Precisely. Also only 2% of us even own 1440p monitors and I'm guessing the super small % of us in a terrible economy that have say $550 to blow on a PC (the price of the FIRST 1440p monitor model you'd actually recognize the name of on newegg-asus model (122reviews) – and the only one with more than 12reviews) would buy anything BUT a monitor that would probably require 2 vid cards to fully utilize anyway. Raise your hand if you're planning on buying a $550 monitor instead of say, buying a near top end maxwell next year? I see no hands. Since 98% of us are on 1920x1200 or LESS (and more to the point a good 60% are less than 1920x1080), I'm guessing we are all planning on buying either a top vid card, or if you're in the 60% or so that have UNDER 1080p, you'll buy a $100-200 monitor (1080p upgrade to 22in-24in) and a $350-450 vid card to max out your game play.

    Translation: These results affect less than 2% of us and are pointless for another few years at the very least. I'm planning on buying a 1440p monitor but LONG after I get my maxwell. The vid card improves almost everything I'll do in games. The monitor only works well if I have the VID CARD muscle ALREADY. Most people of the super small 2% running 1440p or up have two vid cards to push the monitors (whatever they have). I don't want to buy a monitor and say "oh crap, all my games got super slow" for the next few years (1440p for me is a purchase once a name brand is $400 at 27in – only $150 away…LOL). I refuse to run anything but native and won't turn stuff off. I don't see the point in buying a beautiful monitor if I have to turn in into crap to get higher fps anyway... :)

    Who is this article for? Start writing articles for 98% of your readers, not 2%. Also you'll find the cpu's are far more important where that 98% is running as fewer games are gpu bound. I find it almost stupid to recommend AMD these days for cpus and that stupidity grows even more as vid cards get faster. So basically if you are running 1080p and plan to for a while look at the cpu separation on the triple cards and consider that what you'll see as cpu results. If you want a good indication of what I mean, see the first or second 1440p article here and CTRL-F my nick. I listed all the games previously that part like the red sea leaving AMD cpus in the dust (it's quite a bit longer than CIV 5...ROFL). I gave links to the benchmarks showing all those games.
    http://www.anandtech.com/comments/6985/choosing-a-...
    There's the comments section on the 2nd 1440p article for the lazy people :)

    Note that even here in this article TWO of the games aren't playable on single cards...LOL. 34fps avg in metro 2033 means you'll be hitting low 20's or worse MINIMUM. Sleeping dogs is already under 30fps AVG, so not even playable at avg fps let alone MIN fps you will hit (again TEENS probably). So if you buy that fancy new $550+ monitor (because only a retard or a gambler buys a $350 korean job from ebay etc...LOL) get used to slide shows and stutter gaming for even SINGLE 7970's in a lot of games never mind everything below sucking even more. Raise your hand if you have money for a $550 monitor AND a second vid card...ROFL. And these imaginary people this article is for, apparently should buy a $110 CPU from AMD to pair with this setup...ROFLMAO.

    REALISTIC Recommendations:
    Buy a GPU first.
    Buy a great CPU second (and don't bother with AMD unless you're absolutely broke).
    Buy that 1440p monitor if your single card is above 7970 already or you're planning shortly on maxwell or some such card. As we move to unreal 4 engine, cryengine 3.5 (or cryengine 4th gen…whatever) etc next year, get ready to feel the pain of that 1440p monitor even more if you're not above 7970. So again, this article should be considered largely irrelevant for most people unless you can fork over for the top end cards AND that monitor they test here. On top of this, as soon as you tell me you have the cash for both of those, what they heck are you doing talking $100 AMD cpus?...LOL.

    And for AMD gpu lovers like the whole anandtech team it seems...Where's the NV portal site? (I love the gpus, just not their drivers):
    http://hothardware.com/News/Origin-PC-Ditching-AMD...
    Origin AND Valve have abandoned AMD even in the face of new gpus. Origin spells it right out exactly as we already know:
    "Wasielewski offered a further clarifying statement from Alvaro Masis, one of Origin’s technical support managers, who said, “Primarily the overall issues have been stability of the cards, overheating, performance, scaling, and the amount of time to receive new drivers on both desktop and mobile GPUs.”

    http://www.pcworld.com/article/2052184/whats-behin...
    More data, nearly twice the failure rate at another vendor confirming why the first probably dropped AMD. I’d call a 5% rate bad, never mind AMD’s 1yr rate of nearly 8% failure (and nearly 9% over 3yrs). Cutting RMAs nearly in half certainly saves a company some money, never mind all the driver issues AMD still has and has had for 2yrs. A person adds a monitor and calls tech support about AMD eyefinity right? If they add a gpu they call about crossfire next?...ROFL. I hope AMD starts putting more effort into drivers, or the hardware sucks no matter how good the silicon is. As a boutique vendor at the high end surely the crossfire and multi-monitor situation affects them more than most who don't even ship high end stuff really (read:overly expensive...heh)

    Note one of the games here performs worse with 3 cards than 2. So I guess even anandtech accidentally shows AMD's drivers still suck for triple's. 12 cpus in civ5 post above 107fps with 2 7970's, but only 5 can do over 107fps with 3 cards...Talk about going backwards. These tests, while wasted on 98% of us, should have at the least been done with the GPU maker who has properly functioning drivers WITH 2 or 3 CARDS :)
  • CrispySilicon - Thursday, October 3, 2013

    What gives Anand?

    I may be a little biased here since I'm still rocking a Q6600 (albeit fairly OC'd). But with all the other high end platforms you used, why not use a DDR3 X48/P45 for S775?

    I say this because NOBODY who reads this article would still be running a mobo that old with PCIe 1.1, especially in a multi-GPU configuration.
  • dishayu - Friday, October 4, 2013

    I share your opinion on the matter, although I myself am still running a Q6600 on an MSI P965 Platinum with an AMD HD6670. :P
  • ThomasS31 - Thursday, October 3, 2013

    Please also add Battlefield 4 to the game tests in the next update(s)/2014.

    I think it will be very relevant, based on the beta experience.
  • tackle70 - Thursday, October 3, 2013

    I know you dealt with this criticism in the intro, and I understand the reasoning (consistency, repeatability, etc) but I'm going to criticize anyways...

    These CPU results are to me fairly insignificant and not worth the many hours of testing, given that the majority of cases where CPU muscle is important are multiplayer (BF3/Crysis 3/BF4/etc). As you can see even from your benchmark data, these single player scenarios just don't really care about CPU all that much - even in multi-GPU. That's COMPLETELY different in the multiplayer games above.

    Pretty much the only single player game I'm aware of that will eat up CPU power is Crysis 3. That game should at least be added to this test suite, in my opinion. I know it has no built in benchmark, but it would at least serve as a point of contact between the world of single player CPU-agnostic GPU-bound tests like these and the world of CPU-hungry multiplayer gaming.
