Haswell Update:

Because we have only managed to get hold of the top Haswell processor thus far, it is a little difficult to see exactly where Haswell lies.  On the face of it, Haswell is more than adequate in our testing scenario for a single GPU experience and will perform as well as a mid-range CPU.  It is when you start moving up to more GPUs, more demanding games and higher resolutions that the big boys start to take control.

On almost all fronts, the i7-4770K is the preferred chip over anything Sandy Bridge-E: if not by virtue of its single threaded speed, then by virtue of the price difference.  Sandy Bridge-E is still there if you need the raw CPU horsepower for other tasks.

Our analysis also shows that without the proper configuration in the BIOS, having a GPU at PCIe 2.0 x1 is really bad for scaling.  On the ASUS Z87 Pro, the third full-length PCIe slot is at x1 bandwidth, as it shares the four PCIe lanes from the chipset with other controllers on board – if it is moved up to PCIe 2.0 x4, then the other controllers are disabled.  Nonetheless, scaling at either PCIe 2.0 x1 or x4 cannot compete with a proper PCIe 3.0 x8/x4/x4 setup.
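If you want to verify what bandwidth each card has actually negotiated, rather than trusting the slot label, the kernel exposes it directly.  The sketch below is Linux-specific and illustrative only (the sysfs paths are standard kernel attributes, but the formatting helper is our own):

```python
# Illustrative sketch (assumes a Linux system): the negotiated PCIe link
# width and speed of each device are exposed via sysfs, so you can check
# whether a GPU has actually trained at x1, x4, x8 or x16.
from pathlib import Path


def describe_link(width: str, speed: str) -> str:
    """Format a sysfs link width/speed pair into a readable string."""
    return f"x{width.strip()} @ {speed.strip()}"


def scan_pcie_links(sysfs_root: str = "/sys/bus/pci/devices"):
    """Yield (device address, link description) for devices reporting a link."""
    for dev in sorted(Path(sysfs_root).glob("*")):
        width_file = dev / "current_link_width"
        speed_file = dev / "current_link_speed"
        if width_file.exists() and speed_file.exists():
            yield dev.name, describe_link(width_file.read_text(),
                                          speed_file.read_text())


if __name__ == "__main__":
    for name, link in scan_pcie_links():
        print(name, link)
```

A card sitting in that third full-length slot should report `x1` here unless the BIOS option has been flipped, at which point it will read `x4` (and the shared chipset controllers disappear).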

Over the course of Haswell, we will update the results as we get hold of PLX enabled motherboards for some of those x8/x8/x8/x8 layouts, not to mention the weird looking PCIe 3.0 x8/x4/x4 + PCIe 2.0 x4 layouts seen on a couple of motherboards in our Z87 motherboard preview.

As mentioned in our last Gaming CPU testing, the results show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith.  Performing a comparative test while misleading the audience by not understanding how things work underneath is a poor game to play.  Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs the best.  Civilization V was the sole exception; it also has issues scaling when you add more GPUs if you do not have the most expensive CPUs on the market.  For Civilization V, I would suggest sticking with a single GPU and trying to get the best out of it.

In Dirt 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup.  Moving up in GPUs, Dirt 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to lean towards AMD GPUs, and Sleeping Dogs was agnostic.

Above three GPUs, the extra horsepower from the single thread performance of an Intel CPU started to make a difference, with as much as a 70 FPS gap in Dirt 3.  Sleeping Dogs was also starting to become sensitive to CPU choice.

We Know What Is Missing

As it has only been a month or so since the last Gaming CPU update, and with my hands deep in Haswell testing, new CPUs have not been streaming through the mail.  However, due to suggestions from readers and a little digging, I currently have the following list to acquire and test/retest:

Celeron G1101
Celeron G1620
Pentium G2020
Pentium G6950
i3-2100
i5-3570K
i5-4570T
i5-4670K
i3-560
i5-680
i5-760
i5-860
i5-880
i7-920
i7-950
i7-980X
QX9775
Q6600
Xeon E3-1220L v2
Xeon E3-1220v2
Xeon E3-1230v2
Xeon E3-1245v2
Athlon II X2 220
Athlon II X2 250
Athlon II X2 280
Athlon II X3 425
Athlon II X3 460
Sempron 145
Phenom II X3 740
Phenom II X4 820
Phenom II X4 925
Phenom II X6 1045T
FX-4130
FX-4200
FX-4300
FX-4350
FX-6200
FX-6350
A8-5600K + Core Parking retest
A10-5800K + Core Parking retest

As you can imagine, that is quite a list, and I will be breaking it down into sections and updates for everyone.

But for now, onto our recommendations.

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or its non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single threaded performance.  The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD 7970 and a GTX 580, and feels the same in the OS as an equivalent Intel CPU.  The A8-5600K will also overclock a little, giving a boost, and comes in at a stout $110, meaning that some of those dollars can go towards a beefier GPU or an SSD.  The only downside is if you are planning some heavy OS work – if the software is Piledriver-aware, all is well, but most software is not, in which case an i3-3225 or FX-8350 might be worth a look.

It is possible to consider the non-IGP alternatives to the A8-5600K, such as the FX-4xxx variants or the Athlon X4 750K BE.  But as we have not had these chips in to test, it would be unwise to suggest them without data to back them up.  Watch this space, as we have processors on the list to test.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously has some issues.  Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel with AMD cards, you cannot recommend them for team Green.  There is also the dilemma that while in certain games you can be fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming lands at the feet of the i5-2500K.  This recommendation may seem odd – these chips are not the latest from Intel, but chances are that pre-owned they will hit a nice price point, especially if/when people move over to Haswell.  If you were buying new, the obvious answer would be the i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V.  If this is not a game you are aiming for and you want to invest in AMD, then the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-4770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p.  It might be a sad thing to hear, but the only CPUs in our testing that provide the top frame rates at this level are the top line Ivy Bridge and Haswell models.  For a comparison point, the Sandy Bridge-E six-core results were often very similar, but the price jump to such a setup is prohibitive to all but the sturdiest of wallets.  Of course we would suggest Haswell over Ivy Bridge, Haswell being the newer platform, but users who can get hold of an i7-3770K in a sale would reap the benefits.

As noted in the introduction, using 3-way SLI with Ivy Bridge/Haswell will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card.  This also raises the bar in terms of price, as PLX motherboards start around the $280 mark.  For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations.  However, investing in a PLX board would help with moving to a 4-way setup, should that be your intended goal.  In either scenario, the i7-3770K or i7-4770K are the processors of choice from our testing suite.
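The arithmetic behind the PLX requirement is simple enough to sketch.  This is a back-of-the-envelope illustration, not a validator for any real board; the 16-lane default reflects the CPU lanes on mainstream Ivy Bridge/Haswell, and 40 lanes reflects Sandy Bridge-E:

```python
# Why 3-way SLI needs a PLX switch on mainstream Intel: the CPU exposes
# 16 PCIe lanes, while SLI insists on at least x8 per card.  Three cards
# therefore need 24 lanes, more than the CPU alone can provide.
def needs_plx(num_cards: int, cpu_lanes: int = 16, min_per_card: int = 8) -> bool:
    """Return True if the cards' minimum lane demand exceeds the CPU lanes."""
    return num_cards * min_per_card > cpu_lanes
```

Two cards at x8/x8 fit exactly into 16 lanes; a third pushes the total to 24 and forces either a PLX switch or a 40-lane Sandy Bridge-E platform.  AMD's CrossFire tolerates x4 slots, which is why an x8/x4/x4 board gets away without one.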

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

So our recommendation for four-way, based on results, would nominally be an i7-3770K.  We cannot recommend the 4770K as of yet, as we have no data to back it up!  That data will be coming in the next update, and if predictions are anything to go by, the 4770K should end up the preferential chip based on its single thread speed and newer platform.

But even still, a four-way GPU configuration is for those insane few users who have both the money and the physical requirement for pixel power.  We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit at most resolutions.  Despite this, even at 1440p we see awesome scaling in games like Sleeping Dogs (the fourth card adds 73% of a single card's performance) and more recently I have seen four-way GTX 680s give BF3 at Ultra settings a healthy 35 FPS minimum on a 4K monitor.  So while four-way setups are insane, there is clearly a usage scenario where it matters to have card number four.
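For clarity on how a figure like "+73% of a single card" is computed, here is a small sketch.  The FPS numbers below are invented for illustration, not taken from our results:

```python
# How a marginal scaling figure is derived: the gain from adding one more
# card, expressed as a fraction of a single card's performance.
def marginal_scaling(fps_one_card: float, fps_before: float, fps_after: float) -> float:
    """Extra-card gain relative to one card's FPS (1.0 = perfect scaling)."""
    return (fps_after - fps_before) / fps_one_card


# Hypothetical numbers: one card at 50 FPS, three cards at 130 FPS,
# four cards at 166.5 FPS -> the fourth card added 73% of a single card.
print(marginal_scaling(50.0, 130.0, 166.5))  # 0.73
```

A value near 1.0 means the extra card pulled its full weight; anything well below that is the law of diminishing returns in action.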

Our testing was pretty clear as to which CPUs are needed at 1440p with fairly powerful GPUs.  While the i7-2600K was nearly there in all our benchmarks, only two sets of CPUs secured the highest frame rates – the i7-3770K/4770K and any six-core Sandy Bridge-E.  As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration.  Of course that CPU will have to be paired with a PLX enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I do not doubt that is the case.  People building three and four way GPU monsters are more than likely to run extra cooling and overclock.  Unfortunately that adds plenty of variables and extra testing which will have to be made at a later date.  For now our recommendation at stock, for 4-way at 1440p, is an i7-3770K CPU.

What We Have Not Tested

In the intro to this update, I addressed a couple of points regarding testing at 1440p over 1080p, as well as reasons for not using FCAT or reporting minimum FPS.  But one of the bigger issues brought up in the first Gaming CPU article comes from the multiplayer gaming perspective, when dealing with a 64-player map in BF3.  This is going to be a CPU intensive situation for sure, with the CPU handling network updates alongside the game processing that feeds the GPU.  The only issue from our side is repeatable testing.  I focused a lot on the statistics of reporting benchmark results, and getting a consistent multiplayer environment for game testing that can be viewed objectively is, for all intents and purposes, practically impossible.  Sure, I could play a few rounds in every configuration, but FPS numbers would be all over the place based on how the rounds went.  I would not be happy publishing such data and then basing recommendations on it.
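To put a number on that variance problem, consider the confidence interval around a mean FPS figure.  The sketch below uses invented numbers and a simple normal approximation, purely to show why a scripted benchmark supports a recommendation and a handful of multiplayer rounds does not:

```python
# Illustration of run-to-run variance (all FPS numbers are made up):
# a scripted benchmark yields tight runs, a 64-player round does not,
# and the 95% confidence interval around the mean makes that visible.
from math import sqrt
from statistics import mean, stdev


def ci95(runs: list) -> tuple:
    """Mean and approximate 95% confidence half-width (normal approximation)."""
    m = mean(runs)
    half = 1.96 * stdev(runs) / sqrt(len(runs))
    return m, half


scripted = [100.2, 99.8, 100.1, 99.9, 100.0]    # canned benchmark runs
multiplayer = [82.0, 115.0, 96.0, 74.0, 108.0]  # how live rounds can swing

for label, runs in (("scripted", scripted), ("multiplayer", multiplayer)):
    m, half = ci95(runs)
    print(f"{label}: {m:.1f} +/- {half:.1f} FPS")
```

When the half-width is a fraction of a frame, two CPUs a few FPS apart can be ranked honestly; when it spans tens of FPS, any ranking is noise.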

The purpose of the data in this article is to help buying decisions based on the games at hand.  If you play more strenuous games, it is clear that riding the cusp of a CPU performance boundary might not be the best route, especially when modifications come into play that drag frame rates right down or demand more complex calculations.  In that situation it makes sense to play it safe with a more powerful processor, and as such our recommendations may not necessarily apply.  The recommendations aim to balance performance, price, and the state of affairs tested in this article at the present time; if a user knows that future titles are going to be demanding and they need a system for the next 3-5 years, some future proofing will have to form part of the personal decision when it comes down to paying for hardware.

When friends or family come up to me and say 'I want to play X and have Y to spend' (not an uncommon occurrence), I try to match what they want with their budget – gaming typically gets a big GPU first and then a processor to match, depending on what sort of games they play.  With more CPUs under our belt here at AnandTech, and an added element of understanding of where the data comes from and how it was obtained, we hope to help make such decisions.

As always, we are open to suggestions!  I have had requests for Bioshock Infinite and Tomb Raider to be included – unfortunately each new driver update is still increasing performance for these titles, meaning that our numbers would not be relevant next quarter without a full retest.  I will hopefully put them in the testing with the next driver update.


  • yougotkicked - Tuesday, June 4, 2013 - link

    this sounds quite interesting, though I wonder if the AI is runtime bound rather than solution bound, as this could make the testing somewhat nondeterministic.

    To clarify what I mean; a common method in AI programming is to let algorithms continue searching for better and better solution, interrupting the algorithm when a time limit has passed and taking the best solution it has found so far. Such approaches can result in inconsistent gameplay when pitting multiple AI units against each other, which may change the game state too much between trials to serve as a good testing platform.

    Even if the AI does use this approach it may not bias the results enough to matter, so I guess the only way to be sure is to run the tests a few times and see how consistent the results are on a single test system.
  • Zoeff - Tuesday, June 4, 2013 - link

    Forget about SupCom2 - That game has been scaled down quite a bit compared to SupCom1 and isn't as demanding on CPUs. There's also an active SupCom1 community that has pushed out, and still is pushing out, community made patches. :-)

    SupCom actually has a built-in benchmark that plays a scripted map with some fancy camera work. Anyone can launch this by adding "/map perftest" to your shortcut. That said, it doesn't seem to be working properly anymore after several patches, nor does it actually give any useful data, as the sim score is capped at 10k for today's CPUs. And yet it's extremely easy to cripple any CPU you throw at it when simply playing the game. Just open up an 81x81km map with 7 AI enemies and watch your computer slow to a crawl as the map starts filling up.

    And yes, the AI is "solution bound". Replays of recorded games with AI in them wouldn't work otherwise.

    I wonder if somebody could create a custom SupCom1 benchmark... *Hint Hint*
  • FBB - Tuesday, June 4, 2013 - link

    They've had over 5 million concurrent online users. The total number will be much higher.
  • DanNeely - Tuesday, June 4, 2013 - link

    What exactly does Steam count as online? Does just having the client sit in my tray count; or do I need to be playing a steam game at the time to be counted?
  • wicko - Tuesday, June 4, 2013 - link

    Definitely just signed in: 823,220 Players In-Game | 4,309,324 Players Online
    Source: http://steamcommunity.com/
  • chizow - Tuesday, June 4, 2013 - link

    Thanks for the tests, there's a lot of data points in there so that's always appreciated.

    I would've liked to have seen some higher perf Nvidia solutions in there though, at the very least some Kepler parts. It looks like a lot of the higher end Intel parts hit a GPU bottleneck at the top, which is not unexpected at 1440p with last-gen Fermi parts.

    What it does show for sure is, you may give pause to going beyond 2-way CF/SLI if you have to go lower than x8 on that 3rd slot. Which means you will probably have to shell out for one of the pricier boards. Hard not to recommend X79 at this point for 3-way or higher, although the lack of official PCIe 3.0 support was a red flag for me.

    I went with the Gigabyte Z87X-UD4 because I don't ever intend to go beyond 2-way SLI and the 3rd slot being x4 (2.0) was better than the x8/x4/x4 (3.0) config on most boards, which gives me the option to run a PhysX card and retain x8/x8 (3.0) for my two main cards.
  • Gunbuster - Tuesday, June 4, 2013 - link

    So I'll stick with my 2600K @4.5ghz and continue to ponder what new Korean 27" LCD to get. Tech is pretty boring at the moment.
  • wicko - Tuesday, June 4, 2013 - link

    I haven't bothered overclocking my 2600K and I still feel it's plenty powerful. I think I may get a second GTX 670 though, Metro Last Light doesn't run all that great at 2560x1440.
  • kallogan - Tuesday, June 4, 2013 - link

    Haswell, haswell, haswell. Making one paper per day about it will not make it better. Boring cpu gen. Wake me up when something interesting shows up.
  • chizow - Tuesday, June 4, 2013 - link

    So I guess the solution is to just ignore the launch to placate all those who have no interest in the launch, rather than post reviews and info about it for the ones that actually do? Doesn't make a lot of sense.

    If it doesn't interest you, move along.
