Haswell Update:

Because we have only managed to get hold of the top Haswell processor thus far, it is a little difficult to see exactly where Haswell lies.  On the face of it, Haswell is more than adequate in our testing scenario for a single GPU experience and will perform as well as a mid-range CPU.  It is when you start moving up to more GPUs, more demanding games and higher resolutions that the big boys start to take control.

On almost all fronts, the i7-4770K is the preferred chip over anything Sandy Bridge-E: if not by virtue of its single-threaded speed, then by virtue of the price difference.  Sandy Bridge-E is still there if you need the raw CPU horsepower for other things.

Our analysis also shows that without the proper configuration in the BIOS, having a GPU at PCIe 2.0 x1 is really bad for scaling.  On the ASUS Z87 Pro, the third full-length PCIe slot is at x1 bandwidth, as it shares the four PCIe lanes from the chipset with other controllers on board – if it is moved up to PCIe 2.0 x4, then the other controllers are disabled.  Nonetheless, scaling at either PCIe 2.0 x1 or x4 cannot compete with a proper PCIe 3.0 x8/x4/x4 setup.
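To put those bandwidth tiers in perspective, here is a quick back-of-the-envelope sketch of the theoretical per-direction bandwidth of each slot configuration mentioned above.  These are nominal figures after encoding overhead; real-world throughput is lower still:

```python
# Approximate usable bandwidth per lane, in GB/s, after encoding overhead.
# PCIe 2.0: 5 GT/s with 8b/10b encoding  -> ~0.5 GB/s per lane per direction.
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane per direction.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 8 * (128 / 130) / 8}

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth for a slot, in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# The ASUS Z87 Pro's third full-length slot sits at PCIe 2.0 x1 by default:
print(f"2.0 x1: {slot_bandwidth('2.0', 1):.2f} GB/s")   # 0.50 GB/s
print(f"2.0 x4: {slot_bandwidth('2.0', 4):.2f} GB/s")   # 2.00 GB/s
print(f"3.0 x4: {slot_bandwidth('3.0', 4):.2f} GB/s")   # 3.94 GB/s
print(f"3.0 x8: {slot_bandwidth('3.0', 8):.2f} GB/s")   # 7.88 GB/s
```

A card in a 2.0 x1 slot has roughly a sixteenth of the bandwidth of a 3.0 x8 slot, which goes some way to explaining the poor scaling we saw.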

Over the course of Haswell's lifetime, we will update the results as we get hold of PLX-enabled motherboards for some of those x8/x8/x8/x8 layouts, not to mention the odd-looking PCIe 3.0 x8/x4/x4 + PCIe 2.0 x4 layouts seen on a couple of motherboards in our Z87 motherboard preview.

As mentioned in our last Gaming CPU testing, the results show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith.  Performing a comparative test while misleading the audience, or without understanding how things work underneath, is a poor game to play.  Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs best.  Civilization V was the sole exception; it also has issues scaling when you add more GPUs if you do not have one of the most expensive CPUs on the market.  For Civilization V, I would suggest keeping to a single GPU and trying to get the best out of it.

In Dirt 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup.  Moving up in GPU count, Dirt 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to lean towards AMD GPUs, and Sleeping Dogs was agnostic.

Above three GPUs, the extra horsepower from the single-threaded performance of an Intel CPU started to pay off, with as much as a 70 FPS difference in Dirt 3.  Sleeping Dogs was also starting to become sensitive to CPU choice.

We Know What Is Missing

As it has only been a month or so since the last Gaming CPU update, and with my hands deep in Haswell testing, new CPUs have not been streaming through the mail.  However, thanks to suggestions from readers and a little digging, I currently have the following list to acquire and test/retest:

Celeron G1101
Celeron G1620
Pentium G2020
Pentium G6950
i3-2100
i5-3570K
i5-4570T
i5-4670K
i3-560
i5-680
i5-760
i5-860
i5-880
i7-920
i7-950
i7-980X
QX9775
Q6600
Xeon E3-1220L v2
Xeon E3-1220v2
Xeon E3-1230v2
Xeon E3-1245v2
Athlon II X2 220
Athlon II X2 250
Athlon II X2 280
Athlon II X3 425
Athlon II X3 460
Sempron 145
Phenom II X3 740
Phenom II X4 820
Phenom II X4 925
Phenom II X6 1045T
FX-4130
FX-4200
FX-4300
FX-4350
FX-6200
FX-6350
A8-5600K + Core Parking retest
A10-5800K + Core Parking retest

As you can imagine, that is quite a list, and I will be breaking it down into sections and updates for everyone.

But for now, onto our recommendations.

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price-competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single-threaded performance.  The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD 7970 and a GTX 580, and feels the same in the OS as an equivalent Intel CPU.  The A8-5600K will also overclock a little, giving a boost, and comes in at a modest $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD.  The only downside is if you are planning some heavy OS work: if the software is Piledriver-aware, all is well, although most software is not, and in that case an i3-3225 or FX-8350 might be worth a look.

It is possible to consider the non-IGP alternatives to the A8-5600K, such as the FX-4xxx variants or the Athlon X4 750K BE.  But as we have not had these chips in to test, it would be unethical to suggest them without data to back them up.  Watch this space; we have processors in the list to test.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously raises some issues.  Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel with AMD, you cannot recommend them for team Green.  There is also the dilemma that while in certain games you can be fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming comes in at the feet of the i5-2500K.  This recommendation may seem odd – these chips are not the latest from Intel, but chances are that pre-owned they will be hitting a nice price point, especially if/when people move over to Haswell.  If you were buying new, the obvious answer would be looking at an i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V.  If that is not a game you are aiming for and you want to invest in AMD, then the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-4770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p.  It might be a sad thing to hear, but the only CPUs in our testing that provide the top frame rates at this level are the top-line Ivy Bridge and Haswell models.  As a comparison point, the Sandy Bridge-E six-core results were often very similar, but the price jump to such a setup is prohibitive to all but the sturdiest of wallets.  Of course we would suggest Haswell over Ivy Bridge on the basis that Haswell is the newer platform, but users who can get hold of the i7-3770K in a sale would reap the benefits.

As noted in the introduction, using 3-way on NVIDIA with Ivy Bridge/Haswell will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card.  This also raises the bar in terms of price, as PLX motherboards start around the $280 mark.  For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations.  However, investing in a PLX board would help when moving to a 4-way setup, should that be your intended goal.  In either scenario, the i7-3770K or i7-4770K are the processors of choice from our testing suite.
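The lane arithmetic behind that PLX requirement is simple enough to sketch.  The x8-per-card minimum is NVIDIA's SLI certification rule, and a PLX switch chip multiplexes the CPU's 16 lanes so each slot can present x8 or more:

```python
def sli_needs_plx(cpu_lanes: int, num_gpus: int, sli_min: int = 8) -> bool:
    """True if the CPU cannot hand every GPU the SLI minimum of x8 lanes,
    meaning a PLX-style lane switch is required for an NVIDIA setup."""
    return cpu_lanes < num_gpus * sli_min

print(sli_needs_plx(16, 2))  # False: Haswell's 16 lanes split x8/x8
print(sli_needs_plx(16, 3))  # True: x8/x4/x4 breaks the x8 minimum, hence PLX
print(sli_needs_plx(40, 3))  # False: Sandy Bridge-E's 40 lanes cover x8 each
```

AMD has no such per-card minimum, which is why an x8/x4/x4 board is a viable 3-way CrossFire option at a lower price.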

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

So our recommendation in four-way, based on results, would nominally be the i7-3770K.  We cannot recommend the 4770K as of yet, as we have no data to back it up!  That data will be coming in the next update, but if a prediction were to be made, the 4770K would be the preferred chip based on its single-threaded speed and newer platform.

But even still, a four-way GPU configuration is for those insane few users that have both the money and the physical need for pixel power.  We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit for most resolutions.  Despite this, even at 1440p, we see awesome scaling in games like Sleeping Dogs (+73% of a single card's performance when moving from three to four cards), and more recently I have seen four-way GTX 680s give BF3 at Ultra settings a healthy 35 FPS minimum on a 4K monitor.  So while four-way setups are insane, there is clearly a usage scenario where it matters to have card number four.
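That Sleeping Dogs figure is worth unpacking: we express the gain from an extra card as a fraction of what a single card delivers.  A minimal sketch, with illustrative FPS numbers rather than our measured results:

```python
def scaling_gain(fps_before: float, fps_after: float, fps_single: float) -> float:
    """Gain from adding one more card, as a fraction of single-card performance."""
    return (fps_after - fps_before) / fps_single

# Illustrative numbers: one card at 40 FPS, three cards at 100, four at 129.2.
# The fourth card adds 29.2 FPS, i.e. 73% of what a single card delivers,
# which is remarkable scaling that far up the chain.
print(f"{scaling_gain(100, 129.2, 40):.0%}")
```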

Our testing was pretty clear as to what CPUs are needed at 1440p with fairly powerful GPUs.  While the i7-2600K was nearly there in all our benchmarks, only two sets of CPUs made sure of the highest frame rates – the i7-3770K/4770K and any six-core Sandy Bridge-E.  As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration.  Of course that CPU will have to be paired with a PLX enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I do not doubt that is the case.  People building three and four way GPU monsters are more than likely to run extra cooling and overclock.  Unfortunately that adds plenty of variables and extra testing which will have to be made at a later date.  For now our recommendation at stock, for 4-way at 1440p, is an i7-3770K CPU.

What We Have Not Tested

In the intro to this update, I addressed a couple of points regarding testing at 1440p over 1080p, as well as the reasons for not using FCAT or reporting minimum FPS.  But one of the bigger issues brought up in the first Gaming CPU article comes from the multiplayer gaming perspective, when dealing with a 64-player map in BF3.  This is going to be a CPU-intensive situation for sure, dealing with the network interface to update the GPU and processing.  The only issue from our side is repeatability.  I focused a lot on the statistics of reporting benchmark results, and getting a consistent multiplayer environment for game testing that can be viewed objectively is, for all intents and purposes, impossible.  Sure, I could play a few rounds in every configuration, but the FPS numbers would be all over the place depending on how the rounds went.  I would not be happy publishing such data and then basing recommendations on it.
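To illustrate why multiplayer numbers would be unusable, compare the run-to-run spread of a scripted benchmark with that of a live round using the coefficient of variation.  The FPS samples below are invented for illustration:

```python
from statistics import mean, stdev

def cv_percent(fps_runs: list) -> float:
    """Coefficient of variation (sample stdev / mean) of repeated FPS runs, in percent."""
    return 100 * stdev(fps_runs) / mean(fps_runs)

# Hypothetical runs: a scripted single-player benchmark vs. a 64-player BF3 round.
scripted = [61.2, 60.8, 61.5, 60.9, 61.1]
multiplayer = [48.0, 71.5, 55.2, 63.8, 42.1]

print(f"scripted CV:    {cv_percent(scripted):.1f}%")     # well under 1%
print(f"multiplayer CV: {cv_percent(multiplayer):.1f}%")  # on the order of 20%
```

When the run-to-run noise is an order of magnitude larger than the differences between the CPUs being compared, no sensible recommendation can be drawn from the data.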

The purpose of the data in this article is to help buying decisions based on the games at hand.  For a reader who plays more strenuous games, it is clear that riding the cusp of a CPU performance boundary might not be the best route, especially when modifications come into play that drag frame rates right down or demand more complex calculations.  In that situation it makes sense to play it safe with a more powerful processor, and our recommendations may not necessarily apply.  These recommendations aim to strike a balance between performance, price, and the state of affairs tested in this article at the present time.  If a user knows that future titles are going to be demanding and they need a system for the next 3-5 years, some future-proofing is going to have to form part of the personal decision when it comes down to paying for hardware.

When friends or family come up to me and say 'I want to play X and have Y to spend' (not an uncommon occurrence), I try to match what they want with their budget: gaming typically gets a big GPU to begin with, and then a processor to match depending on what sort of games they play.  With more CPUs under our belt here at AnandTech, and an added element of understanding of where the data comes from and how it was obtained, we hope to help make such decisions.

As always, we are open to suggestions!  I have had requests for Bioshock Infinite and Tomb Raider to be included – unfortunately each new driver update is still increasing performance for these titles, meaning that our numbers would not be relevant next quarter without a full retest.  I will hopefully put them in the testing with the next driver update.


  • Dentons - Tuesday, June 4, 2013 - link

    His complaint is on the mark. Haswell is about mobile, not desktop, not gaming.

    Ivy Bridge was about cost reduction, Haswell is about reducing TDP. It is shocking that a mid-range 2+ year old Sandy Bridge desktop part is still so very competitive, even though it's been superseded by two whole generations.

    Intel deserves all this criticism and more. They've clearly put the interests of desktop users and gamers far onto the back burner. They're now focused almost entirely on mobile and are treading water with everything else.
  • takeship - Tuesday, June 4, 2013 - link

    Eh, how can you blame them? The pure-play desktop market has been shrinking for a while now, with high-performance desktop (basically gamers) even more of a niche. Maybe if they had some real competition from AMD in single-threaded perf... A lot of this is just Amdahl's law at its natural conclusion. The easy performance gains are mostly gone, so if you're Intel do you dump endless money into another 25-30% per generation, or go after the areas that haven't been well optimized yet instead? Not a hard choice to make, especially considering the market's move towards mobile & cool computing in the last decade.
  • Silma - Wednesday, June 5, 2013 - link

    Intel doesn't deserve criticism. Haswell is a small improvement over Ivy Bridge because it has become extremely difficult to optimize an already excellent processor. Do you see anything better from AMD, ARM, Oracle or others?

    Is there a need to upgrade from Ivy to Haswell? No. Was it necessary to upgrade from Nehalem to Sandy Bridge? No. The fact is that for most applications processors have been good enough for years, and money is better spent on SSDs, GPUs and whatnot.

    The real conclusion of this article should be that processors hardly matter for gaming and that the money is better spent on a speedier GPU. Processors may become relevant for the very, very few people that run extreme 2x/3x card setups. Even a setup with two mid-range cards such as the GTX 560 is not CPU dependent. I would welcome actual statistics on the number of players with 2x/3x high-end GPUs. I'm quite sure the count is ultra tiny, and for those people willing and able to spend thousands of dollars, do you think $100 less on a processor is relevant?
  • chizow - Wednesday, June 5, 2013 - link

    I don't have a problem with the conclusion he comes to, complaining about dissemination of information to come to that conclusion is what makes no sense. Put all the information out there, 1, 2, 3 articles a day np, then make your own informed decision on the platform. Bemoan the fact there is actual coverage a day or two after launch and one or two reviews? Makes no sense.
  • Memristor - Tuesday, June 4, 2013 - link

    Too bad that Richland, which is available as of today, didn't make it into this review. Other than that great read.
  • eddieobscurant - Tuesday, June 4, 2013 - link

    Many of us have a Q6600 @ 3600MHz, and personally I'm very happy with this and my 7870. I would still like to see a comparison of my CPU @ 3600MHz with the modern CPUs, because I don't think there is a huge difference in games.
  • chizow - Tuesday, June 4, 2013 - link

    It depends what you play; any game that is CPU limited is going to show a HUGE difference with that CPU. I had the same chip at 3.6GHz, which was great btw, and even when I upgraded to a 920 @ 4GHz there was a huge improvement in some games, most notably GTA4 at the time. Some other games that scale extremely well with CPU are WoW, Diablo 3, etc., just to name a few.
  • medi02 - Wednesday, June 5, 2013 - link

    Nah. Most of the tests show that to get CPU limited you need a multi-GPU setup.
    An i7 and Intel mobo will cost you about $500 with marginal improvements.
  • chizow - Wednesday, June 5, 2013 - link

    Sorry, just not true. Even with just 1x680 WoW and other similarly CPU dependent games scale tremendously well with faster CPUs:

    http://www.tomshardware.com/reviews/fx-8350-visher...

    Q6600 @ 3.6 is probably just a tad faster than the Phenom IIs in that test.
  • TheJian - Thursday, June 6, 2013 - link

    See my comments here...Chizow is correct, and even understating it some. There are a LOT of games cpu limited as I showed in my links. Huge differences in cpu perf from A10-5800 up to 4770k, never mind the junk Ian recommends here A8-5600 for single gpu. It just isn't correct to recommend that cpu or even A10-5800K which I showed getting smacked around in many games at 1080p. Articles like this make people think games are not cpu bound (it's far more games than Civ5). Neverwinter, Metro Last light, tomb raider, Farcry3, Crysis 3 etc etc...Once 20nm comes we may find even 1440p showing just as many limited by cpu. If rumors are true Volcanic doubles stream processors. I'm sure NV will match that. You end up gpu bound when you up the res to 1440 on single cards now, but that won't be forever and 98.75% of us according to steam don't play at 1440p (.87%) or above (1.25% total of all res above 1920x1200).

    Check the 1080p data on my links (techreport was a good one as they show 1080p in most of the listed games). Toms shows neverwinter as I noted needing a very high cpu also. Hit all comments on this article, and Ctrl-F my name. Ignore my post comments and just click the links in them to prove Chizow's point (and my own). CPU is important at 1080p and 1920x1200 NOW and will be important at higher res with the next gen cards at 20nm. You will never get out of your AMD mistake if you take this article's suggestions. Well at least not without changing to an Intel board/chip...LOL. Who wants to do that? Just buy an Intel unless you're broke. Don't trust me though, read the links provided and judge for yourself how accurate anandtech is here.

    I showed some games that are nearly DOUBLE on Intel vs. A10-5800K! You don't have to like the way I made my point or believe me, just check the links :) They all say the same thing. CPU is an issue just as Chizow shows in his link. You can find this in many cpu articles where they use a top gpu (usually 7970/680) and test new cpus with the oldies in there too which show large separations. Check i7-3770k or fx 8350 articles (just google those two cpu models and "review" for ample sites showing the spreak)...1080p separates the men from the boys in cpu's.

    After you check the links (and chizow's), come back and agree Anandtech needs to change their ways, or tear my comments apart if I'm lying :) Future gpu's will only make our point stick out even more. CPU matters. Also note a lot of the games that are gpu limited on single cards are NOT playable anyway (check sleeping dogs right here in this article 1440p...7970 at 28fps avg is NOT playable, mins will dip to 20's or below). So you're forced back into cpu limited in a lot of cases at 1080p. Where 98.75% of us play you see cpu limits a lot.

    Go back one page on Chizow's link to Skyrim's benchmark in the same article for the same data. 1080p 3770 scores 88.2 to 8350's 67.4 (that's a lot and a huge hint to how your future on AMD will look)
    http://www.tomshardware.com/reviews/fx-8350-visher...
    That's a 30% difference and an 8350FX is far faster than an A8-5600 Ian recommends here. Chizow is even more right if you toss in Ian's recommendation of an even slower cpu than 8350 vs. Intel's stuff. Even in skyrim at 1680x1050 they separate from 90fps to 68fps for 8350fx. So until you completely tap out your gpu (1440p and up which basically requires 2+ cards) you will notice if your cpu is junk or not. Since this article is only written for apparently 1.25% of the readership (or world for that matter according to steam survey), you will notice the cpu! Unless you're raising your hand as the 1.25% :) I don't call 30-100% faster marginal improvements do you? Add CIV 5 also which this site even proves in this article ;) At least they got something right.
