Conclusions

After testing for this review, one thing is clear in my mind: the performance benefit of a faster CPU paired with a single GPU is hitting a limit. As games get more complex, those designing the graphics and physics engines know that shifting calculations onto the GPU gives a greater boost in performance. If an engine is written to take advantage of the GPU, then the CPU matters far less for the most part. If you can transfer textures over to the GPU and keep them in memory, the work of the CPU is essentially done apart from light maintenance or interfacing with the network.
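
To make that idea concrete, here is a minimal sketch of the pattern using plain OpenGL calls; the function names and structure are illustrative placeholders of my own rather than code from any particular engine. The texture is copied to video memory once, after which the CPU's per-frame contribution is little more than binding resources and issuing draw calls.

```cpp
#include <GL/gl.h>

GLuint texture = 0;

// One-time CPU -> GPU copy at load time: after this call the pixel data
// lives in video memory and the CPU no longer needs to touch it.
void uploadTextureOnce(const unsigned char* pixels, int width, int height) {
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

// Per-frame CPU work is light: bind the already-resident texture and issue
// the draw calls; the heavy per-pixel shading happens entirely on the GPU.
void renderFrame() {
    glBindTexture(GL_TEXTURE_2D, texture);
    // ... glDrawArrays / glDrawElements calls for the scene would go here ...
}
```

In a structure like this the CPU spends its frame time on game logic and housekeeping while the GPU shades every pixel, which is exactly why the GPU tends to become the bottleneck first at 1440p.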

Perhaps a better test would have been with more mid-range GPUs, such as 660 Tis or 7790s; with limited memory on the GPU itself, having a faster CPU and faster DDR3 memory might make a bigger difference. However, the upshot for gamers may be that buying a good GPU means not having to worry too much about a slightly underpowered CPU. Unless you need big CPU performance for other tasks, the big GPU should be the main priority, since at higher GPU counts and resolutions the CPU becomes less of a concern.

There is also the scenario of less powerful GPUs, where the CPU could matter a lot more. With limited GPU memory, the CPU would have to organize more texture copies between main memory and the GPU, causing other aspects of the system to become the limiting factor. This is important to keep in mind when interpreting our results. That said, our testing scenarios show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith. Performing a comparative test while misleading the audience, or without understanding how the hardware works underneath, is a poor game to play. Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs the best. Civilization V was the sole exception; it also has issues scaling when you add more GPUs if you do not have the most expensive CPUs on the market. For Civilization V, I would suggest sticking with a single GPU and trying to get the best out of it.

In DiRT 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup. Moving up in GPU count, DiRT 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to lean towards Intel CPUs, and Sleeping Dogs needed CPU power when scaling up.

Above three GPUs, the extra horsepower from the single-threaded performance of an Intel CPU starts to pay off, with as much as a 70 FPS difference in DiRT 3. Sleeping Dogs also starts to become sensitive to CPU choice.

We Know What Is Missing

On my list of future updates to this article, we need an i5-3570K processor, as well as dual- and tri-module Piledriver parts and an i7-920 for a roundup. I will have a short window soon to rummage through a large storeroom of processors, which will be a prime opportunity to grab some of the harder-to-acquire CPUs. Haswell is just around the corner and, in most of its desktop forms, should provide an interesting update to data points across the spectrum. From now on I will aim to cover all the different PCIe lane allocations in a chipset, as well as some of the odd ones caused by PLX chips.

If you have a specific processor you would like me to test for a future article, please leave a note below in the comments, and we will try to cover it. :) Top of that list is an i5-3570K, followed by Haswell, then some more AMD cores. I have 29 more processors on my 'ideal' list (if I can get them), but if anyone has any suggestions that I may not have thought of, please let me know. If I am able to get hold of Titans, I may be in a position to retest across the board for NVIDIA results, meaning another benchmark or two as well (BioShock Infinite perhaps).

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price-competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single-threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both an HD 7970 and a GTX 580, and feels the same in the OS as an equivalent Intel CPU. The A8-5600K will also overclock a little, giving a boost, and comes in at a wallet-friendly $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD. The only downside is if you are planning some heavy CPU work alongside gaming: if the software is Piledriver-aware all might be well, although most software is not, and perhaps an i3-3225 or FX-8350 might be worth a look.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously has some issues. Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel with AMD cards, you cannot recommend them for Team Green. There is also the dilemma that while in certain games you can be fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming lands at the feet of the i5-2500K. This recommendation may seem odd, as these chips are not the latest from Intel, but chances are that pre-owned they will be hitting a nice price point, especially if/when people move over to Haswell. If you were buying new, the obvious answer would be to look at an i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V. If this is not the game you are aiming for and you want to invest in AMD, then the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-3770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p. It might be a sad thing to hear, but the only CPU in our testing that provides the top frame rates at this level is the top-line Ivy Bridge model. As a comparison point, the six-core Sandy Bridge-E results were often very similar, but the price jump to such a setup is prohibitive to all but the sturdiest of wallets.

As noted in the introduction, using 3-way on NVIDIA with Ivy Bridge will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card. This also raises the bar in terms of price, as PLX motherboards start around the $280 mark. For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations. However, investing in a PLX board would help with moving to a 4-way setup, should that be your intended goal. In either scenario, at stock clocks, the i7-3770K is the processor of choice from our testing suite.

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

A four-way GPU configuration is for those insane few users who have both the money and the need for that much pixel power. We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit at most resolutions. Despite this, even at 1440p, we see awesome scaling in games like Sleeping Dogs (moving from three to four cards adds another 73% of a single card's performance), and more recently I have seen four-way GTX 680s give BF3 at Ultra settings a healthy 35 FPS minimum on a 4K monitor. So while four-way setups are insane, there is clearly a usage scenario where it matters to have card number four.
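
For clarity, here is a tiny sketch of how a scaling figure like that is derived; the frame rates below are invented purely to illustrate the arithmetic and are not our measured numbers.

```cpp
#include <cstdio>

int main() {
    // Hypothetical frame rates, for illustration only (not measured data).
    double oneCard    = 41.0;   // single-GPU FPS
    double threeCards = 100.0;  // three-GPU FPS
    double fourCards  = 130.0;  // four-GPU FPS

    // Gain from adding the fourth card, expressed as a fraction of one card.
    double gain = (fourCards - threeCards) / oneCard;
    std::printf("The fourth card adds %.0f%% of a single card\n", gain * 100.0);
    return 0;
}
```

With these made-up numbers the fourth card adds roughly 73% of a single card's performance, which is the same way the Sleeping Dogs figure above should be read.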

Our testing was pretty clear as to which CPUs are needed at 1440p with fairly powerful GPUs. While the i7-2600K was nearly there in all our benchmarks, only two sets of CPUs consistently delivered the highest frame rates: the i7-3770K and any six-core Sandy Bridge-E. As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration. Of course, that i7-3770K will have to be paired with a PLX-enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I do not doubt that is the case. People building three- and four-way GPU monsters are more than likely to run extra cooling and overclock. Unfortunately that adds plenty of variables and extra testing, which will have to be done at a later date. For now our recommendation at stock, for 4-way at 1440p, is the i7-3770K.

What to Take Away From Our Testing

Ultimately the test space for this sort of thing is huge. The minute you deal with multiple GPUs in a system, different GPU models, different resolutions and different quality settings, and then extrapolate those across the normal array of benchmarks we apply to a GPU test, we might as well spend a month just looking at a single CPU platform!

We know the testing done here today looks at a niche scenario: 1440p at max settings using very powerful GPUs. The trend in gaming, as I see it, will be towards higher resolution panels, and with Korean 27" monitors coming onto the market, moving to one is a straightforward way to improve your gaming experience if that sort of monitor suits you. 4K is on the horizon, which means either more pixel-pushing power or lower resolutions/settings if you want the quality. I like testing at 1440p/max settings because it pushes the GPU and hopefully the rest of the system; at 2560x1440 there are roughly 3.7 million pixels to shade every frame, about 78% more than at 1080p. If you are a gamer, you want the best experience, and finding the hardware to deliver it is one of the most important parts of that process (after getting good at the game you want to play).

These results are offered to aid a purchasing decision, based on our small sample size. No sample size is ever going to be big enough (unless you are able to test in Narnia), but we hope to expand on this in the future. Consider the data and read our conclusions; you may come to a different interpretation. Let us know what you think!

Comments

  • TheJian - Thursday, May 16, 2013 - link

    Am I supposed to not respond now? You just said I have no manners, am uncivilized, have no objectivity, and previously I’m offensive and it’s ok to HATE me…ROFL. POT – MEET KETTLE. If you were to take your own advice, shouldn’t you have just said “you could word it differently but I agree with the data” and left it at that? No, you took it much further with what amounts to an ad hominem attack on ME. You posted 333 words yourself to do it. :) But thanks for recognizing the work I put in :) I can type 60+ wpm though, so not that much effort really, and two to three times that with Dragon NaturallySpeaking Premium easily (pick up a copy if you can't keep up - 1600 words in about 9 minutes...ROFL, v12.5 rocks). The homework takes time, but that was already done before they wrote this article, as I read everything I can find on stocks I track and parts I'm interested in.

    I've watched this site (and toms) since they were born. 1997 I think here. I did leave toms when Tom Pabst himself forced out Van Smith over the sysmark crap years ago (and removed his name from ALL of his articles he wrote there, putting "tomshardware staff" or some such in Van's name's place). That was AWFUL to watch and I loved reading Tom Pabst's stuff for years. Millions of people were snowed there while they made AMD look like crap in articles with sysmark flagging Intel chips and turning off SSE on AMD. Eventually people like Van, I and others said enough that people took notice and it devalued his site before he sold it. Rightfully so if you ask me, as he was basically an Intel shill at that point as many had pointed out by then.

    At some point somebody has to stand up and tell the truth like Van tried to do. It cost him his job, but the message made it through. Someone has to be willing to “take the hate” for other people's benefit. :) Or nothing will ever get fixed right? People reviewing stuff for millions need some kind of checks and balances right? There are NONE right now in our govt and look what’s happening there as they spend us into bankruptcy amid scandal after scandal kicking our financial future down the road time and again. If we had checks and balances for REAL our president would be in jail along with many dirty congress members on both sides (he just got caught wiretapping the AP – freedom of speech is being trampled, gun rights assaulted, our constitution is attacked at every turn!). People are DEAD possibly because this guy did NOTHING to save them in Benghazi for 7 hours under attack. What happened in Boston? Etc…I'm seeing the same stuff happen here that happened at Tomshardware. Someone has to correct them instead of congratulating them right? Otherwise so many people will make the wrong purchasing decisions based on bad advice from influential and supposedly trusted people (I still like this site, just want back to the neutral stance it used to have for years). In this economy I'd be thanking anyone who takes the time and effort to attempt to save me from buying a piece of junk with my hard earned money. In a nutshell this is why I take the time to show another side for people to consider. They don’t have to believe me, that’s the point of the links, quotes from those links etc. I WANT you to look at the data and make up your own minds. Either it costs this site tons of hits eventually and wakes them up or they need to be put out of business. If nobody ever complained about Win8 how long would we get crap like that? Look how fast it got an 8.1 version as a response and the product manager fired. Put their feet to the fire or they don’t stop ever.

    Anand would have to be seeing his site's traffic go down.
    http://www.alexa.com/siteinfo/anandtech.com#
    If someone takes the time to prove you’re putting up bad data article after article and there is no defense put up (because there isn’t a defense) you are eventually taken down. Jared attacked me in Aug 2012. Pity you can’t go back a year but you can see this site is sliding at least at alexa for the last 6 months. Until they quit yanking our chains I’ll keep yanking theirs if my time allows! Toms went from 10mil to 2mil in just a couple years. I’m not sure what he sold for but it was far less than he’d have gotten before attacking Van, the article shenanigans etc.

    Tell me, what parts of my comments were UNCIVILIZED or RUDE? Did I call anyone a name? Say they are stupid? Did I attack ANYONE personally? Did I do what you did? Actually I did quite the opposite. I said they are NOT ignorant and know exactly what they're doing here (hmm, insinuated intelligence…That’s a good comment right?). I even let Ian off multiple times (he's just doing what he's told no doubt) and noted from the get go he did a lot of work, but due to "someone" pushing bad data to hide AMD's faults it's all wasted. I attacked the crap this site is pushing (crap too harsh for you?), not any of the people themselves (who I'm sure are probably nice guys - well, I can't say that about them all, Jarred attacked ME not the data when I buried Ryan's conclusions and benchmarks). Did I swear at someone? Did I spew hate like the guy who gave a one liner to me? He's claiming its ok to HATE me? When did I ever cross a line like that? Is a debate of the facts worthy of HATE today?

    If you hate the length of my post don't read it. Take your own advice, move along please. Was it necessary for you to post 1000 words back? :) I'd say even the HATERS took me seriously (the only ones that responded besides Tential – what 2 total plus a polite tential?) and saw the arguments were valid and listened. ALL of them did in their own way. Only the first below wasn’t rude as you say and just discussed what I was saying- tential - no flare up from him, just good old fashioned debate:
    "I don't agree with your analysis on consoles but everything else sure. Gaming for 98% of people is 1080p."

    Tential clearly got the message despite our console differences (they weren’t the point really). I’m sure tons of others did even if they’re silent about it. I used to be SILENT. You can’t argue with steampowered.com’s data, nor everyone else showing the res you SHOULD be running here. You can confirm via techreport, hardocp, tomshardware, etc I gave plenty of links and quotes for people to analyze.

    "We might all hate this guy (for good reason) but the words he writes regarding CPU performance in this article have a lot of truth."

    WOW...But at least he saw the truth, and his name is hilarious to me :) Did I attack back? NOPE. Even when he seriously crossed a line IMHO I did nothing but a polite rebuttal with some questions – still waiting for why he thinks it’s ok to HATE people for simple comments, but I don’t mind either way, even he got the message. Worse you agreed with the hate...LOL

    Here’s you:
    "Agreed. What he wrote is offending, emotional and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario."

    Comic, I said nothing bad about people, just their data. But to you, it's OK to hate me for it and then toss comments about my character...This goes back to the double standard I mentioned in my previous posts.

    There is nothing wrong with a vigorous debate of the facts in any case and I was CIVIL though critical. This was an article about the proper choice of a GAMER cpu. As presented the data is lies as they presented a situation that doesn’t exist (as even you pointed out in your scenario basically). It would be just "incorrect" if they didn't know what they were doing. But they DO know. They know they’re hiding FCAT data as I pointed out. AMD only talks to them as Guru3d recently pointed out (hilbert did). Odd, yes?

    I find it funny I already answered your questions before with comments like this (but why not do another 1600 word essay for you) :) :
    “People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon.”

    This doesn’t tell you why I’m doing it? I claim OTHER websites I pointed to are OBJECTIVE and VALID. I piled on with my own observations, but I was merely quoting others who all disagree with this site. That’s not subjective that’s FACT. It’s not my point of view; it is the same one as EVERY other site reporting this type of data. Hardocp, Techreport, PCper, Tomshardware. How many do I need before you call me objective? I can give more sites and another 1000 words of quotes…LOL. I can scientifically claim the resolution they chose here to make all cpu’s show the same perf because the gpu is bottlenecking everything, represents less than 1% of the population and I will be RIGHT. Introducing a variable that totally invalidates the entire premise of the experiment is not subjective, it’s misleading at best and easily proved wrong as I have done. My message travelled far enough as nobody missed it as far as I can tell. Mission accomplished, gentle or NOT ;)

    If you don’t like my posts, To quote you:
    “why can't you just look past them? What is your problem?”
    “why don't you just... leave?”
    :) Gee, it seems I've upset you ;)

    "What are you doing here - are you some sort of freedom fighter for objective data on the internet?"

    Already answered and YES, why not :) What are you doing here? Are you some kind of smart alec that objects to people voicing their RELEVANT opinions in a "comment" section? Silly me, I thought that's what this section is for. Can we get back to discussing the data now? You've distracted us all from the topic at hand long enough and it isn't changing the data one bit.
  • OwnedKThxBye - Thursday, May 16, 2013 - link

    Sorry for seriously crossing the line, good sir, but I still reserve the right to hate you if I choose. A wise man once wrote “We are FAR to sensitive today. It's like nobody can take a criticism these days and the person who gives it is evil...LOL.” <--- this is you =). Keep in mind I was also the first one to agree with you… What you write never fails to bring a smile to my face, TheJian, and I hope you don’t stop pointing out the truth any time soon. Just try to keep the next comment shorter so we can read it without so much scrolling..... we don't all own LCDs with 1440+ vertical pixels like we are told to. In the end all we can pray for is a few less gamers to run out and buy an A8-5600K for their HD7970 and for a few of your points to be taken into consideration next time round.
  • yhselp - Sunday, May 26, 2013 - link

    First of all, I’d like to apologize for this long-delayed response – I simply didn’t have the time.

    Truly epic. To start off, you haven't upset me, really; not before and not now - I was genuinely curious as to what it is that you think you're accomplishing by all this (not just this article, others as well). Thus, I set forth to playfully provoke you into responding. Success. Now that you’ve answered, and to be fair – more clearly than expected, I have a better understanding of what urges you to do what you do. Such a peculiar case you are, I am fascinated – are you a troll or aren’t you? Somewhere in between I guess. The arguments you provide are sound, although I still think they’re a bit… let’s not use a word as I’m sure you will twist it into a meaning of your choosing (not originally intended); and most of what you say is, well, adequate – all that makes you not-troll after all. Despite the fact that you would’ve probably responded to anything anyway, I still feel that a ‘thank you’ on my side is necessary for your taking the time to respond; and I’m not being ironic here.

    Now, let’s get a few things out of the way. Note that I’m neither defending nor criticizing AnandTech, I’m simply voicing an opinion just the way you are. Very important – I never said it was okay to hate you or anybody for that matter, you deduced that yourself. I simply agreed with the gist of what OwnedKThxBye said. You cannot cling to every word you read online; I don’t think anybody here truly feels hate, certainly not me. People just throw words around in the heat of the moment just the way you debate vigorously, I’m sure you understand that. The semantic field of the word ‘hate’ in 21st century contemporary English is huge, especially when used in this type of discourse.

    Why would you blame me for distracting “us all” from the topic at hand when you are the King of Sidetracking? Gotta love your insights on US politics – it’s like watching one of those documentaries on History and the like. My favorite part is about “gun rights” – nice, so eloquently put. The only reason we still have the Second Amendment is because the US cannot just change the Bill of Rights which is part of the oldest acting constitution in the world – it’s a matter of national pride. The reason it was written is a historical occurrence no longer valid. During Colonial times the settlers had to harbor British soldiers who often mistreated them, and so the settlers needed a means of protection. That is how the Second Amendment came to be. Obviously, this is no longer the case. You could argue the right to bear arms is part of Americanness, but this doesn’t change the fact that the original, intended reason for the Second Amendment is a thing of the past.

    Checks and balances for the consumer computer industry – so amusing. Manufacturers, Reviewers and Consumers each checking on the others; that is such a utopian concept. You say it doesn’t work for a country’s government, so how do you expect it to work for an industry where money is king? There would always be hidden agendas; you can’t stop that.

    I believe I’ve discovered a new form of entertainment, and that is reading Jian’s comments. You, sir, are crazy. I don’t mean this as an insult. Keep on fighting the good fight, I can’t wait to read more of your comments; and, please, never stop sidetracking and using internet abbreviations such as LOL.
  • azdood - Wednesday, May 15, 2013 - link

    Hi Ian, have you ever considered testing the time between turns in Civ 5? The CPU makes a HUGE difference, especially as you get deep into a game.
  • tential - Thursday, May 16, 2013 - link

    This is partially aimed at that Jian guy and partially at everyone. I understand the desire for high-end GPU reviews, but using your OWN earlier posts, you stated that the majority of people game at 1080p. If that's the case, what's the point of pushing for a 7990, Titan, FCAT review when quite frankly NO ONE HAS THOSE CARDS, according to your own data and posts from the previous page.

    To me it seems like you're just trolling. However, because you brought up the point of affordability, I think that's where the majority of reviews should target. YES I want to see how the 7970 and the GTX 680 perform, yes I want to see the next gen too, but I really don't think we should waste so much time on multi-GPU setups that under 1% of the gaming community has.

    How about more reviews on upgrade paths, price to performance, and how to get the most performance at a reasonable price point? That's what I care to see. Any review in which the hardware being tested exceeds 2k (I mean additional hardware) is, to me, just boring, because at the end of the day I'm not buying two Titans, or two 7990s, or even three 7970s.

    This is of course my PERSONAL opinion, but considering data backs it up, I'd like to see some more reviews cater to the average (when I say average I mean average in terms of the gamer who reads reviews and makes educated price to performance ratio choices) gamer.

    This review kind of tries to do that, but in all reality we aren't gaming at 1440p, so more reviews on how to get the best performance at 1080p for a good price, while leaving us a decent upgrade path, would be nice.
  • FriedZombie - Friday, May 17, 2013 - link

    Could you possibly go to some slightly older processors and GPUs? In particular, the i7-990X would be a great start, and the lower and upper end of AMD's 6000 series would be nice too (it seems a LOT of people upgraded from the 5000 series to the 7000 series this year). A benchmark of The Witcher 2 would be nice as well, as max settings with Ubersampling turned on are extremely taxing on both CPU and GPU because of how inefficient CD Projekt's RED engine is.
  • ol1bit - Friday, May 17, 2013 - link

    All I can say is WOW!

    Nice work!
  • qulckgun - Sunday, May 19, 2013 - link

    62 years old, play ~150 hrs a month. Ready to build a new PC. Know next to nothing about building a new PC. I read various forums and articles and find the comment sections are great at clearing up some of what I didn't understand in the main article. That being said, this is one of the most entertaining comment sections I've read in a while, and it was pretty informative. It's helped me put my hardware choices into perspective. Please let's agree to disagree, but in a respectable manner. Thank you all for your comments and responses; it's an education.
  • Rob94hawk - Sunday, May 19, 2013 - link

    This was a great article! I'm surprised you didn't use a QX9770 for socket 775. Any reason for that?
  • bds71 - Wednesday, May 22, 2013 - link

    Ian - since the new 4K TVs are out, I think these types of reviews are very indicative of what we can expect once we are able to hook a PC up (using multiple outputs, such as Eyefinity or NVIDIA Surround) to a single-input 4K TV. For those who don't know, the new 4K standard (3840x2160) is equivalent to Eyefinity or NVIDIA Surround at 1080p, but with 4 monitors instead of 3, and in a normal 16:9 format rather than the super-wide 3-screen setups. ie: --|--|-- vs ==|== note: equivalent resolution, but not actually 4 monitors :)

    Can't wait for THAT testing to begin. Assuming an owner can turn off overscan (so you can see the taskbar at the bottom), I indeed intend to purchase one (likely soon) and would definitely want to hook my PC to it. My GTX 690 would likely be able to do OK at such a resolution, but I would eventually want to get another 690 - as soon as I could figure out how to utilize the second card with only a single HDMI input on the TV.

    As far as Blu-ray content goes - if you wait... it will come :)
