Conclusions

After testing for this review, one thing is clear in my mind: the performance gains from a faster CPU are hitting a limit when paired with a single GPU. As games become more complex, the people designing the graphics and physics engines know that shifting calculations onto the GPU gives a greater boost in performance. If an engine is written to take advantage of the GPU, then for the most part the CPU does not really matter. If the textures can be transferred to the GPU once and kept in its memory, the CPU's per-frame work is essentially done, apart from light maintenance and interfacing with the network.
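
To illustrate the division of labour described above, here is a minimal sketch in C++ (not taken from any engine we tested) using GLFW and legacy OpenGL: the CPU uploads a texture to GPU memory once up front, and the per-frame work left on the CPU amounts to little more than binding resident resources, issuing draw calls and pumping events. The window title, texture size and pixel data are placeholders.

    // One-time setup: the CPU builds and uploads a texture to GPU memory.
    // Per-frame, the CPU only binds the resident texture and issues draw calls.
    // Build with: g++ sketch.cpp -lglfw -lGL
    #include <GLFW/glfw3.h>
    #include <cstdint>
    #include <vector>

    int main() {
        if (!glfwInit()) return 1;
        GLFWwindow* window = glfwCreateWindow(640, 480, "cpu-gpu sketch", nullptr, nullptr);
        if (!window) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(window);

        // One-time CPU work: a dummy 256x256 RGBA texture handed over to the GPU.
        std::vector<uint8_t> pixels(256 * 256 * 4, 128);
        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
        glEnable(GL_TEXTURE_2D);

        // Per-frame CPU work is now light: the texture stays resident in GPU memory.
        while (!glfwWindowShouldClose(window)) {
            glClear(GL_COLOR_BUFFER_BIT);
            glBindTexture(GL_TEXTURE_2D, tex);
            glBegin(GL_QUADS); // legacy immediate mode, purely for brevity
            glTexCoord2f(0, 0); glVertex2f(-0.5f, -0.5f);
            glTexCoord2f(1, 0); glVertex2f( 0.5f, -0.5f);
            glTexCoord2f(1, 1); glVertex2f( 0.5f,  0.5f);
            glTexCoord2f(0, 1); glVertex2f(-0.5f,  0.5f);
            glEnd();
            glfwSwapBuffers(window);
            glfwPollEvents(); // the "light maintenance": input and window events
        }

        glDeleteTextures(1, &tex);
        glfwDestroyWindow(window);
        glfwTerminate();
        return 0;
    }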

Perhaps a better test would have been with more mid-range GPUs, such as 660 Tis or 7790s; with limited memory on the GPU itself, having a faster CPU and faster DDR3 might make a bigger difference. As it stands, though, the ecosystem may be one where a gamer can buy a good GPU and not have to worry that the CPU is a little underpowered. Unless you need the performance of a big CPU elsewhere, the big GPU should be the main priority, since at higher GPU counts and resolutions the CPU is less of a concern.

There is also scope for those using less powerful GPUs, where the CPU could matter a lot more. With limited video memory, the CPU has to organize more texture copies between main memory and the GPU, and other aspects of the system can become the limiting factor. This is very important when interpreting our results. Even so, our testing scenarios show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith. Performing a comparative test while misleading the audience, or without understanding what is going on underneath, is a poor game to play. Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs the best. Civilization V was the sole exception; it also has issues scaling when you add more GPUs unless you have one of the most expensive CPUs on the market. For Civilization V, I would suggest sticking with a single GPU and getting the best out of it.

In DiRT 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup. Moving up the GPU count, DiRT 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to favor Intel CPUs, and Sleeping Dogs needed more CPU power when scaling up.

Above three GPUs, the extra horsepower from the single-threaded performance of an Intel CPU starts to pay off, with as much as a 70 FPS difference in DiRT 3. Sleeping Dogs also starts to become sensitive to CPU choice.

We Know What Is Missing

On my list of future updates to this article are an i5-3570K, dual- and tri-module Piledriver parts, and an i7-920 to round things out. I will have a short window soon to rummage through a large storeroom of processors, which will be a prime opportunity to pick up some of the harder-to-acquire CPUs. Haswell is just around the corner and should provide an interesting update to data points across the spectrum, in most of its desktop forms. From now on I will also aim to cover all the different PCIe lane allocations a chipset offers, as well as some of the odd ones made possible by PLX chips.

If you have a specific processor you would like me to test for a future article, please leave a note below in the comments, and we will try to cover it. :) Top of that list is the i5-3570K, followed by Haswell, then some more AMD cores. I have 29 more processors on my 'ideal' list (if I can get them), but if anyone has any suggestions that I may not have thought of, please let me know. If I am able to get hold of some Titans, I may be in a position to retest across the board for NVIDIA results, adding another benchmark or two as well (BioShock Infinite, perhaps).

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or its non-K equivalent) would strike me as a price-competitive choice for frame rates, as long as you are not a big Civilization V player and can live with its single-threaded performance. The A8-5600K scores within a percentage point or two of the field in single GPU frame rates with both a HD 7970 and a GTX 580, and feels the same in the OS as an equivalent Intel CPU. The A8-5600K will also overclock a little for an extra boost, and comes in at a modest $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD. The only downside is if you are planning some heavy work outside of gaming: if the software is Piledriver-aware all might be well, although most software is not, and an i3-3225 or FX-8350 might be worth a look instead.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously brings some issues. Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel with AMD GPUs, you cannot recommend them for Team Green. There is also the dilemma that while certain games remain fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming lands at the feet of the i5-2500K. This may seem an odd choice, as these chips are not the latest from Intel, but chances are that pre-owned they will hit a nice price point, especially if/when people move over to Haswell. If you were buying new, the obvious answer would be the i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V. If that is not the game you are aiming for and you want to invest in AMD, the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-3770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p. It might be a sad thing to hear, but the only CPU in our testing that provides the top frame rates at this level is the top-line Ivy Bridge model. For comparison, the six-core Sandy Bridge-E results were often very similar, but the price jump to such a setup is prohibitive to all but the sturdiest of wallets.

As noted in the introduction, using 3-way SLI on NVIDIA with Ivy Bridge will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card. This also raises the bar in terms of price, as PLX motherboards start around the $280 mark. For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations. However, investing in a PLX board would help if moving to a 4-way setup is your eventual goal. In either scenario, at stock clocks, the i7-3770K is the processor of choice from our testing suite.

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

A four-way GPU configuration is for those insane few users that have both the money and a genuine need for pixel power. We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit at most resolutions. Despite this, even at 1440p we see awesome scaling in games like Sleeping Dogs (moving from three to four cards adds another 73% of a single card's performance), and more recently I have seen four-way GTX 680s give BF3 at Ultra settings a healthy 35 FPS minimum on a 4K monitor. So while four-way setups are insane, there is clearly a usage scenario where card number four matters.

Our testing was pretty clear as to which CPUs are needed at 1440p with fairly powerful GPUs. While the i7-2600K was nearly there in all our benchmarks, only two sets of CPUs consistently delivered the highest frame rates: the i7-3770K and any six-core Sandy Bridge-E. As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration. Of course, that i7-3770K will have to be paired with a PLX enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I don't doubt that is the case. People building three and four-way GPU monsters are more than likely to run extra cooling and overclock. Unfortunately that adds plenty of variables and extra testing, which will have to wait for a later date. For now our recommendation at stock, for 4-way at 1440p, is the i7-3770K.

What to Take Away From Our Testing

Ultimately the spectrum for testing this sort of thing is huge. The minute you deal with multiple GPUs in a system, testing different GPUs, different resolutions and different quality settings, and then extrapolating those across the normal array of benchmarks we apply to a GPU review, we might as well spend a month just looking at a single CPU platform!

We know the testing done here today looks at a niche scenario: 1440p at maximum settings using very powerful GPUs. The trend in gaming, as I see it, is towards higher resolution panels, and with Korean 27" monitors coming onto the market, if you are OK with that sort of monitor it is one way to improve your gaming experience. 4K is on the horizon, which means either more pixel-pushing power or lower resolutions/settings if you want the quality. I like testing at 1440p/max settings because it pushes the GPU and, hopefully, the rest of the system. If you are a gamer, you want the best experience, and finding the hardware that delivers it is one of the most important parts of that process (after getting good at the game you want to play).

So these results are offered to aid a purchasing decision, based on our small sample size. No sample size is ever going to be big enough (unless you are able to test in Narnia), but we hope to expand on this in the future. Consider the data and read our conclusions; you may come to a different interpretation. Let us know what you think!

Comments

  • TheJian - Wednesday, May 15, 2013

    So you're agreeing with a guy that says it's OK to HATE someone but I'm the evil person for pointing out data that is incorrect? HATE? That's not a bit strong? "We might all hate this guy (for good reason)". And people are calling ME offensive? WOW. This reminds me of the gay people who claim to be tolerant, but god forbid any person says something against them (chic-fil-a comes to mind). They want that person tarred and feathered, smear them in the media and never work again, put them out of business, call them names, cheer people who commit violence against them etc...Nice...No double standards there. Another example, Stacy Dash voting for Romney. They called a BLACK woman who spoke her mind a RACIST...ROFL. What? They tore that chick apart merely for having a very well spoken (IMHO) opinion and pretty good reasons for saying them. She didn't sound stupid (despite what anyone thinks about her opinion), but they tarred and feathered her for saying something anti-obama... :( She's a very classy chick if you ask me and they still pick on her (saw some ripping on her on roku last week - msnbc or something).

    Not sure what his reason is anyway. Did I attack you guys personally? I even let Ian himself off the hook and left the problem at the doorstep of whoever is directing these articles to be written this way. What bothers me most is all the "great article" "nice job" comments to an article that is very wrong and advocates buying a very low end AMD cpu vs. Intel and says it's going to be ok. IT WON'T and in FAR more than just CIV 5 as I showed via other hardware sites.

    What part wasn't objective? My data? The other websites giving the opposite of this site? I can't change their data, and there is nothing objective to discuss when the data is just patently wrong as proved.

    People can argue I'm not objective on my console beliefs (though backed by sales data, and I freely admit I hate them...LOL but I own an xbox360/2 ps2's - go figure - I don't want another holding my games at 1080p for 8yrs) and the new gen at xmas may sell very well (we'll know in 9-10 months if they can sell past the xmas pop), but the PC comments and data I provided are facts based on data from steampowered's survey, hardocp, toms, and techreport. I could have gone with another group also with pcper, guru3d etc...but too many links and this site says your post is spam.

    If it was offensive they need thicker skin or stop writing stuff that other sites totally refute. These guys KNOW that when you drop it down to 1080p the cpu is going to SHOW rather than the gpu's shown here (which aren't as taxed at 1080p) showing any cpu can get the job done. Well yeah, any cpu but only when you push gpu's so far they beg for mercy. To me saying that stuff in the article is a LIE when they know what happens turning it down. I wouldn't be so harsh if they were just ignorant of the data, but Anandtech is NOT ignorant. They've been benchmarking the heck out of this crap for ~15 years (I think he started the year I started my 8yr PC business in '97!). I guess you can't call people out for what they're doing today without being called offensive, emotional (LOL) and not objective. I couldn't have written that post if they would have tested where 98% of us play at 1080p right?

    What are they doing here at anandtech? Why would they do this? They know what steampowered shows, I basically said the same stuff to Ryan in the 660TI article ages ago but with even MORE proof and using his own articles to prove my points. I used HIS benchmarks.

    Ask yourself why we are still waiting for the FCAT articles (now we're up to 2 or more...part 2 of the first, and 7990 data etc)? Ryan said we'd see them in a week. We are into months now.
    http://www.anandtech.com/show/6862/fcat-the-evolut...
    Where's part2? He still hasn't given us ONE ounce of data using it.

    "In part one of our series on FCAT, today we will be taking a high-level overview of FCAT. How it works, why it’s different from FRAPS, and why we are so excited about this tool. Meanwhile next week will see the release of part two of our series, in which we’ll dive into our FCAT results, utilizing FCAT to its full extent to look at where FCAT sees stuttering and under what conditions."

    That's from his Part1 linked above. How long do we wait?

    Just for kicks:
    http://www.anandtech.com/show/6910/origin-genesis-...
    "Overall anything short of 5760 with 4x MSAA fails to make a 3rd Titan worthwhile. On the other hand, you do need at least 2 Titans to handle MSAA even at 2560"

    Ok, so I need to spend $2000 on two titans to handle some MSAA at 2560 OVERALL in the tested games (heck one hits under 30fps in a game he tested at 1080p in that review). Raise your hand if you think IAN's article is correct...ROFL.

    "In three of our games, having a single GPU make almost no difference to what CPU performs the best. "

    Yeah in a res that according to Ryan's article taps out two $1000 titans...Then you're right. All cpu's are the same because the Titans are crying for some relief :)
    Their recommendation here:
    "A CPU for Single GPU Gaming: A8-5600K + Core Parking updates"

    "The A8-5600K will also overclock a little, giving a boost, and comes in at a stout $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD."

    No way...So you'll buy $110 cpu and according to Ryan's article on the titan box, buy $2000 worth of titans to go with it to run at the resolution Anandtech thinks is important (2560x1440).

    How do I respond to that without being offensive? You should hear what I'm saying in my brain right now...ROFL. The sad part is people are reading reviews like this and thinking it's correct. Look at the first comments on this article "nice work" etc...Really? I don't see a bunch of HATERS on my comments anyway. Just a few who at the least 1/2 agree with what I said ;) Yourself included. Your example proves to some degree, I didn't waste my time.

    Sorry if you think my "truth" was hidden. I was attempting to make it more "in your face" for simplicity sake. Maybe I failed a bit...LOL. Can't please everyone I guess.
  • TheJian - Tuesday, May 14, 2013

    Nice...What reason? I defamed a hero of yours? Are they doing you any favors by hiding reality? Can you say after reading the links the other sites are wrong? The point of the links showing the exact opposite of this site is so you JUDGE Anandtech yourselves. I really don't want one of my favorite sites to go away. I just want them to start reporting FACTS as they are without the snow.

    I don't feel I have to be politically correct all day for that. People need to get over that PC garbage and get thicker skins. We are FAR too sensitive today. It's like nobody can take a criticism these days and the person who gives it is evil...LOL.

    For the sake of your PC purchase, if you intend on buying on their advice, read the links I gave guys. I'm trying to save people from getting burned! Like me or hate me, the data does NOT lie. You just have to look at it and judge for yourself. When one cpu scores 58 vs. another at 108, there is a SERIOUS reason to pick the proper cpu (just one example from above). If you're seriously broke, I'm all for AMD at that point (great integrated with richland probably making a pretty decent experience), but if not...INTEL. But in either case I wouldn't buy EITHER now. Wait for haswell (broadwell goes in it later...important maybe) or Richland which really makes low end gaming possibly pretty fun I think (at least you can play that is). In laptops maybe Haswell with GT3e makes sense as it should get near AMD or blow by them with 128mb in there. But that's not going to desktops. Integrated on desktops from Intel is still useless IMHO and won't affect Discrete sales one bit from AMD or NV.
  • tential - Tuesday, May 14, 2013

    I don't agree with your analysis on consoles but everything else sure. Gaming for 98% of people is 1080p. That's why I laugh when people quote Titan on ANYTHING (which happens surprisingly a lot here). No one has a Titan so why even talk about such a card saying "AMD has no answer for it". Well no one even has the card anyway except for a couple of people. I agree also with the resolution thing. It makes no sense that so many reviews are catered to high resolution and multi monitor setups.

    People have been wondering why NV and AMD have increased top of the line GFX cards and it's because quite simply, few people have everything needed to exploit such cards. I'd get a 7970, but I don't have a multimonitor setup or a high resolution monitor so what's the point?

    Console wise I think the WiiU was a bad choice for any comparison. It was an upgrade that really brought nothing extra. People who have a Wii don't care about graphics so most of the upgrades of the WiiU are meaningless to Wii owners. The new Xbox and PS4 will be much better in terms of sales. Those console gamers have been dying for a graphics boost.

    In the end though your response explains to me GPU pricing today and why top of the line GPUs are costing more and more. A smaller percentage of people are buying them, because GPUs that are lower end, or GPUs that are older, are perfectly capable of doing the tasks needed by gamers today. Maybe that will change when monitors drop in price and more people game at higher resolutions, but for now most people do 1080p, and that's the sweet spot for most people. I know that's the ONLY resolution I ever look at and care about.
  • TheJian - Wednesday, May 15, 2013

    Thanks...Console arguments are like ford vs. chevy right? How many people won that argument back in the day? :)

    If console sales continue for 6 months after the xmas pop (unlike the wiiu etc that died, as Kotick etc point to; wiiu off 50% says something, Vita, 3DS etc down from last revs too, software off also for all), I'll come back and say YOU sir were right :) You have my word. Of course it goes without saying, I'll be saying the exact opposite if it doesn't happen.

    Regarding why we need more power...I can show situations where 1080p brought the top end cards to unplayable frame rates. Hardocp just did this.
    http://hardocp.com/article/2013/03/12/crysis_3_vid...
    They had to turn some settings down even on 680 and 7970ghz and cards below this really turned stuff off (670 etc). People can say, well this or that doesn't make much difference visually, but the point is you can't have everything on without more power (maxwell/volcano should finally make everything on 1080p playable with ALL details on, no sacrifice at all in anything I'd hope).
    "Crysis 3 plays a lot better at 1080p resolution, 1920x1080. At 1080p the GeForce GTX 680 and Radeon HD 7970 GHz Edition are able to push the graphics to very high and play with the best experience possible in the game. Granted, we have to use SMAA Medium in order to achieve this. It will most likely take next generation single-GPU video cards to allow us to play at SMAA High 4X at very high at 1080p."

    Tombraider has the same issues only worse I guess. :
    http://hardocp.com/article/2013/03/20/tomb_raider_...
    "If you are interested in playing Tomb Raider the NVIDIA GeForce GTX 680 provided the fastest performance at 1080p, and was the only single GPU video card capable of playing with 2X SSAA at this resolution. At 2560x1600 the AMD Radeon HD 7970 GHz Edition CrossFire setup will provide more performance. For gaming on a budget, or at resolutions lower than 1080p, the GeForce GTX 660 Ti is an excellent option."

    So the 660TI I almost bought is for LOWER than 1080p?...ROFL. OUCH. As they point out, two cards for above 1080p, and only the 680 survived 1080p itself, and only at 2xSSAA. I can cite more examples also, but this makes the point. Even 1080p is tough for top end cards if gaming as the devs intended with all candy on is attempted. We need more power, and 20nm should give this from either company I hope. I hope I'll have enough of a reason to buy 1440p for a few games, then flop it over to my dell 1920x1200 when the new cards can't hack my 27in I plan to buy (if I do, might stick with 27in at 1080p, but I like having 2 resolutions native on the desk to play whichever my card can handle). It's comical ryan was pushing 1440p for the 660TI article, but hardocp says that card is for BELOW 1080p...LOL.
  • ShieTar - Monday, May 13, 2013

    Well, if even older dual core CPUs and the weaker AMD parts don't scale at all with a single GPU, it would seem to me like a 60$ Pentium or even a 40$ Celeron at a bit below 3GHz might make a great companion for a typical 200$ GPU for a Full-HD gamer. Would be interesting to add any one of those low-cost Ivy Bridge parts to the comparison to see how they keep up with their core ix counterparts.
  • trajan2448 - Tuesday, May 14, 2013

    As soon as I saw Crossfire I stopped reading.
  • TheJian - Wednesday, May 15, 2013

    One more comment on FCAT missing - From the 7990 review:
    " The end result is that we’re not going to have FCAT data for today’s launch, as there simply hasn’t been enough time to put it together. FCAT was specifically designed for multi-GPU testing so this is an ideal use case for it and we’d otherwise like to have it, but without complete results it’s not very useful. Sorry guys.

    The good news is that this means we have (and will be compiling) FCAT results for our cards based on the very latest drivers. So we’ll get to the bottom of frame pacing on the 7990, GTX 690, and more with an FCAT article later this week or early next week. So please stay tuned for that."

    So we're 3 weeks later and no review for this data STILL. Again, people realize the delay tactics here. In another week it will be a MONTH. This is on top of already waiting for FCAT part2 article I mentioned already.

    "Our goal with FCAT was to run an in-depth article about it shortly before the launch of the 7990 as a preparatory article for today’s launch. However like most ambitious goals, that hasn’t panned out."

    It's not really ambitious when EVERYBODY else is already presenting data article after article. Just keep making excuses. Take a good look at the credibility of this site here people and judge these guys yourselves. Ryan Shrout seems to be able to pump out article after article on FCAT, including his review for the 7990, Titan etc...Every article discusses it at this point. Is Ryan Shrout at PCper.com so much more effective than this huge website? Ryan's asking for donations to upgrade his camera equipment for recording podcast type stuff etc. How many people do you have working here compared to his little site? Which I love BTW. Great site, and he has nearly doubled the funding he asked for :)

    At some point I hope people start asking you guys more questions after looking at my posts pointing out stuff most just seem to miss. People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon. You can start with an FCAT article that makes other sites like PCper seem as small as they are.

    Are you still trying to figure out how to use it or something? Call Ryan Shrout :)
  • bds71 - Wednesday, May 15, 2013

    Ian: i noticed you were GPU bound a lot. doesn't this sort of defeat the test? (i think you were GPU bound more than 50% of the time). i'm curious why you didn't use Eyefinity or nVidia Surround to test the quad graphics setup? with that much power under the hood it's almost a necessity. anyway, i don't mean to criticize the review, i think it still had some very useful information. i just think that the conclusion wasn't complete if you're GPU bound. note: and decreasing the graphics so that you're CPU bound is unrealistic - nobody with quad graphics is going to reduce the graphics capability so they're CPU bound.
  • bds71 - Wednesday, May 15, 2013

    edit: i just read through (most) of the comments above. and, while 98% (doubtful, but OK) may play on a single 1080p screen, the fact is that high end graphics are a waste of money for a single 1080p monitor. and, while some games (like Skyrim and Civ V) use a lot of processor, that type of scenario is not indicative of most games. note: also, most of those 98% single screen 1080p users also probably DON'T have a top-of-the-line (ie: 980 or 7970, much less titan, 690 or 7990) graphics card. they probably have a 200-$300 graphics card and a 100-$250 CPU (ie: mainstream). nor do any of those less than top-of-the-line *need* anything more than a single 1080p monitor and a mid-range CPU (of which the AMD or Intel variety will do just fine for 98% of those 98% with a single monitor). from my point of view this article set out to find out how much the CPU is used in gaming. does it make sense then to put a limit on the graphics capabilities? of course not. so you go with the high end (top-of-the-line) graphics solution. but in the end, the graphics capabilities were still limited by the screen resolution - you couldn't really see what the GPU's were/are capable of because they couldn't really stretch their legs (and, in turn, the CPUs never stretched to their limits to feed such a request).

    i participate in F@H. as such i also use my GPU's. i've noticed that (depending on the work unit) the GPU's can take as much as 20% of the CPU to keep them fed. is gaming really that much different? the CPU is needed to feed the GPU, and to do those functions that cannot be done on the GPU. for folding, it doesn't matter how fast something gets done - so a faster CPU isn't imperative. but, for gaming, the speed of the CPU and its ability to keep relevant data going to the GPU does matter. when the CPU can't keep up with the GPU you get slow minimum frame rates and a general "slow" feeling from the game. so, yes, i agree minimum frame rates are important when determining what CPU to use when feeding a high end graphics solution (more so when using more than a single GPU solution). but you still have to let the GPU's stretch their legs to see how much of the CPU is being used - and that will determine if a CPU is good or not (min frame rates and CPU usage with high end graphics at appropriate resolutions)
  • yhselp - Wednesday, May 15, 2013

    Wow, the sheer amount of 'content' that Jian guy is producing is amazing. You could probably publish a few books worth of comments by now. Is it really necessary to hit everybody up with a 1000-word reply?

    What I don't get is why you actually do this. You don't agree with what's been tested and how the data has been interpreted; okay, that is your right. And, yes, some of the conclusions drawn might be controversial; but what's your problem? Why don't you just voice your opinion once and leave it be? What are you doing here - are you some sort of freedom fighter for objective data on the internet?

    You complain about how AnandTech are doing it wrong and claim that your own observations are objective and valid. From your point of view they might be, but what you are forgetting is that testing hardware is so vast a field, with so many variables that it's impossible to scientifically claim that ANY conclusion is objective, since the very essence of what you're dealing with precludes that. Everything (in hardware testing) is subjective - live with that truth.

    It's not about having "tough skin", but having manners and being civilized. You can't expect people to listen to you and take you seriously if you're being rude even if your arguments are valid. Try a more gentle approach - I guarantee your message, whatever it might be, will travel further.

    Remember, this is not an article about choosing a CPU for 1080p gaming; also, it's not complete. The article provides information for people to interpret in their own way. Yes, it draws conclusions at the end that I too think are best left unsaid; but why can't you just look past them? What is your problem? What are you trying to change here? If you don't like AnandTech so much, why don't you just... leave?
