Conclusions

After testing for this review, one thing is clear in my mind – the performance of CPUs paired with a single GPU is hitting a limit. As games get more complex, those designing the graphics and physics engines know that shifting calculations onto the GPU gives a greater boost in performance. If an engine is written to take advantage of the GPU, then the CPU does not really matter for the most part. If you can transfer textures over to the GPU and keep them in memory, the work of the CPU is essentially done apart from light maintenance or interfacing with the network.
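To make the GPU-bound argument concrete, here is a toy frame-time model (illustrative numbers only, not measurements from this review): when CPU and GPU work overlap each frame, the frame rate is set by whichever takes longer per frame, so a slower CPU simply vanishes from the result as long as the GPU is the long pole.

```python
# Toy model of a frame: CPU and GPU work overlap, so the frame rate
# is set by whichever takes longer. All numbers are illustrative,
# not measurements from this review.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# At max settings the GPU takes ~25 ms per frame; a fast CPU needs
# 4 ms of game logic per frame, a slower one 8 ms. Both land on the
# same frame rate, because the GPU is the bottleneck either way.
fast_cpu = fps(cpu_ms=4.0, gpu_ms=25.0)    # 40.0 FPS
slow_cpu = fps(cpu_ms=8.0, gpu_ms=25.0)    # 40.0 FPS

# Only when the CPU cost exceeds the GPU cost does CPU choice show up.
cpu_bound = fps(cpu_ms=30.0, gpu_ms=25.0)  # ~33.3 FPS
```

This is of course a simplification (it ignores frame-time variance and driver overhead), but it captures why a cheaper CPU can match an expensive one in a GPU-limited game.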

Perhaps a better test would have been with more mid-range GPUs, such as 660 Tis or 7790s; with limited memory on the GPU itself, having a faster CPU and faster DDR3 memory might make a bigger difference. However, the takeaway may be that a gamer can buy a good GPU and not have to worry that the CPU is a little underpowered. Unless you need big CPU performance for other workloads, the big GPU should be the main priority, since at the higher GPU counts and resolutions the CPU becomes less of a concern.

There is also the scenario of less powerful GPUs, where the CPU could matter a lot more. With limited video memory, the CPU has to organize more texture copies between main memory and the GPU, and other aspects of the system can become the limiting factor. This is very important when interpreting our results. With that caveat, our testing scenarios show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith. Choosing to perform a comparative test while misleading the audience, without understanding how the hardware works underneath, is a poor game to play. Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs best. Civilization V was the sole exception, and it also has issues scaling when you add more GPUs if you do not have the most expensive CPUs on the market. For Civilization V, I would suggest running a single GPU and trying to get the best out of it.

In DiRT 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup. Moving up in GPU count, DiRT 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to lean towards Intel CPUs, and Sleeping Dogs needed more CPU power when scaling up.

Above three GPUs, the extra horsepower from the single-threaded performance of an Intel CPU starts to pay off, with as much as a 70 FPS difference in DiRT 3. Sleeping Dogs also starts to become sensitive to CPU choice.

We Know What Is Missing

On my list of future updates to this article, we need an i5-3570K processor, as well as dual and tri-module Piledriver and an i7-920 for a roundup. I will have a short window soon to rummage in a large storeroom of processors, which will be a prime opportunity for some of the harder to acquire CPUs. Haswell is just around the corner and should provide an interesting update to data points across the spectrum, in most of its desktop forms. From now on I will aim to cover all the different PCIe lane allocations in a chipset, as well as some of those odd ones caused by PLX chips.

If you have a specific processor you would like me to test for a future article, please leave a note below in the comments, and we will try to cover it. :) Top of that list is an i5-3570K, followed by Haswell, then some more AMD cores. I have 29 more processors on my 'ideal' list (if I can get them), but if anyone has any suggestions that I may not have thought of, please let me know. If I am able to get hold of Titans, I may be in a position to retest across the board for NVIDIA results, meaning another benchmark or two as well (Bioshock Infinite perhaps).

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and don’t mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feels the same in the OS as an equivalent Intel CPU. The A8-5600K will also overclock a little, giving a boost, and comes in at a modest $110, meaning that some of that budget can go towards a beefier GPU or an SSD. The only downside is if you are planning some heavy CPU work – if the software is Piledriver-aware, all might be well, although most software is not, and an i3-3225 or FX-8350 might be worth a look.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously has some issues. Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel for AMD, you cannot recommend them for Team Green. There is also the dilemma that while in certain games you can be fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming comes in at the feet of the i5-2500K. This recommendation may seem odd – these chips are not the latest from Intel, but chances are that pre-owned they will be hitting a nice price point, especially if/when people move over to Haswell. If you were buying new, the obvious answer would be looking at an i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V. If this is not the game you are aiming for and you want to invest in AMD, then the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-3770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p. It might be a sad thing to hear, but the only CPU in our testing that provides the top frame rates at this level is the top-line Ivy Bridge model. For a comparison point, the Sandy Bridge-E 6-core results were often very similar, but the price jump to such a setup is prohibitive to all but the most sturdy of wallets.

As noted in the introduction, using 3-way on NVIDIA with Ivy Bridge will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card. This also raises the bar in terms of price, as PLX motherboards start around the $280 mark. For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations. However, investing in a PLX board would help when moving to a 4-way setup, should that be your intended goal. In either scenario, at stock clocks, the i7-3770K is the processor of choice from our testing suite.

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

A four-way GPU configuration is for those insane few users that have both the money and the physical requirement for pixel power. We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit for most resolutions. Despite this, even at 1440p, we see awesome scaling in games like Sleeping Dogs (+73% of a single card moving from three to four cards) and more recently I have seen that four-way GTX680s help give BF3 in Ultra settings a healthy 35 FPS minimum on a 4K monitor. So while four-way setups are insane, there is clearly a usage scenario where it matters to have card number four.
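The scaling quoted above is easy to sanity-check with quick arithmetic. In the sketch below, only the "+73% of a single card" figure comes from the review; the absolute frame rates are hypothetical, chosen purely for illustration:

```python
# Hypothetical Sleeping Dogs-style numbers; only the "+73% of a
# single card" figure is from the review, the FPS values are made up.
single = 30.0                           # FPS on one card
three_way = 80.0                        # FPS on three cards
four_way = three_way + 0.73 * single    # fourth card adds 73% of one card

# Express the three-to-four-card gain relative to a single card,
# which is how the review states its scaling figures.
gain_vs_single = (four_way - three_way) / single  # 0.73
```

Measuring the gain against a single card's frame rate (rather than the three-way total) is why a fourth card can still look worthwhile even deep into diminishing returns.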

Our testing was pretty clear as to which CPUs are needed at 1440p with fairly powerful GPUs. While the i7-2600K was nearly there in all our benchmarks, only two CPU options consistently delivered the highest frame rates – the i7-3770K and any six-core Sandy Bridge-E. As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration. Of course, that i7-3770K will have to be paired with a PLX enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I don’t doubt that is the case. People building three and four way GPU monsters are more than likely to run extra cooling and overclock. Unfortunately that adds plenty of variables and extra testing, which will have to wait for a later date. For now our recommendation at stock, for 4-way at 1440p, is the i7-3770K.

What to Take Away From Our Testing

Ultimately the spectrum for testing this sort of thing is huge - the minute you deal with multiple GPUs in a system, testing different GPUs, testing different resolutions, testing different quality settings, and then extrapolating those across the normal array of benchmarks we apply to a GPU test, we might as well spend a month just looking at a single CPU platform!

We know the testing done here today looks at a niche scenario - 1440p at Max Settings using very powerful GPUs. The trend in gaming, as I see it, will be towards the higher resolution panels, and with Korean 27" monitors coming into the market, if you're ok with that sort of monitor it is a direction to take to improve your gaming experience. 4K is on the horizon, which means either more pixel-pushing power or lower resolutions/settings if you want the quality. Testing at 1440p/max settings is something I like to do because it pushes the GPU and, hopefully, the rest of the system - if you're a gamer, you want the best experience, and finding the hardware to deliver it is one of the most important parts of that process (after getting good at the game you want to play).
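For a sense of the pixel-pushing jump behind those resolutions, the raw pixel counts (standard panel resolutions, nothing specific to our test rigs):

```python
# Pixels per frame at common gaming resolutions.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400

# 1440p pushes ~78% more pixels than 1080p; 4K pushes exactly 4x.
ratio_1440p = pixels_1440p / pixels_1080p  # ~1.78
ratio_4k    = pixels_4k / pixels_1080p     # 4.0
```

All else being equal, fill-rate-bound frame rates scale roughly inversely with pixel count, which is why a card that is comfortable at 1080p can fall below playable at 1440p or 4K.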

So these results are offered in order to aid a purchasing decision based on our small sample size. No sample size is ever going to be big enough (unless you are able to test in Narnia), but we hope to expand on this in the future. Consider the data, read our conclusions - you may have a different interpretation of the data. Let us know what you think!


242 Comments


  • CookieKrusher - Saturday, May 11, 2013 - link

    Good to know that my 2500K is still taking care of business. Now if I could just upgrade this GTX 460 this year, I'll be golden. :-)
  • Tchamber - Saturday, May 11, 2013 - link

    I wonder why games are vastly more parallel on the GPU side of things than the CPU side. If a game can utilize 2048 SPs, why doesn't adding 2 or 4 more CPU cores help much?
  • ShieTar - Tuesday, May 14, 2013 - link

    Because all parts of the code which can be run in parallel are already running on the GPU, and the CPU is stuck with the code that needs to be serial.
  • OwnedKThxBye - Sunday, May 12, 2013 - link

    Love this information it was an eye opener. Great Job Ian!

    Choosing a gaming CPU is a question I am asked to answer nearly on a daily basis by clients or friends in my line of work. While your concluding recommendations are spot on given the information you provided, I wouldn't often find myself giving out the same advice. The reason behind this is the future upgrade path of the PC. My apologies if this has already been pointed out in the comments as I haven’t read every one yet.

    Most people seeking a PC upgrade have just started playing a new title and have hit a wall. They are unable to play this new game at the resolution and detail they feel to be the minimum they can put up with. This wall is mostly a CPU or GPU limitation but sometimes it’s both. Of these upgrades the new graphics card is significantly less expensive than a full system upgrade, can be installed easily by most people, and doesn't leave you without a PC or any down time. On the other hand a full system upgrade is expensive, not everyone can put it all together, and often requires an OS reinstallation with data backup.

    Let’s say an average gamer (not necessarily you and me) purchases a nice new gaming rig today for whatever reason. It’s likely that within two years or so they are going to hit a wall again. At this point most people have hit the GPU limitation and are able to upgrade the graphics card and away they go again for another one to two years. After hitting this wall for the second time it’s most likely time for a full system upgrade. This process could be only two years for some of us but for others it’s going to be four to five.

    What I’m trying to point out is that we can recommend a CPU that is the cheapest while still not limiting our current GPU and get the best possible FPS per dollar right now. But if we do this it’s far more likely we are going to run into a CPU bottleneck early in the upgrade path and instead of forking out a few hundred for a new graphics card after a year or two, we might end up having to replace both the CPU and motherboard as well.

    For this reason I could not recommend an AMD A8-5600K or an equivalent Intel CPU to be purchased with a HD7970 or GTX580 unless you plan to never upgrade your graphics card. Spend the extra $100 to $150 on a better CPU and potentially make the PC last another two years. Maybe the inclusion of some popular titles like Battlefield 3 or PlanetSide 2 would have significantly changed your concluding recommendations. The information provided gives us a good indication of where the CPU bottleneck comes into play but I think the upgrade path of the PC along with what games are being played need to be given a lot more weight for an accurate recommendation to be made. Having said that I could be totally wrong and have recommended the wrong gaming builds for years.
  • TheJian - Monday, May 13, 2013 - link

    I can see a lot of work but only for a future that won't exist for a good long while. You tested at a res that is too high and not showing reality today based on this dumb idea that we'll all buy $400+ monitors. This is the same crap Ryan tries to push (see the 660ti comments section, he pushed it then when they were $600 and ebay via your visa to korea...ROFLMAO - read as I destroyed his responses to me, click ALL comments so you can just CTRL-F both of us). So raise your hand if you're planning on buying a $400+ monitor, to go with an expensive $300 card only to end up at say 28fps in a game like sleeping dogs (avg...so game is unplayable as minimums would be oh I don't know 15fps?). I don't see any hands raised. So we'll be lucky if MAXWELL in Q1 (or whatever Volcanic does for AMD Q4 this year) will be playable at 1440p. Translation, we'll all buy 1920x1200 or 1080p for a while to come unless we own more than one card. Raise your hand if you have multi-gpu's. According to steampowered.com hardware survey that number (last I checked) was under 2%. You're wasting your time. Start writing for the 98% instead of the 2%. I just wasted MY time reading this crap.

    REALITY: We are all gaming at 1920x1200 or 1080p (or worse, below this). This should be the focus. This would show LARGE separations in cpus and Intel kicking the crap out of AMD and that you wouldn't want to touch that A8-5600 with a 10ft pole. Why? The 7970 would not be the limiter, or at least not every time like here. What % of the people have 3-4 gpus? Give me a break this is what you see as the future? $1200 in gpus and a $400+ monitor? You're pandering to a crowd that almost doesn't exist at all. For what? To make an AMD cpu seem buy-able?

    The data in this article will be useful in 3yrs+ when you can hit 1440p at above 30fps MINIMUM on most cards. Today however, we needed to see what cpu matters at a resolution that doesn't make a 7970 look like a piece of outdated trash. You're pretty special today if you have 7970 or up in the gpu.

    More AMD CYA if you ask me. Just like we're still waiting months for Ryan to do an FCAT testing article...LOL. We'll be waiting for that until xmas I'd guess unless AMD doesn't get the prototype driver done by then, which means we'll never see FCAT testing here...ROFL.

    Ryan has ignored TWO articles now on fcat. It didn't make the 7990 review, and part2 of fcat article never even came. Just keep delaying, your sites credibility is going down the drain while everyone else tells it like it is. AMD & their drivers currently SUCK (cpu & gpu). Their cpu's suck; hence running at a res that shows all your games can't run without multi-gpu and hit 30fps+ MINIMUM - meaning at this res they ALL require more than one gpu making cpu choice a non issue of course. Their gpu's are great but drivers suck so they give away games by the truckload to try to sell a gpu vs. exceptional NV drivers. Lets face it, the best hardware in the world sucks if drivers can't live up to the hardware. Unfortunately AMD blew all their R&D on consoles that are about to die on the vine, instead of GREAT drivers to go with a GREAT gpu.

    What do you end up with when you spend your wad on consoles instead of drivers? FCAT showing you suck, runts, stutter, enduro that lacks on notebooks (see notebookcheck 7970m article recently, it was even mentioned here oddly...LOL) and CF that is abysmal and at times showing NEGATIVE scaling for more than one gpu vs....again, NV drivers that have none of these issues. Optimus works (hence nv beats this drum repeatedly and justifiably) and so does their SLI. While AMD sucked for a year (see hardocp driver review for AMD & NV recently) NV got to sit on their butts (driver butts) waiting for AMD to finally get done with consoles and make a "Never Settle" driver that actually performed the way the cards should have OUT OF THE BOX! Thank god for never settle drivers in Nov or Nvidia wouldn't have released monthly driver enhancements from Dec-May...ROFL. People would be stuck with the same perf today as out of the box from NV (as hardocp showed they didn't improve 1fps all year until AMD caught them...well duh, no point giving out free perf when blowing your enemy away all year).

    Mark my words...AMD will be writing off R&D for consoles soon. Even activision's Kotick just said last week that consoles (for all the reasons I've said repeatedly here and at tomshardware etc) have a very tough road ahead vs. all the new ones coming out. Sales of Wiiu off 50% after xmas pop. Just one Q after debut nobody cares already! He basically said they'll be watching for the same on the next two (ps4/xbox720). When that happens no games will be made going forward for this crap as we all move to tablet/phone/ cheaper console or PC (for ultimate gaming).

    Video killed the radio star. Cheap android gaming killed the console star....LOL.
    Ouya, Steambox, Shield (pc to tv here!), wikipad, razer edge, gamepop etc...All stuff that will help kill the consoles and stuff they have never faced before. It was always the big 3, but his time big 3 with little 6-10+a billion phones & tablets chasing them and our gaming time...ROFL. The writing has been on the wall for a LONG while. As usual AMD management screws up. Wisely NV passed on a dying market and only spent 10mil on both Shield and Grid respectively...ROFL. Dirk Meyer wouldn't be doing this crap. They were idiots letting him go thinking he didn't get it. He had a mobile strategy, it just wasn't one that made their CORE products suck while creating it. Management has PIPE dreams. Dirk had REALITY dreams.

    http://www.tomsguide.com/us/Next-Generation-Bobby-...
    Kotick saying consoles are dead, well he almost says it...Wait and see is basically the same thing...LOL.

    "If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and don’t mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feels the same in the OS as an equivalent Intel CPU."

    AMD CYA. Total lie. Drop this crap down to 1080p and watch the Intel chips separate the men from the boys and in MORE than just CIV5. ALL games would show separation I'd guess. You must have found this out, which immediately made you up the res huh? AMD threaten the free money or something if you showed them weak or Ryan managed to run FCAT testing?...LOL.

    "We know the testing done here today looks at a niche scenario - 1440p at Max Settings using very powerful GPUs. The trend in gaming, as I see it, will be towards the higher resolution panels, and with Korean 27" monitors coming into the market, if you're ok with that sort of monitor it is a direction to take to improve your gaming experience."

    Seriously? "If you're ok with EBAYing your $400 "KOREAN" monitor this is a great way to improve your gaming at under 30fps minimum in all games...ROFL. Reworded for clarity Ian :)

    NICHE situation is correct in that first sentence...LOL. Again, start paying attention to your audience which is 98% not the NICHE 2% or less. I'm debating buying a 2nd 1920x1200 (already have 2 monitors, one dell 24 and a 22in at 1680x1050) instead of your NICHE just because of what you showed here. 1440p is going to be difficult to run ABOVE 30fps MIN for ages. I'd spend most of my gaming time on the smaller dell 24 at 1920x1200 I think. So I'm debating buying the same thing again in 27in. I want a bigger screen, but not if I can't run 30fps for another 2-3 vid card revs (maxwell rev2?). This is just like I described above with AMD's gpu. Great hardware, but worthless without drivers that work right too. A korean monitor may look great, but what is it worth if you require $600+ in vid cards to have a prayer of 30fps? I'd rather buy a titan, not upgrade the monitor and hit well above 30fps on my dell 24 at 1920x1200 all day! THAT is a great gaming experience I can live with. I can't live with a BEAUTIFUL SLIDE SHOW on a korean monitor off ebay...LOL. I realize you can get a few here in the US now, but you get the point. This is making your niche look like a regular niche is 98%...LOL. Your situation is a NICHE of the NICHE. Check steampowered survey if you don't get what I just said.
    http://store.steampowered.com/hwsurvey/
    Less than 1% run your res tested here. That's niche of a niche right? The entire group of people above 1920x1200 is less than 2% added all up (and this is out of probably over a few hundred MILLION steam users). Just click the monitor res and it will break them out for you. You wrote an article for NOBODY to show AMD doesn't suck vs Intel? Start writing for EVERYBODY (that other 99%) and you'll be making recommendations for INTEL ONLY.

    I'm not saying anything bad against Ian here, clearly he did a lot of work. But whoever is pushing these articles instead of FCAT etc is driving this website into useless land. You guys didn't even mention NV's killer quarter (AGAIN). Profits up 29% over last year, heck everything was up even in a supposedly bad PC time (pc sales off 14%...no affect on Nvidia sales...LOL). They sell cards because their drivers don't suck and a new one comes out for every AAA title either before or on the day the game comes out! That's what AMD should be doing instead of console dev. They gave up the cpu race for consoles too! I'll be glad when this gen (not out yet) of consoles is DEAD. Maybe they will finally stop holding us back on PC's. They stuck us with 720p and dx9 for years, and they're set to stick us at 1080p for another 8yrs. They also allowed NV to not do anything to improve drivers for a year (due to AMD not catching them until Never Settle end of Nov2012). But maybe not this time...LOL. DIE CONSOLES DIE! :)

    Here's what happens when you show 1080p with details down...cpu's part like the red sea:
    http://www.tomshardware.com/reviews/neverwinter-pe...
    Look at that separation!
    "It's a little surprising that the Core i3-3220, FX-4170, and Phenom II X4 960 aren't able to manage a minimum of 30 FPS, though they come close. The dual-core chips are stuck at about 20 FPS, and the FX-8350 does a bit better with a 31 FPS floor that averages closer to 41 FPS. Only Intel's Core i5-3550 demonstrates a significantly better result, and we have to assume that higher-end Core processors are really what it takes to let AMD's single-GPU flagship achieve its best showing."

    Note only two CPU's managed above 30fps minimum! I guess you need a good cpu for more than just CIV 5 huh? You should have run at this res with details down to show how bad AMD is currently. PEOPLE, listen to me now. Buy AMD cpus only if you're REALLY taxed in the wallet and can't afford INTEL! I love AMD, but if you value your gaming fun (meaning above 30fps) and have a decent card, for the love of god, BUY INTEL. This was a test with a SINGLE 7970ghz. AMD is light years away from taxing their own top end gpus. But Intel isn't. The bottom to top in this article at toms was 17fps to 41fps. THAT IS HUGE! And they didn't even show top i7's. It would likely go up into the 50's or 60's then.

    Anandtech (not really blaming Ian himself here) is steering people into stupid decisions and hiding AMD's weaknesses in cpu's here, and in FCAT/gpu's with Ryan. I can't believe I'm saying this, but Tomshardware is actually becoming better than anandtech...LOL. WOW, I said that out loud. I never thought that would happen. It's 4:50am so I'm not going to grammar/spellcheck the nazi's can have fun if desired. :) Too bad I got to this article a week late.

    http://techreport.com/review/23246/inside-the-seco...
    THE REAL CPU ARTICLE YOU SHOULD READ. Note the separation from top to bottom in skyrim here is 58fps for AMD up to 108fps for Intel...See my point? Leave it to Scott Wasson (the guy who broke out the need for FCAT! along with Ryan Shrout I guess at pcper) to write the REAL article on why you don't want a slow cpu for ANY game. This is what happens at 1080P people! Note the FX8350 and 1100T are nowhere NEAR Intel in this review in ANY game tested. The phenom ii x4 980 is slow as molasses also! Note also Scott discusses frametimes which show AMD sucks. Welcome to stutter that isn't just because of the gpu...LOL.
    " All of them remain slower than the Intel chips from two generations back, though. "

    And this one sums it up best on the conclusion at techreport's article:
    "We don't like pointing out AMD's struggles any more than many of you like reading about them. It's worth reiterating here that the FX processors aren't hopeless for gaming—they just perform similarly to mid-range Intel processors from two generations ago. If you want competence, they may suffice, but if you desire glassy smooth frame delivery, you'd best look elsewhere. Our sense is that AMD desperately needs to improve its per-thread performance—through IPC gains, higher clock speeds, or both—before they'll have a truly desirable CPU to offer PC gamers. "

    Only anandtech has AMD rose colored glasses people. READ ELSEWHERE for real reporting. So AMD doesn't even offer a desirable cpu for gamers...LOL. Sad but true. Toms shows it, techreport shows it and if I had more time people, I really rip these guys apart at anandtech by posting a few more cpu tell-alls. This site keeps putting up stuff that HIDES AMD's deficiencies. I'd like to buy an AMD cpu this round, but I'd be an idiot if I did as a gamer. I7-4770k for me sir. Spend whatever you can on a haswell based system (it supposedly takes broadwell later) and wait for 20nm gpus until xmas/q1 where the real gain will come (even low end should get a huge bump). Haswell comes next month, you can wait for the FUTUREproof (if there is such a thing) socket one more month. Trust me. You'll be happy :)

    I'd put more links, but this site will see too many and call me a spammer...UGH.
  • colonelclaw - Monday, May 13, 2013 - link

    You lost me at '...same crap Ryan...'

    Never a great idea to preface a wall of text with an insult.
  • TheJian - Tuesday, May 14, 2013 - link

    Well they have previously done worse to me :) I presented data in the 660ti article, called out their obvious lies even with their own data (LOTs of Ryan's own benchmarks were used to show the lies), which prompted Jarred to call me a Ahole and said my opinion was UNINFORMED ;). Ryan was claiming his article wasn't for above 1920x1080 (or 1200) but he was pitching me $600 Korean monitors (same ones mentioned here) you had to buy from EBAY and give you Visa to a nobody in Korea. Seriously? It could not even be bought on amazon from anyone with more than a SINGLE review, which I pointed out was probably the guy reviewing himself :) He had no about page on his site, no support etc, not even a phone#, just an email if memory serves. It was laughable. After taking Ryan down, Jarred attacked ME not the data.

    What do you expect a person to do after that?

    They've been snowing people for a long time with articles like this.

    Where is FCAT article part2? Where is the FCAT results from 7990? We are still waiting for both and will continue as I keep saying until AMD fixes their junk drivers and I guess gives a green-light for Ryan to finally write about FCAT for REAL. This is a pro AMD site (used to be more neutral!), I really didn't write it hoping to get love from the viewers. I just wanted the data correctly presented which other sites did with aplomb. You don't have to like me, or the data, just realize it makes sense as shown in the post via links to others saying it. NOT me.

    People who stopped at "same crap ryan" were not my intended audience ;) I can hate a person (well I never do) and still value the data in a great argument made by said person. I don't care about them as long as it makes sense. The person means nothing. As I said above I don't blame IAN really, he's just doing what he's told. I even admired the work he put in it. I just wish that work could have been dedicated to data actually useful to 98% of us instead of nobody while hiding AMD's weaknesses. AMD is not a cpu I could recommend currently at all for anything unless you are totally strapped for cash. Even then, I'd say save for another month or something and come home with Intel. I'm not really an Intel fan either...LOL. I was selling AMD back when Asus was leaving their name off their boards (fear of Intel) and putting their motherboards in WHITE boxes! Intel should have had to pay AMD 15B (they made 60+B over the years screwing AMD like this). They had the best cpu's and Intel managed to stall with nasty tactics until they caught them. I love some of Intel's chips but generally hate the company. But I'd consider myself a D-Bag for not telling people the truth and shafting their computer purchase doing it. If I personally want to help AMD and buy a chip I think is way behind, great - I've done my part (I won't just saying). But I wouldn't tell my friends to do it claiming AMD is great right now. That's not a friend. Anandtech's readers are in a sense their friends (we keep reading or they go out of business right?). Hiding things on a massive scale here is not what friends do for friends is it?

    I didn't expect any favorable comments from consoles lovers either :)
  • OwnedKThxBye - Tuesday, May 14, 2013 - link

    We might all hate this guy (for good reason) but the words he writes regarding CPU performance in this article have a lot of truth.
  • yhselp - Tuesday, May 14, 2013 - link

    Agreed. What he wrote is offensive, emotional and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario.

    Here are a few suggestions. Since most users that would spend $500 on a flagship video card and $600-$800 on a 1440p monitor and God knows how much more on the rest of the system, aren’t likely to skimp on CPU choice to save a hundred bucks, a different testing scenario might produce more useful information for the masses (regarding cheap/er CPUs for gaming).

    A more likely market for an AMD CPU in a gaming rig would be people on a tight budget – when every buck matters and the emphasis is on getting as fast a GPU as possible. In my opinion, it’d be quite useful to test various AMD CPUs which are cheaper than an Intel quad-core; paired with a 650 Ti Boost and/or 600 and/or similarly-priced AMD video card at 1080p. Of course, this would raise yet another question – are Intel dual-cores faster than similarly-priced AMD quad-cores in this mid-range gaming scenario?

    Suggestions for other CPUs:
    Core i5-3350P – baseline Intel quad-core performance (cheapest Intel quad-core on the market)
    Pentium G2120 – should perform similarly to an i3 for gaming (costs less)
    Celeron G1610 – cheapest Intel CPU
