Sleeping Dogs

Sleeping Dogs is a strenuous game with a pretty hardcore benchmark that scales well with additional GPU power when SSAO is enabled.  The team at Adrenaline.com.br deserves credit for making an easy-to-use benchmark GUI, allowing a numpty like me to charge ahead with a set of four 1440p runs at maximum graphical settings.
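
To give an idea of how a set of runs turns into a single reported figure, here is a minimal sketch of averaging multiple benchmark passes; the per-run FPS numbers are placeholders for illustration, not measured results, and the benchmark GUI itself is not involved:

# Minimal sketch: collapsing four benchmark runs into one reported figure.
# The per-run FPS values below are placeholders, not measured data.
from statistics import mean, pstdev

run_fps = [73.8, 74.1, 73.5, 74.0]  # illustrative average FPS per run

print(f"Mean FPS over {len(run_fps)} runs: {mean(run_fps):.1f}")
print(f"Run-to-run spread (std dev): {pstdev(run_fps):.2f} FPS")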

One 7970

Sleeping Dogs - One 7970, 1440p, Max Settings

With one AMD GPU, Sleeping Dogs performance is similar across the board regardless of the CPU.

Two 7970s

Sleeping Dogs - Two 7970s, 1440p, Max Settings

On dual AMD GPUs there seems to be a slight kink for the configurations running x16+x4 lane allocations, although this is a minor difference.

Three 7970s

Sleeping Dogs - Three 7970s, 1440p, Max Settings

Between an i7-920 and an i5-4430 we get a 7 FPS difference, almost 10%, showing the change over CPU generations.  In fact, at this level anything above the i7-920 gives 70+ FPS, but the hex-core Ivy Bridge-E takes the top spot at ~81 FPS.
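
As a quick sanity check on that percentage, here is the arithmetic; the i7-920 baseline is an assumption read off the chart, not an exact quoted figure:

# Rough arithmetic behind the "almost 10%" claim.
# The i7-920 baseline is an assumed approximation, not a quoted result.
i7_920_fps = 71.0                # assumed i7-920 average FPS
i5_4430_fps = i7_920_fps + 7.0   # the text cites a 7 FPS gap

gain_pct = (i5_4430_fps - i7_920_fps) / i7_920_fps * 100
print(f"i5-4430 over i7-920: {gain_pct:.1f}% faster")  # ~9.9%, i.e. almost 10%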

One 580

Sleeping Dogs - One 580, 1440p, Max Settings

There is just 0.4 FPS between a Core 2 Duo and Haswell.  For a single NVIDIA GPU, the CPU does not seem to matter(!)

Two 580s

Sleeping Dogs - Two 580s, 1440p, Max Settings

It is a similar story with dual NVIDIA GPUs, with less than ~3% separating the top and bottom results.
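
That spread works out the same way; a small sketch with placeholder numbers rather than the actual chart data:

# Top-to-bottom spread across CPU results; the FPS values are placeholders.
results_fps = [55.2, 55.6, 56.1, 56.4, 56.7]  # illustrative per-CPU averages
spread_pct = (max(results_fps) - min(results_fps)) / max(results_fps) * 100
print(f"Top-to-bottom spread: {spread_pct:.1f}%")  # ~2.6%, under the ~3% mark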

Sleeping Dogs Conclusion

While the NVIDIA results did not change much between different CPUs, any modern processor seems to hit the high notes when it comes to multi-GPU Sleeping Dogs.

Comments

  • BrightCandle - Thursday, October 3, 2013 - link

    So again we see tests with games that are known not to scale with more CPU cores. There are games that show clear benefits, however; your site simply doesn't test them. It's not universally true that more cores or HT make a difference, but maybe it would be a good idea to focus on the games we know do benefit, like Metro: Last Light, Hitman: Absolution, Medal of Honor: Warfighter and some areas of Crysis 3.

    The problem here is that it's precisely those games that support more multithreading, so to give a true impression you need to test a much wider and more modern set of games. To do otherwise is pretty misleading.
  • althaz - Thursday, October 3, 2013 - link

    To test only those games would be more misleading, as the vast majority of games are barely multithreaded at all.
  • erple2 - Thursday, October 3, 2013 - link

    Honestly, if the stats for single GPUs weren't all at about the same level, this would be an issue. It isn't until you get to multiple GPUs that you start to see some differentiation, but that level becomes very expensive very quickly. I'd posit that if you're already into multiple high-end video cards, the price difference between dual and quad core is relatively insignificant anyway, so the point is moot.
  • Pheesh - Thursday, October 3, 2013 - link

    Appreciate the review, but it seems like the choice of games and settings makes the results primarily reflect a GPU-constrained situation (1440p max settings for a CPU test?). It would be nice to see some of the newer engines which utilize more cores, as most people will be buying a CPU for future titles. I'm personally more interested in the delta between the CPUs in CPU-bound situations. Early benchmarks of next-gen engines have shown larger differences between 8 threads and 4 threads.
  • cbrownx88 - Friday, October 4, 2013 - link

    amen
  • TheJian - Sunday, October 6, 2013 - link

    Precisely. Also, only 2% of us even own 1440p monitors, and I'm guessing the super small % of us in a terrible economy that have, say, $550 to blow on a PC (the price of the FIRST 1440p monitor model you'd actually recognize the name of on Newegg – an Asus model with 122 reviews, and the only one with more than 12 reviews) would buy anything BUT a monitor that would probably require 2 vid cards to fully utilize anyway. Raise your hand if you're planning on buying a $550 monitor instead of, say, a near top-end Maxwell next year? I see no hands. Since 98% of us are on 1920x1200 or LESS (and more to the point a good 60% are less than 1920x1080), I'm guessing we are all planning on buying either a top vid card, or, if you're in the 60% or so that have UNDER 1080p, a $100-200 monitor (a 1080p upgrade at 22in-24in) and a $350-450 vid card to max out your game play.

    Translation: these results affect less than 2% of us and are pointless for another few years at the very least. I'm planning on buying a 1440p monitor, but LONG after I get my Maxwell. The vid card improves almost everything I'll do in games; the monitor only works well if I have the VID CARD muscle ALREADY. Most of the super small 2% running 1440p or up have two vid cards to push their monitors (whatever they have). I don't want to buy a monitor and say "oh crap, all my games got super slow" for the next few years (1440p for me is a purchase once a name brand is $400 at 27in – only $150 away…LOL). I refuse to run anything but native and won't turn stuff off. I don't see the point in buying a beautiful monitor if I have to turn it into crap to get higher fps anyway... :)

    Who is this article for? Start writing articles for 98% of your readers, not 2%. Also, you'll find the CPUs are far more important where that 98% is running, as fewer games are GPU bound. I find it almost stupid to recommend AMD these days for CPUs, and that stupidity grows even more as vid cards get faster. So basically, if you are running 1080p and plan to for a while, look at the CPU separation on the triple cards and consider that what you'll see as CPU results. If you want a good indication of what I mean, see the first or second 1440p article here and CTRL-F my nick. I previously listed all the games that part like the Red Sea, leaving AMD CPUs in the dust (it's quite a bit longer than Civ 5...ROFL). I gave links to the benchmarks showing all those games.
    http://www.anandtech.com/comments/6985/choosing-a-...
    There's the comments section on the 2nd 1440p article for the lazy people :)

    Note that even here in this article, TWO of the games aren't playable on single cards...LOL. 34 FPS avg in Metro 2033 means you'll be hitting low 20s or worse MINIMUM. Sleeping Dogs is already under 30 FPS AVG, so not even playable at avg fps, let alone the MIN fps you will hit (again, TEENS probably). So if you buy that fancy new $550+ monitor (because only a fool or a gambler buys a $350 Korean job from eBay etc...LOL), get used to slide shows and stutter gaming even on SINGLE 7970s in a lot of games, never mind everything below sucking even more. Raise your hand if you have money for a $550 monitor AND a second vid card...ROFL. And these imaginary people this article is for apparently should buy a $110 CPU from AMD to pair with this setup...ROFLMAO.

    REALISTIC Recommendations:
    Buy a GPU first.
    Buy a great CPU second (and don't bother with AMD unless you're absolutely broke).
    Buy that 1440p monitor if your single card is above a 7970 already, or you're planning shortly on Maxwell or some such card. As we move to the Unreal 4 engine, CryEngine 3.5 (or CryEngine 4th gen…whatever) etc. next year, get ready to feel the pain of that 1440p monitor even more if you're not above a 7970. So again, this article should be considered largely irrelevant for most people unless you can fork over for the top-end cards AND the monitor they test here. On top of this, as soon as you tell me you have the cash for both of those, what the heck are you doing talking about $100 AMD CPUs?...LOL.

    And for AMD GPU lovers, like the whole AnandTech team it seems...where's the NV portal site? (I love the GPUs, just not their drivers):
    http://hothardware.com/News/Origin-PC-Ditching-AMD...
    Origin AND Valve have abandoned AMD even in the face of new GPUs. Origin spells it right out, exactly as we already know:
    "Wasielewski offered a further clarifying statement from Alvaro Masis, one of Origin’s technical support managers, who said, “Primarily the overall issues have been stability of the cards, overheating, performance, scaling, and the amount of time to receive new drivers on both desktop and mobile GPUs.”

    http://www.pcworld.com/article/2052184/whats-behin...
    More data: nearly twice the failure rate at another vendor, confirming why the first probably dropped AMD. I'd call a 5% rate bad, never mind AMD's 1-year rate of nearly 8% failure (and nearly 9% over 3 years). Cutting RMAs nearly in half certainly saves a company some money, never mind all the driver issues AMD still has and has had for 2 years. A person adds a monitor and calls tech support about AMD Eyefinity, right? If they add a GPU they call about CrossFire next?...ROFL. I hope AMD starts putting more effort into drivers, or the hardware sucks no matter how good the silicon is. As a boutique vendor at the high end, surely the CrossFire and multi-monitor situation affects them more than most who don't even really ship high-end stuff (read: overly expensive...heh).

    Note that one of the games here performs worse with 3 cards than with 2, so I guess even AnandTech accidentally shows AMD's drivers still suck for triples. 12 CPUs in Civ 5 post above 107 FPS with two 7970s, but only 5 can do over 107 FPS with 3 cards...talk about going backwards. These tests, while wasted on 98% of us, should at the very least have been done with the GPU maker who has properly functioning drivers WITH 2 or 3 CARDS :)
  • CrispySilicon - Thursday, October 3, 2013 - link

    What gives Anand?

    I may be a little biased here since I'm still rocking a Q6600 (albeit fairly OC'd). But with all the other high end platforms you used, why not use a DDR3 X48/P45 for S775?

    I say this because NOBODY who reads this article would still be running a mobo that old with PCIe 1.1, especially in a multi-GPU configuration.
  • dishayu - Friday, October 4, 2013 - link

    I share your opinion on the matter, although I myself am still running a Q6600 on an MSI P965 Platinum with an AMD HD 6670. :P
  • ThomasS31 - Thursday, October 3, 2013 - link

    Please also add Battlefield 4 to the game tests in the next update(s)/2014.

    I think it will be very relevant, based on the beta experience.
  • tackle70 - Thursday, October 3, 2013 - link

    I know you dealt with this criticism in the intro, and I understand the reasoning (consistency, repeatability, etc) but I'm going to criticize anyways...

    These CPU results are to me fairly insignificant and not worth the many hours of testing, given that the majority of cases where CPU muscle is important are multiplayer (BF3/Crysis 3/BF4/etc). As you can see even from your benchmark data, these single player scenarios just don't really care about CPU all that much - even in multi-GPU. That's COMPLETELY different in the multiplayer games above.

    Pretty much the only single player game I'm aware of that will eat up CPU power is Crysis 3. That game should at least be added to this test suite, in my opinion. I know it has no built in benchmark, but it would at least serve as a point of contact between the world of single player CPU-agnostic GPU-bound tests like these and the world of CPU-hungry multiplayer gaming.
