Sleeping Dogs

Sleeping Dogs is a strenuous game with a pretty hardcore benchmark that scales well with additional GPU power when SSAO is enabled. The team at Adrenaline.com.br is supreme for making an easy-to-use benchmark GUI, allowing a numpty like me to charge ahead with a set of four 1440p runs at maximum graphical settings.
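
Since each quoted result comes from multiple passes, here is a minimal sketch of how a set of runs might be collapsed into a single average per configuration. It is purely illustrative: the helper name and the FPS values are hypothetical and not taken from this article's data, and it assumes the benchmark tool reports one average FPS figure per pass.

```python
# Minimal sketch: averaging several benchmark passes into one figure.
# Assumption: the benchmark reports an average FPS per pass; the values
# below are hypothetical, not the article's actual results.
from statistics import mean, pstdev

def summarise_runs(run_fps):
    """Collapse a list of per-pass average FPS values into a single result."""
    return {
        "runs": len(run_fps),
        "avg_fps": round(mean(run_fps), 1),
        "spread_fps": round(pstdev(run_fps), 1),  # run-to-run variation
    }

# Example: four hypothetical 1440p passes at maximum settings
print(summarise_runs([71.2, 70.8, 71.5, 70.9]))
```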

One 7970

Sleeping Dogs - One 7970, 1440p, Max Settings

With one AMD GPU, Sleeping Dogs performance is similar across the board, no matter the CPU.

Two 7970s

Sleeping Dogs - Two 7970s, 1440p, Max Settings

On dual AMD GPUs, there seems to be a little kink with those running x16+x4 lane allocations, although this is a minor difference.

Three 7970s

Sleeping Dogs - Three 7970s, 1440p, Max Settings

Between an i7-920 and an i5-4430 we get a 7 FPS difference, almost 10%, showing the change over CPU generations. In fact, at this level, anything above that i7-920 gives 70 FPS+, but the hex-core Ivy-E takes the top spot at ~81 FPS.
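
As a rough sanity check on that percentage (hedged: the exact chart values are not quoted in the text, only that the i7-920 sits around the 70 FPS mark):

```latex
% Rough check; ~70 FPS for the i7-920 is an assumption consistent with the text above.
\[
\frac{\Delta_{\text{FPS}}}{\text{FPS}_{\text{i7-920}}} \approx \frac{7}{70} = 10\%
\]
```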

One 580

Sleeping Dogs - One 580, 1440p, Max Settings

There is only 0.4 FPS between a Core2Duo and Haswell. For a single NVIDIA GPU, the CPU does not seem to matter(!)

Two 580s

Sleeping Dogs - Two 580s, 1440p, Max Settings

The story is similar with dual NVIDIA GPUs, with less than ~3% separating the top and bottom results.

Sleeping Dogs Conclusion

While the NVIDIA results barely changed between CPUs, the AMD results show that any modern processor hits the high notes when it comes to multi-GPU Sleeping Dogs.

Comments

  • warezme - Thursday, October 3, 2013

    I too invested in the venerable (speak only in awed, hushed whispers) i7-920, which I promptly overclocked to 3.6GHz. This little jewel has been going strong for, goodness, almost half a decade, stable as a rock, and I notice it holding its own very well even up against the latest and greatest. This is a testament to competition and engineering, back when competition in the CPU arena existed. I have long since switched from dual GPUs to single but dual-GPU cards on a single fat x16 PCIe slot, even though my EVGA X58 SLI board supports more. I'll ride the wave one more year and see what new gear crashes in next year. Hopefully a new NVIDIA architecture will inspire me to upgrade everything.
  • Hrel - Thursday, October 3, 2013

    "our next update will focus solely on the AMD midrange."

    Please don't do that. PLEASE include at least 3 Intel CPUs for comparison. It doesn't matter if the FX-8320 does well in benchmarks if for another $40 I can get an i5-4670 that runs 50% faster. These are hypothetical numbers, obviously, but the point stands that Intel will be faster; by how much matters, especially once you factor in price and energy draw.
  • A5 - Thursday, October 3, 2013

    The old numbers will still be there for comparison. The next update is just *adding* more AMD data.
  • just4U - Thursday, October 3, 2013

    It's hard making sense of AMD data in comparison to Intel. As near as I can tell they're sitting at just beyond i7-920 performance these days, but with all the new features. It gets confusing when you look at the older X4/X6 stuff though, since some of that is actually faster... yet somehow only compares favorably to Intel's 9x-series Core2 stuff.
  • just4U - Thursday, October 3, 2013

    Why 3? The entry-level i5-4430 beats out every AMD chip on the market in most instances. Adding in more simply confuses people and adds more fodder for fanboys to fight over... and I think it taxes the patience of most of us who already know what's what in the CPU arena.

    Simple rule of thumb: if you're on a budget you may want to go AMD to get all the "other" bells and whistles you're looking to buy, or... if you have more to spend, your starting point will be the i5-4430.
  • just4U - Thursday, October 3, 2013

    Excellent article Ian, I really like the inclusion of older CPUs. It's a good basis on which to decide if it's "time" to upgrade on that front. Most of the people I know are not on the bleeding edge of technology; many are sitting back in 2009 with minor updates to video cards and hard drives. Anyway... well done, lots to sift through.
  • Jackie60 - Thursday, October 3, 2013

    At last AnandTech is doing some meaningful second-decade-of-the-21st-century testing. Well done and keep it up ffs!
  • SolMiester - Thursday, October 3, 2013

    Can someone please tell me why we are using 2+ yr old GPUs?
  • A5 - Thursday, October 3, 2013

    You could read the article.
  • OrphanageExplosion - Thursday, October 3, 2013

    Amazing data. I do wonder whether the testing at max settings is a good idea though. The variation in performance can be extreme. Just watch the Metro 2033 benchmark play out. Does that look like the kind of experience you'd want to play?

    Perhaps more importantly though, the arrival of the next-gen consoles changes everything.

    Did you see the news that Watch Dogs is x64 only? That's just the tip of the iceberg. Developers need to go wide to make the most of the six available Jaguar cores. Jobs-based scheduling over up to eight cores will become the norm rather than the exception. The gap between i5 and i7 will widen. AMD FX will suddenly become a lot more interesting.

    In short order, I'd expect dual-core CPUs and weaker quads to start looking much less capable very quickly, and i5 vs. i7 will show a much larger gulf in performance.

    Check out the CPU data here for the Battlefield 4 beta:

    http://gamegpu.ru/action-/-fps-/-tps/battlefield-4...

    The dual cores are being maxed out, and the FX-8350 is up there with the 3930K (!)
