Sleeping Dogs

Sleeping Dogs is a strenuous game with a pretty hardcore benchmark that scales well with additional GPU power when SSAO is enabled.  The team at Adrenaline.com.br deserves kudos for making an easy-to-use benchmark GUI, allowing a numpty like me to charge ahead with a set of four 1440p runs at maximum graphical settings.

One 7970

Sleeping Dogs - One 7970, 1440p, Max Settings

With one AMD GPU, Sleeping Dogs performs similarly across the board.

Two 7970s

Sleeping Dogs - Two 7970s, 1440p, Max Settings

On dual AMD GPUs there seems to be a slight kink for setups running x16+x4 lane allocations, although the difference is minor.

Three 7970s

Sleeping Dogs - Three 7970s, 1440p, Max Settings

Between an i7-920 and an i5-4430 we see a 7 FPS difference, almost 10%, showing the improvement across CPU generations.  In fact, at this level anything above the i7-920 delivers 70+ FPS, but the hex-core Ivy Bridge-E takes the top spot at ~81 FPS.

One 580

Sleeping Dogs - One 580, 1440p, Max Settings

There is just 0.4 FPS between a Core 2 Duo and Haswell.  For one NVIDIA GPU, the CPU does not seem to matter(!)

Two 580s

Sleeping Dogs - Two 580s, 1440p, Max Settings

The story is similar with dual NVIDIA GPUs, with less than ~3% separating the top and bottom results.

Sleeping Dogs Conclusion

The NVIDIA results did not change much between CPUs, and on the AMD side any modern processor seems to hit the high notes when it comes to multi-GPU Sleeping Dogs.

137 Comments

  • tackle70 - Thursday, October 3, 2013 - link

    The 8350 is with the 2600k, not the 3930k...

    So yeah, it's a very good showing for AMD, but not as good as what you indicate. Also, according to sweclockers, an overclocked i5 is still superior to an overclocked 83xx CPU, so make of that what you wish.

    I'm just glad we're seeing games starting to use more than 2-4 threads effectively.
  • Traciatim - Thursday, October 3, 2013 - link

    Much more likely is that games will just become less and less reliant on CPU power because of the terrible netbook processors in the consoles and will instead rely more and more on the GPU. The PC versions of games will just be the same game with a high res texture pack and some extra graphics bling to use up GPU cycles while your processor sits around shuffling a little data.
  • Flunk - Friday, October 4, 2013 - link

    I'm not sure AMD will benefit that much. As soon as consumer CPUs have a reason to have more cores, they'll just release a new chip with more cores. There is absolutely no reason they can't release an 8- or even 12-core desktop processor; they're already selling them for servers.
  • Flunk - Friday, October 4, 2013 - link

    Forgot to mention, Watch Dogs is probably x64 only because they want to use more than 2GB of RAM (which is the limit for the user-mode memory partition in Win32).
  • Nirvanaosc - Thursday, October 3, 2013 - link

    Looking just at the gaming results, does this mean that almost any CPU is capable of feeding the GPU at 1440p and it is always GPU-limited?
  • Nirvanaosc - Thursday, October 3, 2013 - link

    I mean in single GPU config.
  • Traciatim - Thursday, October 3, 2013 - link

    That's pretty much just the games they picked. If you could reliably benchmark large-scale PC games like Planetside 2, or other popular large-scale MMOs, you'd pretty much see the exact opposite. The trouble is that no MMO makers seem to give you reliable benchmarking tools, so you can't use them for tests like these.
  • ryccoh - Thursday, October 3, 2013 - link

    I would really like to see a CPU comparison for strategy games.
    For example, one could take a save game from a far-advanced game of Civilization 5 or Total War, with many AI players on the largest map, and then see how the waiting time between turns varies across the different CPUs. This should be feasible, shouldn't it?
    I'm running an i5 2500k @ 4.6GHz and it just isn't cutting it for Civilization 5 on a large map once you're far into the game; it would be nice to see whether hyperthreading and more cores would be worth it.
  • glugglug - Thursday, October 3, 2013 - link

    Having waited the ridiculous amounts of time between turns on Civ V, and having dual monitors, I put task manager up on the second monitor while it was running, to see that Civ V *IS NOT MULTITHREADED. AT ALL*. Setting the CPU affinity to make it use only 1 logical core makes absolutely no performance difference at all! The only thing I can think of for why a better result would be seen on quad-core systems would be that it likes having a larger L3 cache.
  • glugglug - Thursday, October 3, 2013 - link

    P.S. If my "Civ V just likes cache" theory is right, an Iris Pro laptop should be the ultimate Civ V machine.
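The affinity experiment glugglug describes can also be reproduced programmatically rather than through Task Manager. A minimal sketch, assuming a Linux host (where Python's standard `os.sched_setaffinity` is available; on Windows the equivalent is Task Manager's "Set affinity" dialog or the `SetProcessAffinityMask` API), pinning the current process to one logical core and timing a stand-in workload before and after:

```python
import os
import time

def pin_to_one_core():
    """Restrict the current process to a single logical core,
    mirroring the Task Manager affinity test described above."""
    all_cores = os.sched_getaffinity(0)      # cores we may currently run on
    os.sched_setaffinity(0, {min(all_cores)})  # keep only the lowest-numbered core
    return os.sched_getaffinity(0)

def busy_work(n=2_000_000):
    """Hypothetical stand-in for a single-threaded game turn: if the
    workload is single-threaded, pinning to one core should not
    meaningfully change how long it takes."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    start = time.perf_counter()
    busy_work()
    unpinned = time.perf_counter() - start

    print("affinity now:", pin_to_one_core())

    start = time.perf_counter()
    busy_work()
    pinned = time.perf_counter() - start
    print(f"unpinned: {unpinned:.3f}s, pinned: {pinned:.3f}s")
```

If the two timings are roughly equal, the workload was not using more than one core to begin with, which is exactly the observation the comment makes about Civ V.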
