Gaming Tests: Deus Ex Mankind Divided

Deus Ex is a widely popular franchise. Although Deus Ex: Mankind Divided (DEMD) was released back in 2016, it is still often held up as a game that taxes the CPU. It uses the Dawn Engine to create a very complex first-person action game with science-fiction based weapons and interfaces. The game combines first-person, stealth, and role-playing elements, and is set in Prague, dealing with themes of transhumanism, conspiracy theories, and a cyberpunk future. The game allows the player to select their own path (stealth, or gun-toting maniac) and offers multiple solutions to its puzzles.

DEMD has an in-game benchmark: an on-rails look around an environment showcasing some of the game's most stunning effects, such as lighting and texturing. Even in 2020, it is still an impressive graphical showcase when everything is turned up to the max. For this title, we are testing the following resolution and quality combinations:

  • 600p Low, 1440p Low, 4K Low, 1080p Max

The benchmark runs for about 90 seconds. We complete as many runs as will fit into 10 minutes for each resolution/setting combination, and then take averages and percentiles.
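
As a rough sketch of that reduction step, the snippet below shows one way to fold several runs' worth of per-frame data into an average FPS figure and a 95th-percentile figure. This is illustrative Python only, not our actual tooling; the frame-time log format and the summarise_runs helper are assumptions made for the example.

import statistics

def summarise_runs(runs_ms):
    # runs_ms: one list per benchmark run, each holding per-frame render
    # times in milliseconds; pool every frame across all runs.
    frames = [t for run in runs_ms for t in run]
    avg_fps = 1000.0 / statistics.mean(frames)
    # 95th-percentile frame time (only the slowest 5% of frames are worse);
    # converted to FPS this gives one common "95th percentile" figure.
    p95_ms = statistics.quantiles(frames, n=100)[94]
    return avg_fps, 1000.0 / p95_ms

# Hypothetical frame-time logs from three runs at one setting combination.
runs = [[8.3, 9.1, 12.5, 8.8], [8.9, 9.4, 11.8, 9.0], [8.5, 10.2, 9.7, 9.3]]
avg, p95 = summarise_runs(runs)
print(f"Average: {avg:.1f} FPS, 95th percentile: {p95:.1f} FPS")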

[Charts: Deus Ex Mankind Divided at Low Resolution / Low Quality, Medium Resolution / Low Quality, High Resolution / Low Quality, and Medium Resolution / Max Quality, reporting Average FPS and 95th Percentile.]

DEMD is often considered a CPU-limited title, and the fact that the 11700K beats the older Intel CPUs at the low resolution, low quality setting confirms this. But as we ramp up the resolution and the quality, the 11700K falls behind ever so slightly in both averages and percentiles.

All of our benchmark results can also be found in our benchmark engine, Bench.

541 Comments

  • zzzxtreme - Sunday, March 7, 2021 - link

    I wish you had tested the Xe graphics
  • Fman4 - Monday, March 8, 2021 - link

    Am I the only one who noticed that the OP plugged 4 RAM sticks into an X570 ITX motherboard?
  • Fman4 - Monday, March 8, 2021 - link

    @Dr. Ian Cutress
  • zodiacfml - Monday, March 8, 2021 - link

    Bored. Just here to say this is unsurprising, though it strongly reminds me of the time when AMD was releasing new, well-designed CPUs but was two process node generations behind Intel. I think AMD was on 32nm and 28nm while Intel was on 22nm and 14nm. Most comments were really harsh on AMD, but I reasoned that it was simply down to Intel's manufacturing superiority.
  • blppt - Monday, March 8, 2021 - link

    Bulldozer and Piledriver are not the examples I would put up for "well designed".
  • GeoffreyA - Tuesday, March 9, 2021 - link

    Still, within that mess, AMD did a pretty good job raising Bulldozer's IPC and cutting down its power each generation. But the foundation being fatally flawed, it was hopeless. I believe it taught them a lot about cutting power and so on, and when they poured that into Zen, we saw the result. Bulldozer was a fantastic training ground, if one looks at it humorously.
  • Oxford Guy - Tuesday, March 9, 2021 - link

    No, AMD did an extremely poor job.

    Firstly, Bulldozer had worse IPC than Phenom. No engineers with brains release a CPU to replace the entire line while giving it worse IPC. The trap of going for high clocks was a lesson shown to the entire industry via Netburst. AMD's engineers knew all about it, yet someone at the company decided to try Netburst 2.0.

    Secondly, AMD was so sloppy and lazy that Piledriver shipped with a performance regression in AVX. It was worse to use AVX than to not use it. How incredibly incompetent can the company have been? It doesn't take a high IQ to understand that one doesn't ship broken AVX.

    AMD then refused to replace Piledriver until Zen came out. It tinkered half-heartedly with APU rubbish and focused on pushing junk like Jaguar.

    While it's true that the extreme failure of AMD (the construction core line) is due, to a large degree, to Intel abusing its monopoly to starve AMD of customers and cash (cash it needed to do R&D), one does not release a new chip with worse IPC and then very shortly after break AVX and refuse to stop feeding that junk to customers for many years. Just tinkering with Phenom would have been better (Phenom 3).

    As for the foundation claim... we have no idea how well the CMT concept could have worked out with competent engineering. Remember, they literally broke AVX in the Piledriver revision that was supposed to fix Bulldozer enough to make it sellable. Operations caching could have been stronger. The L3 cache was almost as slow as main memory. The RAM controller was weak, just like Phenom's. Etc.

    We paid for Intel's monopoly and we're still paying today. Only its monopoly and the lack of adequate competition is enabling the company to be so profitable despite failing so badly. Relying on two companies (or one and a half, when it comes to the R&D money ratio and other factors) to deliver adequate competition doesn't work.

    Google and Microsoft = Google owns the clearnet. Apparently, they have some sort of cooperation agreement which helps to explain why Bing has such a tiny index and such a poor-quality search.

    TSMC and Samsung = Can't meet demand.

    AMD and Nvidia = Nvidia keeps breaking profit records while utterly failing to meet demand. Both companies refuse to stop making their cards attractive for mining and have for a long long time. AMD refused to adequately compete beyond the lower midrange (Polaris forever, or you can buy a 'console'!) for a long time, leaving us to pay through the nose for Nvidia's prices. AMD literally competes against the PC market by pushing the console scam. Consoles are gaming PCs in disguise and they're parasitic in multiple ways, including in terms of wafer allocations. AMD's many many years of refusal to compete with Nvidia beyond the Polaris price point caused so much pent-up demand and now the company can enjoy the artificially high price points from that. It let Nvidia keep raising prices to get consumers used to that. Now that it has finally been forced to improve the 'consoles' beyond the garbage-tier Jaguar CPU it has to offer a bit more value to the PC gaming market. And so, after all these years, we have something decent that one can't buy. I can go on about this so-called competition but why bother. People will go to the most extravagant lengths to excuse the problem of lack of adequate competition — like the person who recently said it's easier to create Google's empire from scratch than it is to make a competitive GPU and sell it as a third GPU company.

    There are plenty of other areas in tech with inadequate competition, too.
  • blppt - Tuesday, March 9, 2021 - link

    "AMD then refused to replace Piledriver until Zen came out. It tinkered half-heartedly with APU rubbish and focused on pushing junk like Jaguar."

    To be fair, AMD had put a LOT of time, money and effort into Bulldozer/Piledriver, and was never a company with bottomless wells of cash that could toss an architecture out immediately. Plus, Zen took a long time to design and finalize. Thankfully, they made literally ALL the right moves in designing it, including hiring the brilliant Jim Keller.

    I think if Zen had been another BD-like failure, that would have been almost the end of AMD in the CPU market (leaving them basically as ATI was). The consoles likely would have gone with Intel or ARM for their next iteration. AMD once again spent tons of money it didn't have as disposable income on designing Zen. Two failures in a row would have been disastrous.

    Heck, the consoles might go with their own custom ARM design for PS6/Xbox(whatever) anyways.
  • GeoffreyA - Wednesday, March 10, 2021 - link

    @blppt: Agreed, that would have been the end of AMD.
  • Oxford Guy - Wednesday, March 10, 2021 - link

    AMD did not put a lot of resources into fixing Bulldozer.

    It shipped Piledriver with broken AVX and never bothered to replace Piledriver on the desktop until Zen.

    Inexcusable. It shipped Steamroller and Excavator in cost-cut mode, cutting cores, cutting clocks, cutting the socket standards, and cutting cache. It used a dense library to save money by keeping the die small and used the inferior 28nm bulk process.

    Pathetic in basically every respect.
