Deus Ex: Mankind Divided

The latest entry in the Deus Ex series, Mankind Divided is a multi-genre, graphically demanding DX11 title that received DX12 support a month after launch. We ran the game's built-in benchmark at the Ultra quality preset without alterations.

After testing the Vega cards under DX12, we noticed a consistent performance uplift, and switched over to that API for the RX Vega 64 and 56. Unfortunately, the GTX 1070 Ti Founders Edition saw no such benefit, so it was tested under DX11.
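
In practice, this per-card API split is easy to encode in a test harness. A minimal sketch of the idea, where run-loop, card names, and settings strings are purely illustrative and not part of any real tool:

```python
# Hypothetical sketch of the per-card API selection described above.
# The print loop is a stand-in for whatever drives the game's built-in
# benchmark; none of these names belong to a real tool.

CARDS = ["RX Vega 64", "RX Vega 56", "GTX 1070 Ti FE"]
RESOLUTIONS = ["3840x2160", "2560x1440", "1920x1080"]

def pick_api(card: str) -> str:
    # DX12 only produced a consistent uplift on the Vega cards,
    # so everything else stays on DX11.
    return "DX12" if "Vega" in card else "DX11"

for card in CARDS:
    for resolution in RESOLUTIONS:
        print(f"Run: {card} @ {resolution} under {pick_api(card)} (Ultra)")
```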

[Benchmark charts: Deus Ex: Mankind Divided - Ultra Quality at 3840x2160, 2560x1440, and 1920x1080]

With the DX12 performance boost on RX Vega hardware, both Vega cards actually pull ahead of the GTX 1070 Ti at the lower resolutions.

Comments

  • BrokenCrayons - Thursday, November 2, 2017 - link

    This review was a really good read. I also like that the game screenshots were dropped from it, since they didn't exactly add much but did eat a little of my data plan when I'm reading from a mobile device.

    As for the 1070 Ti, agreed, it's priced a bit too high. However, I think most of the current-gen GPUs are pushing the price envelope right now. Except maybe the 1030, of course, which has a reasonable MSRP and doesn't require a dual-slot cooler. That's really the only graphics card outside of an iGPU I'd seriously consider if I were in the market at the moment, but then again I'm not playing a lot of games on a PC because I have a console and a phone for that sort of thing.
  • Communism - Thursday, November 2, 2017 - link

    Literally the same price as a 1070 non-Ti was a week ago.

    Those cards sold so well that retailers are still gouging them to this day.
  • Communism - Thursday, November 2, 2017 - link

    And I should mention that the only reason retail prices of the 1070, Vega 56, and Vega 64 went down is the launch of the 1070 Ti.
  • timecop1818 - Thursday, November 2, 2017 - link

    Still got that fuckin' DVI shit in 2017.
  • DanNeely - Thursday, November 2, 2017 - link

    Lack of a good way to run dual-link DVI displays via HDMI/DP is probably keeping it around longer than originally intended. This includes both relatively old 2560x1600 displays that predate DP or HDMI 1.4 and thus could only do DL-DVI, and cheap 'Korean' 2560x1440 monitors from 2 or 3 years ago.

    The basic HDMI/DP-DVI adapters are single link and max out at 1920x1200. A few claim 2560x1600 by overclocking the data rate by 100% to stuff it down a single link's worth of wires; this is mostly useless though, since other than HDMI 1.4-capable displays (which don't need this), virtually no DVI monitors can actually take a signal that fast. Active DP-DLDVI adapters can theoretically do it for $70-100, but they all came out buggy to one degree or another, and sales were apparently too low to justify a new generation of hardware that fixed the issues. (See the pixel-clock arithmetic after the comments.)
  • Nate Oh - Saturday, November 4, 2017 - link

    This is actually precisely why I don't mind DVI too much: I have and still use a Korean 1440p A- monitor from 3 years ago whose only input is DVI, overclocked to 96Hz. DVI probably needs to go away at some point soon, but maybe not too soon :)
  • ddferrari - Friday, November 3, 2017 - link

    So, that DVI port really ruins everything for ya? What are you, 14??

    There are tons of overclockable 1440p Korean monitors out there that only have one input: DVI. Adding a DVI port doesn't increase cost, slow down performance, or raise heat levels, so what's your imaginary problem again?
  • Notmyusualid - Sunday, November 5, 2017 - link

    @ ddferrari

    What are you - ddriver incarnate?

    I'm older than 14, and I wish the DVI port wasn't there, as it is work to strip them off when I make my GPUs into single-slot water-cooled versions. Removing / modifying the bracket is one thing, but pulling out those DVI ports is another.
  • Silma - Thursday, November 2, 2017 - link

    How can you compare the Vega 56 to the GTX 1070 when it's 7 dB noisier and consumes up to 78 watts more?
  • sach1137 - Thursday, November 2, 2017 - link

    Because the MSRPs of both cards are the same. Vega 56 beats the 1070 in almost all games. Yes, it consumes more power and is noisier too, but for some people that doesn't matter: it gives 10-15% more performance than the 1070. When overclocked, you can extract even more from Vega 56. (Rough perf-per-watt numbers below.)
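
DanNeely's link-budget argument above checks out with simple arithmetic: single-link DVI is specced to a 165 MHz pixel clock, dual-link to 330 MHz, and the required clock is roughly h_total x v_total x refresh. A minimal sketch using standard CVT reduced-blanking totals (these are textbook timings, not measurements from this review):

```python
# Quick check of the DVI link limits discussed in the thread. Single-link
# DVI is specced to a 165 MHz pixel clock; dual-link doubles that to 330 MHz.
# Totals below are standard CVT reduced-blanking (CVT-RB) timings.

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0

# mode: (h_total, v_total, refresh_hz)
modes = {
    "1920x1200 @ 60 Hz": (2080, 1235, 60),
    "2560x1440 @ 60 Hz": (2720, 1481, 60),
    "2560x1600 @ 60 Hz": (2720, 1646, 60),
    "2560x1440 @ 96 Hz (OC)": (2720, 1481, 96),
}

for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6  # required pixel clock
    if clock_mhz <= SINGLE_LINK_MHZ:
        link = "fits single link"
    elif clock_mhz <= DUAL_LINK_MHZ:
        link = "needs dual link"
    else:
        link = "beyond the DL-DVI spec"
    print(f"{name}: {clock_mhz:.1f} MHz -> {link}")
```

1920x1200 at 60 Hz (~154 MHz) is the largest common mode that fits in a single link, which is why the passive adapters stop there; and a 2560x1440 panel overclocked to 96 Hz (~387 MHz) runs past even the official dual-link spec, so monitors that accept it are simply tolerating an out-of-spec clock.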

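On the efficiency question in the last two comments: taking Silma's 78 W delta and sach1137's claimed 10-15% uplift at face value, and assuming the GTX 1070's 150 W rated TDP as the baseline, the perf-per-watt gap works out as follows (the uplift range is the commenter's claim, not a measurement):

```python
# Back-of-envelope perf per watt from the numbers in this thread:
# Silma's "up to 78 W more" delta and sach1137's claimed 10-15% uplift,
# against the GTX 1070's 150 W rated TDP as the baseline.

GTX_1070_WATTS = 150.0                   # NVIDIA's rated TDP
VEGA_56_WATTS = GTX_1070_WATTS + 78.0    # using Silma's worst-case delta

power_ratio = VEGA_56_WATTS / GTX_1070_WATTS  # ~1.52x the power draw

for uplift in (0.10, 0.15):              # sach1137's claimed range
    rel_efficiency = (1.0 + uplift) / power_ratio
    print(f"+{uplift:.0%} performance -> Vega 56 at "
          f"{rel_efficiency:.0%} of the GTX 1070's perf per watt")
```

Under those assumptions, Vega 56 lands at roughly 72-76% of the GTX 1070's performance per watt. On the noise side, a 10 dB increase is commonly taken as a perceived doubling of loudness, so a 7 dB gap is clearly audible.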