GPU Tests: Civilization 6 (1080p, 4K)

First up in our CPU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer overflow. Truth be told I never actually played the first version, but every edition from the second to the sixth, including the fifth as voiced by the late Leonard Nimoy, is a game that is easy to pick up but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron – for a turn-based strategy game, the frame rate is not necessarily the important metric, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity in an effort to pull you into the game. As a result, Civilization can be taxing on graphics cards and CPUs as we crank up the details, especially in DirectX 12.

Perhaps a more poignant benchmark would be the late game, when in older versions of Civilization it could take 20 minutes to cycle through the AI players before the human regained control. The new version of Civilization has an integrated ‘AI Benchmark’, although it is not yet part of our benchmark portfolio due to technical reasons we are trying to solve. Instead, we run the graphics test, which provides an example of a mid-game setup at our settings.

At both 1920x1080 and 4K resolutions, we run the same settings. Civilization 6 has sliders for MSAA, Performance Impact and Memory Impact. The latter two refer to detail and texture size respectively, and are rated from 0 (lowest) to 5 (extreme). We run our Civ6 benchmark at position four for performance (ultra) and 0 for memory, with MSAA set to 2x.

For reviews where we include 8K and 16K benchmarks (Civ6 allows us to benchmark extreme resolutions on any monitor) on our GTX 1080, we run the 8K tests with the same settings as the 4K tests, but the 16K tests are set to the lowest Performance option.
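As a rough illustration of how those slider positions translate into a benchmark run, the sketch below encodes the settings in a small Python dictionary. The key names and the 8K/16K resolutions shown (7680x4320 and 15360x8640) are our own shorthand for this page, not Civ6's actual configuration format.

```python
# Illustrative shorthand for the settings used at each resolution tier in
# this review. The keys are our own naming, not the game's config format.
CIV6_BENCH_SETTINGS = {
    "1080p": {"resolution": (1920, 1080),   "msaa": "2x", "performance_impact": 4, "memory_impact": 0},
    "4K":    {"resolution": (3840, 2160),   "msaa": "2x", "performance_impact": 4, "memory_impact": 0},
    "8K":    {"resolution": (7680, 4320),   "msaa": "2x", "performance_impact": 4, "memory_impact": 0},
    # 16K runs drop the Performance Impact slider to its lowest position.
    "16K":   {"resolution": (15360, 8640),  "msaa": "2x", "performance_impact": 0, "memory_impact": 0},
}

if __name__ == "__main__":
    for tier, cfg in CIV6_BENCH_SETTINGS.items():
        w, h = cfg["resolution"]
        print(f"{tier}: {w}x{h}, MSAA {cfg['msaa']}, "
              f"Performance {cfg['performance_impact']}, Memory {cfg['memory_impact']}")
```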

MSI GTX 1080 at 1920x1080

(1080p) GTX 1080: Civilization 6, Average Frame Rate
(1080p) GTX 1080: Civilization 6, 99th Percentile
(1080p) GTX 1080: Civilization 6, Time Under 60 FPS

MSI GTX 1080 at 4K

(4K) GTX 1080: Civilization 6, Average Frame Rate
(4K) GTX 1080: Civilization 6, 99th Percentile
(4K) GTX 1080: Civilization 6, Time Under 60 FPS

ASUS GTX 1060 at 1920x1080

(1080p) GTX 1060: Civilization 6, Average Frame Rate
(1080p) GTX 1060: Civilization 6, 99th Percentile
(1080p) GTX 1060: Civilization 6, Time Under 60 FPS

ASUS GTX 1060 at 4K

(4K) GTX 1060: Civilization 6, Average Frame Rate
(4K) GTX 1060: Civilization 6, 99th Percentile
(4K) GTX 1060: Civilization 6, Time Under 60 FPS

Sapphire R9 Fury at 1920x1080

(1080p) R9 Fury: Civilization 6, Average Frame Rate
(1080p) R9 Fury: Civilization 6, 99th Percentile
(1080p) R9 Fury: Civilization 6, Time Under 60 FPS

Sapphire R9 Fury at 4K

(4K) R9 Fury: Civilization 6, Average Frame Rate
(4K) R9 Fury: Civilization 6, 99th Percentile
(4K) R9 Fury: Civilization 6, Time Under 30 FPS

Sapphire RX 480 at 1920x1080

(1080p) RX 480: Civilization 6, Average Frame Rate
(1080p) RX 480: Civilization 6, 99th Percentile
(1080p) RX 480: Civilization 6, Time Under 60 FPS

Sapphire RX 480 at 4K

(4K) RX 480: Civilization 6, Average Frame Rate
(4K) RX 480: Civilization 6, 99th Percentile
(4K) RX 480: Civilization 6, Time Under 30 FPS
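For readers wondering how the three metrics in these charts relate to one another, the sketch below shows one way to derive an average frame rate, a 99th percentile frame rate, and time spent under a frame-rate threshold from a list of per-frame render times. The function and the sample frame times are illustrative assumptions, not our actual capture tooling.

```python
# Illustrative sketch: deriving the three chart metrics from a list of
# per-frame render times in milliseconds.

def summarize(frame_times_ms, threshold_fps=60):
    frame_times_ms = sorted(frame_times_ms)
    total_s = sum(frame_times_ms) / 1000.0

    # Average frame rate: total frames divided by total elapsed time.
    avg_fps = len(frame_times_ms) / total_s

    # 99th percentile frame time: the frame time that ~99% of frames come in
    # at or under (nearest-rank style); reported as a frame rate for the chart.
    idx = min(len(frame_times_ms) - 1, int(round(0.99 * (len(frame_times_ms) - 1))))
    p99_fps = 1000.0 / frame_times_ms[idx]

    # Time under threshold: seconds spent on frames slower than the threshold
    # (i.e. frames longer than 16.67 ms for a 60 FPS threshold).
    limit_ms = 1000.0 / threshold_fps
    time_under_s = sum(t for t in frame_times_ms if t > limit_ms) / 1000.0

    return avg_fps, p99_fps, time_under_s


# Example with made-up frame times: mostly 12 ms frames plus a few 25 ms hitches.
avg, p99, under = summarize([12.0] * 980 + [25.0] * 20, threshold_fps=60)
print(f"Average: {avg:.1f} FPS, 99th percentile: {p99:.1f} FPS, "
      f"time under 60 FPS: {under:.2f} s")
```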

Comments

  • SkipPerk - Wednesday, May 3, 2017 - link

    These are low-end CPUs. People use those for gaming and web-surfing. I have a proper Xeon machine at work like a normal person. Not to mention, you reference video software. What tiny percentage of computer users ever own or use video software? That is a tiny industry. It reminds me of the silly YouTube reviews where the reviewer assumes everyone is editing videos, when less than one percent of us will ever do so.

    Most people buying non-Xeon CPUs really will be using basic software (MS Office, WinZip, ...) or games. The only time I have used non-Xeon CPUs for work was when I had software that loved clock speed. Then I got a bunch of six-cores and overclocked them (it was funny to watch the guys at Microcenter as I bought ten $1k CPUs and cheesy AIO water coolers). Otherwise one uses the right tool for the job.
  • AndrewJacksonZA - Tuesday, April 11, 2017 - link

    On the last page, "On The Benchmark Results"
    "Looking at the results, it’s hard to notice the effect that 12 threads has on multithreaded CPU tests."
    Don't you mean that it's NOT hard to notice?
  • Drumsticks - Tuesday, April 11, 2017 - link

    I didn't see the 7600k in gaming benchmarks, was that a mistake/not ready, or is it on purpose?

    Thanks for the review guys! This new benchmark suite looks phenomenal!
  • mmegibb - Tuesday, April 11, 2017 - link

    I was disappointed not to see the i5-7600k in the gaming benchmarks. Perhaps it wouldn't be much different than the i5-7600, but I have sometimes seen a difference. For my next build, it's looking like it's between the 1600x and the 7600k.
  • fanofanand - Tuesday, April 11, 2017 - link

    "Platform wise, the Intel side can offer more features on Z270 over AM4"

    Aside from Optane support, what does Z270 offer that AM4 doesn't?
  • MajGenRelativity - Tuesday, April 11, 2017 - link

    Z270 has more PCIe lanes off the chipset for controllers and such than AM4 does
  • fanofanand - Tuesday, April 11, 2017 - link

    I won't disagree with that, but I'm not sure a few extra PCIe lanes count as a feature. Features are typically something like M.2 support, built-in WiFi, things like that. The extra PCIe lanes allow for MORE connected devices, but are a few extra PCIe lanes really considered a feature anymore? With Optane being worthless for 99.99999% of consumers, I'm just not seeing where Z270 gives more for the extra money.
  • JasonMZW20 - Tuesday, April 11, 2017 - link

    Let's do a rundown:

    Ryzen + X370
    20 (3.0) + 8 (2.0)
    Platform usable total: 28

    Core i7 + Z270
    16 + 14 (all 3.0)
    Platform usable total: 30

    Intel's Z270 spec sheet is a little disingenuous, as yes it does have a maximum of 24 lanes, but 10 are reserved for actual features like SATA and USB 2.0/3.x. 14 can be used by a consumer, giving you a total of 2 NVMe x4 + 1 NVMe x2 leaving x4 for other things like actual PCIe slots. That 3rd NVMe slot may share PCIe lanes with a PCIe add-in slot, if configured that way.

    Ryzen PCIe config (20 lanes): 1x16, 2x8 for graphics and x4 NVMe (or x2 SATA when NVMe is not used)

    Core i7 config (16 lanes): 1x16, 2x8, or 1x8+2x4 for graphics

    They're actually pretty comparable.
  • mat9v - Tuesday, April 11, 2017 - link

    No, not more PCIe lanes: the ones from the chipset are virtual, as they all go to the CPU through the DMI bus, which is equivalent to (at best) 4 lanes of PCIe 3.0. All these chips (Intel and AMD) offer 16 lanes from the CPU for the graphics card, but Zen also offers 4 lanes for NVMe. The chipsets are connected by DMI (on Intel) and 4 lanes of PCIe 3.0 (on AMD), so that is equal; from those DMI lanes Intel offers a virtual 24 lanes of PCIe 3.0 (a laugh and a half), while AMD quite correctly offers 8 lanes of PCIe 2.0 (equivalent to 4 lanes of PCIe 3.0).
  • psychobriggsy - Wednesday, April 12, 2017 - link

    Indeed. If a user is going to need more than that, they're more likely going to be plumping for a HEDT system anyway. AMD's solution is coming in a bit, but that should be able to ramp up the IO significantly.
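Putting rough numbers to the lane equivalence discussed in the comments above, here is a purely illustrative back-of-the-envelope sketch. It uses the commonly cited approximate per-lane throughput figures (~500 MB/s for PCIe 2.0, ~985 MB/s for PCIe 3.0 after encoding overhead); none of these numbers are measurements from this review.

```python
# Back-of-the-envelope comparison of chipset link bandwidth, using commonly
# cited approximate per-lane throughput after encoding overhead.
PCIE2_PER_LANE_MBS = 500    # PCIe 2.0: ~500 MB/s per lane (8b/10b encoding)
PCIE3_PER_LANE_MBS = 985    # PCIe 3.0: ~985 MB/s per lane (128b/130b encoding)

links = {
    "Intel DMI 3.0 uplink (~PCIe 3.0 x4)": 4 * PCIE3_PER_LANE_MBS,
    "AMD AM4 chipset uplink (PCIe 3.0 x4)": 4 * PCIE3_PER_LANE_MBS,
    "X370 downstream general-purpose lanes (PCIe 2.0 x8)": 8 * PCIE2_PER_LANE_MBS,
}

for name, mbs in links.items():
    print(f"{name}: ~{mbs} MB/s")
```

Both platform uplinks land around 3.9 GB/s, which is why the extra chipset lanes on either side ultimately share roughly the same ceiling back to the CPU.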
