Gaming Tests: Civilization 6

Originally penned by Sid Meier and his team, the Civilization series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter trying to get Gandhi to declare war on you due to an integer underflow. Truth be told, I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, and it is a game that is easy to pick up but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron: for a turn-based strategy game, the frame rate is not necessarily the important thing, and even in the right mood, something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity, trying to pull you into the game. As a result, Civilization can be taxing on graphics and CPUs as we crank up the details, especially in DirectX 12.

For this benchmark, we are using the following settings:

  • 480p Low, 1440p Low, 4K Low, 1080p Max

For automation, Firaxis supports running the in-game benchmark from the command line, and it outputs a results file with frame times. We do as many runs as possible within 10 minutes per resolution/setting combination, and then take averages and percentiles.
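As a rough illustration of that post-processing step, here is a minimal sketch of how a frame-time log could be reduced to the average FPS and 95th percentile figures shown below. The file name and its format (one frame time in milliseconds per line) are assumptions for illustration, not Firaxis' documented output.

```python
# Minimal sketch: reduce a frame-time log to average FPS and a 95th
# percentile figure. Assumes one frame time in milliseconds per line;
# the actual format of the benchmark's results file may differ.

def summarize(frametime_file: str) -> tuple[float, float]:
    with open(frametime_file) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    # Average FPS = total frames / total elapsed seconds
    avg_fps = len(frame_ms) / (sum(frame_ms) / 1000.0)

    # 95th percentile frame time: roughly 95% of frames render at least
    # this fast; report it as its FPS equivalent.
    ordered = sorted(frame_ms)
    idx = min(len(ordered) - 1, int(0.95 * len(ordered)))
    p95_fps = 1000.0 / ordered[idx]

    return avg_fps, p95_fps

if __name__ == "__main__":
    avg, p95 = summarize("civ6_frametimes.csv")  # hypothetical file name
    print(f"Average: {avg:.1f} FPS | 95th percentile: {p95:.1f} FPS")
```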

[Charts: Civilization 6 at Low Res Low Quality, Medium Res Low Quality, High Res Low Quality, and Medium Res Max Quality; Average FPS and 95th Percentile]

Civ 6 has always been a fan of fast CPU cores and low latency, so perhaps it isn't much of a surprise to see the Broadwell Core i7 beat out the latest processors here. It holds a commanding lead, while the processors behind it cluster around 94-96 FPS at 1080p Max settings.

For our Integrated Tests, we run the first and last combination of settings.

[Charts: IGP Civilization 6 480p Low (Average FPS); IGP Civilization 6 1080p Max (Average FPS)]

When we use the integrated graphics, Broadwell isn't particularly playable here.

All of our benchmark results can also be found in our benchmark engine, Bench.

Comments

  • Leeea - Monday, November 2, 2020 - link

    great review

    sadly i7-5775C's are still selling for $100+ on ebay. Not quite worth the upgrade over the i7-4790K, with graphics cards continuing to be by far the largest factor.

    But to me it also shows there is no need to jump into the latest and greatest cpu, because these old cpus are still keeping up just fine.
  • plonk420 - Monday, November 2, 2020 - link

    > sadly i7-5775C's are still selling for $100+ on ebay

    ohhhh, that makes me curious as to how they compare to 3100/3300X chips now
  • Roy2002 - Monday, November 2, 2020 - link

    So the conclusion is Optane could play a big role in future?
  • Leeea - Monday, November 2, 2020 - link

    no.

Optane is slower than normal RAM.

Optane is a faster, more limited version of an SSD. Specifically, it has RAM-like read performance in some areas, while having SSD-like write performance in other areas.
  • Jorgp2 - Monday, November 2, 2020 - link

    SSDs are much slower than Optane in writes.

    The worst case performance for Optane is better than the best performance for an SSD in writes.
  • FunBunny2 - Monday, November 2, 2020 - link

    "The worst case performance for Optane is better than the best performance for an SSD in writes."

mayhaps Optane will optimize when used with code compiled to use only memory-to-memory execution and no hard I/O?
  • Tomatotech - Monday, November 2, 2020 - link

I would have loved to see Intel embed a couple of gigs of Optane on every mobo or in every CPU - at scale it would have been cheap - and we would get the benefits of instant app start, damn fast reboot, etc. That would make a bigger difference to the end user experience than 15% on benchmarks. But no, it came out with poorly implemented tiering software via expensive, almost unused add-in cards. Optane had so much mass-market potential; sadly I think it’s screwed now for use outside the datacentre. Intel of all people should know how tiered storage works, why did they screw it up so badly? They even had a shining example in Apple’s Fusion Drive to follow (copy) but still messed it up.
  • Jorgp2 - Monday, November 2, 2020 - link

    Have you considered asking supermicro for a skylake GT4e review sample?
  • f00f - Monday, November 2, 2020 - link

That's Intel's vision of "embedded" DRAM, which is only a kind of embedded because it sits on a separate die. If you want a proper implementation, look at the POWER7 processor (2010), with its L3 as eDRAM on the same die as the cores.
  • jospoortvliet - Wednesday, November 4, 2020 - link

I am a bit surprised AMD didn't embed 32 or 64 MB of memory in the I/O chip... that would probably be relatively easy and affordable.
