Civilization 6

First up in our CPU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer overflow. Truth be told, I never actually played the first version, but every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, is a game that is easy to pick up but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron – for a turn-based strategy game, the frame rate is not necessarily the important thing, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity in an effort to pull you into the game. As a result, Civilization can be taxing on graphics cards and CPUs as we crank up the details, especially in DirectX 12.

Perhaps a more poignant benchmark would be during the late game, when in older versions of Civilization it could take 20 minutes to cycle through the AI players before the human regained control. The new version of Civilization has an integrated ‘AI Benchmark’, although it is not yet part of our benchmark portfolio due to technical reasons we are still trying to solve. Instead, we run the graphics test, which provides an example of a mid-game setup at our settings.

At both 1920x1080 and 4K resolutions, we run the same settings. Civilization 6 has sliders for MSAA, Performance Impact and Memory Impact. The latter two refer to detail and texture size respectively, and are rated from 0 (lowest) to 5 (extreme). We run our Civ6 benchmark with the Performance slider at position four (ultra), the Memory slider at 0, and MSAA set to 2x.

For reviews where we include 8K and 16K benchmarks (Civ6 allows us to benchmark extreme resolutions on any monitor) on our GTX 1080, we run the 8K tests with the same settings as the 4K tests, while the 16K tests drop the Performance slider to its lowest option, as summarized in the sketch below.
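For reference, the full matrix of settings can be summarized as follows. This is a minimal sketch in Python; the dictionary keys are illustrative labels of our own, not Civilization 6's actual configuration identifiers.

```python
# Summary of our Civ6 benchmark settings per resolution.
# Key names are illustrative labels for this article, not the game's
# actual config identifiers. Sliders run from 0 (lowest) to 5 (extreme).
CIV6_BENCHMARK_SETTINGS = {
    "1080p": {"resolution": (1920, 1080),  "msaa": "2x", "performance": 4, "memory": 0},
    "4K":    {"resolution": (3840, 2160),  "msaa": "2x", "performance": 4, "memory": 0},
    "8K":    {"resolution": (7680, 4320),  "msaa": "2x", "performance": 4, "memory": 0},
    # 16K drops the Performance slider to its lowest option; the other
    # values are assumed unchanged from the 4K/8K runs.
    "16K":   {"resolution": (15360, 8640), "msaa": "2x", "performance": 0, "memory": 0},
}
```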

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: MSI GTX 1080 Gaming 8G performance in Civilization 6 at 1080p, 4K, 8K, and 16K]

Comments

  • bryanlarsen - Thursday, April 19, 2018 - link

    Just because transistors can be 15% smaller doesn't mean that they have to be. Every IC design includes transistors of many different sizes. GF is saying that the minimum transistor size is 15% smaller than the previous minimum transistor size. And it seems that AMD chose not to use it, opting instead for a larger, higher-performance transistor that happens to be the same size as their previous one.
  • bryanlarsen - Thursday, April 19, 2018 - link

    And you confirm that in the next paragraph. "AMD confirmed that they are using 9T transistor libraries, also the same as the previous generation, although GlobalFoundries offers a 7.5T design as well." So please delete your very misleading transistor diagram and accompanying text.
  • danjw - Friday, April 20, 2018 - link

    I think you are misreading that part of the article. AMD shrunk the size of the processor blocks, giving them more "dark silicon" between the blocks. This allows better thermal isolation between blocks, and thus higher clocks.
  • The Hardcard - Thursday, April 19, 2018 - link

    “Cache Me Ousside, How Bow Dah?”

    Very low hanging fruit, yet still so delicious.
  • msroadkill612 - Thursday, April 19, 2018 - link

    "Intel is expected to have a frequency and IPC advantage
    AMD’s counter is to come close on frequency and offer more cores at the same price

    It is easy for AMD to wave the multi-threaded crown with its internal testing, however the single thread performance is still a little behind."

    If so, why is it given such emphasis - it's increasingly a corner-case benefit as game devs begin to use the new mainstream multi-core platforms. Until oh-so-recently, the norm was probably 2 cores, so that's what they coded for - THEN.

    This minor advantage, compared to Intel getting absolutely smashed on increasingly multi-threaded apps at any price point, is rarely mentioned in proximity, where it deserves to be in a balanced analysis.
  • Ratman6161 - Thursday, April 19, 2018 - link

    "its increasingly a corner xase benefit as game devs begin to use the new mainstream multi core platforms" As I often do, I'd like to remind people that not all readers of this article are gamers or give a darn about games. I am one of those i.e. game performance is meaningless to me.
  • 0ldman79 - Thursday, April 19, 2018 - link

    Agreed.

    I am a gamer, but the gaming benchmarks are nearly irrelevant at this point.

    Almost every CPU (ignoring Atom) can easily feed a modern video card and keep the framerate above 60fps. I'm running an FX 6300 and I still run everything at 1080p with a GTX 970 and hardly ever see a framerate drop.

    Gaming benches are somewhat less important than in days gone by. Everything on the market hits the minimum requirement and then some. It's primarily fuel for the fanboys, "OMG!!! AMD sucks!!! Intel is faster at gaming!!!"

    Well, considering Intel is running 200fps and AMD is hitting 175fps I'm *thinking* they're both playable.
  • Akkuma - Thursday, April 19, 2018 - link

    Gaming + streaming benchmarks, as done by GamersNexus, are exactly the kind of relevant and important benchmarks more sites need to be doing. Those numbers you don't care about are much more important when you start trying to do streaming.

    Your 60fps? That isn't even what most people who game care about, with high refresh rate monitors doing 144Hz+. Add in streaming, where you're taking a decent FPS hit, and that difference between 200 and 175 fps all of a sudden is the difference between maintaining 144Hz and not (see the frame-time sketch after this thread).
  • Vesperan - Thursday, April 19, 2018 - link

    Yea but.. of all the people interested in gaming, those with high refresh rate monitors and/or streaming online are what - 10% of the market? Tops?

    Sure, the GamersNexus reviews have relevance.. to that distinct minority of people out there. Condemning/praising CPU architectures for gaming in general due to these corner cases is nonsensical.

    Like 0ldman79 said, damn near any of these CPUs is fine for gaming - unless you happen to be one of the corner cases.
  • Akkuma - Friday, April 20, 2018 - link

    You're pulling a number out of thin air and building an entire argument around it. 72% of Steam users have 1080p monitors. What percentage of those are high refresh rate is unknown, but 120Hz monitors have existed for at least 5 years now, maybe even longer. At this stage, arguing around 60fps is like arguing about the sound quality of cassettes: we are long past it.
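As a quick back-of-the-envelope check on the 200 vs 175 fps exchange above, here is a minimal frame-time sketch in Python. The 1.5 ms/frame streaming-encode overhead is an assumed illustrative figure, not a measured number.

```python
# Frame-time budgets for the 200 vs 175 fps debate above.
# STREAM_OVERHEAD_MS is a hypothetical per-frame encoding cost,
# used purely for illustration.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

BUDGET_144HZ = frame_time_ms(144)   # ~6.94 ms per frame to hold 144 Hz
STREAM_OVERHEAD_MS = 1.5            # assumed encoding cost per frame (made up)

for fps in (200, 175):
    base = frame_time_ms(fps)                     # 5.00 ms and ~5.71 ms
    while_streaming = base + STREAM_OVERHEAD_MS
    verdict = "holds" if while_streaming <= BUDGET_144HZ else "misses"
    print(f"{fps} fps: {base:.2f} ms/frame -> {while_streaming:.2f} ms "
          f"while streaming, {verdict} the {BUDGET_144HZ:.2f} ms 144 Hz budget")
```

Under that assumed overhead, the 200 fps system stays inside the 144 Hz frame budget while the 175 fps system falls out of it, which is the gap Akkuma is pointing at.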
