Civilization 6

First up in our CPU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and has been many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer overflow. Truth be told, I never actually played the first version, but every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, is a game that is easy to pick up but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron: for a turn-based strategy game, the frame rate is not necessarily the most important metric, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity in an attempt to pull you into the game. As a result, Civilization can be taxing on both graphics and CPUs as we crank up the details, especially in DirectX 12.

Perhaps a more poignant benchmark would be during the late game, when in older versions of Civilization it could take 20 minutes to cycle through the AI players before the human regained control. The new version of Civilization has an integrated 'AI Benchmark', although it is not yet part of our benchmark portfolio due to technical issues we are still trying to resolve. Instead, we run the graphics test, which provides an example of a mid-game setup at our settings.

At both 1920x1080 and 4K resolutions, we run the same settings. Civilization 6 has sliders for MSAA, Performance Impact, and Memory Impact. The latter two refer to detail and texture size respectively, and are rated from 0 (lowest) to 5 (extreme). We run our Civ6 benchmark at position four for Performance (ultra) and 0 for Memory, with MSAA set to 2x.

For reviews where we include 8K and 16K benchmarks (Civ6 allows us to benchmark extreme resolutions on any monitor) on our GTX 1080, the 8K tests use the same settings as the 4K tests, while the 16K tests drop the Performance slider to its lowest option.
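To keep the per-resolution settings straight, the sketch below summarizes them in a short Python snippet. This is purely illustrative bookkeeping, not the game's own configuration format, and it assumes the 16K runs only drop the Performance Impact slider while MSAA and Memory Impact stay at their 4K values.

```python
# Illustrative summary of the Civilization 6 benchmark settings described above.
# Hypothetical structure for our own notes -- not Civ6's actual config format.

CIV6_BENCH_SETTINGS = {
    # resolution:   MSAA,  Performance Impact (0-5),  Memory Impact (0-5)
    "1920x1080":  {"msaa": "2x", "performance_impact": 4, "memory_impact": 0},
    "3840x2160":  {"msaa": "2x", "performance_impact": 4, "memory_impact": 0},  # same as 1080p
    "7680x4320":  {"msaa": "2x", "performance_impact": 4, "memory_impact": 0},  # 8K mirrors 4K
    "15360x8640": {"msaa": "2x", "performance_impact": 0, "memory_impact": 0},  # 16K: lowest Performance
    # (MSAA/Memory at 16K assumed unchanged; only the Performance drop is noted above.)
}

if __name__ == "__main__":
    for resolution, settings in CIV6_BENCH_SETTINGS.items():
        print(f"{resolution}: {settings}")
```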

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: MSI GTX 1080 Gaming 8G Performance at 1080p, 4K, 8K, and 16K]

Comments

  • Vesperan - Sunday, April 22, 2018 - link

If by 'pulling a number out of thin air' you mean that I looked at the same Steam hardware survey as you did and also a (year-old) TechReport survey (https://techreport.com/news/31542/poll-what-the-re... ) - then yes, I absolutely pulled a number out of thin air. I actually think 10% of the entire market will prove significantly too high as a maximum for x1080 resolution, high refresh rate monitors, as the market will have a lot of old or cheap monitors out there.

The fact is, once you say Ryzen is perfectly fine for x1080 (at 60 Hz) gaming and anything at or above x1440 because you're GPU limited (and I'm not saying there is no difference - but is it significant enough?), the argument is no longer 'Ryzen is worse at gaming', but is instead 'Ryzen is just as good for gaming as its Intel counterparts, unless you have a high refresh rate x1080 monitor and a high end graphics card.'

    Which is a bloody corner case. It might be an important one to a bunch of people, but as I said - it is a distinct minority and it is nonsensical to condemn or praise a CPU architecture for gaming in general because of one corner case. The conclusion is too general and sweeping.
  • Targon - Monday, April 23, 2018 - link

This is where current benchmarks, other than the turn length benchmark in Civ 6, are not doing enough to show where slowdowns come from. Framerates don't matter as much if the game adds complexity based on CPU processing capability. AI in games, for example, will benefit from additional CPU cores (when you don't use your maxed out video card for AI, of course).

I agree that treating game framerates as the be-all, end-all that people look at is far too limited, and we do see other things, Cinebench for example, that help expand the picture, but they don't go far enough. I just know that I personally find anything below 8 cores feels sluggish with the number of programs I tend to run at once.
  • GreenReaper - Wednesday, April 25, 2018 - link

    Monitors in use do lag the market. All of my standalone monitors are over a decade old. My laptop and tablet are over five years old. Many people have 4K TVs, but rarely hook them up to their PC.

It's hard to tell, of course, because some browsers don't fully communicate display capabilities, but 1920x1080 is a popular resolution with maybe 22.5% of the market on it (judging by the web stats of a large art website I run). Another ~13.5% is on 1366x768.

    I think it's safe to say that only ~5% have larger than 1080p - 2560x1440 has kinda taken off with gamers, but even then it only has 3.5% in the Steam survey - and of course, this mainly counts new installations. 4K is closer to 0.3%.

    Performance for resolutions not in use *now* may matter for a new CPU because you might well want to pair it with a new monitor and video card down the road. You're buying a future capability - maybe you don't need HEVC 10-bit 4K 60FPS decode now, but you might later. However, it could be a better bet to upgrade the CPU/GPU later, especially since we may see AV1 in use by then.

    Buying capabilities for the future is more important for laptops and all-in-one boxes, since they're least likely to be upgradable - Thunderbolt and USB display solutions aside.
  • Bourinos - Friday, April 20, 2018 - link

    Streaming at 144Hz? Are you mad???
  • Luckz - Monday, April 23, 2018 - link

It would be gaming at 144 Hz while streaming at 60 Hz; unless, in Akkuma's fantasy world of 240 Hz monitors, the majority of stream viewers would want 144 Hz streams too ;)
  • Shaheen Misra - Sunday, April 22, 2018 - link

That's a great point. Every time I have upgraded it has been due to me not hitting 60fps. I have no interest in 144Hz/240Hz monitors. Had a Q9400 till GTA IV released. Bought an FX 8300 due to lag. Used that till COD WW2 stuttered (still not sure why, really). Now I own a 7700K paired with a 1060 6GB. Not the kind of thing you should say out loud, but I'm not gonna buy a GTX 1080 Ti for 1080p/60Hz. The PCIe x16 slot is here to stay, so I can upgrade whenever. The CPU socket on my Z270 board, on the other hand, is obsolete a year after purchase.
  • Targon - Monday, April 23, 2018 - link

Just wait until you upgrade to 4K, at which point you will be waiting for a new generation of video cards to come out, only to find that even the new cards can't handle 4K terribly well. I agree about video card upgrades not making a lot of sense if you are not going above 1080p/60Hz.
  • Luckz - Monday, April 23, 2018 - link

For 4K you've so far always needed SLI, and SLI was always either bad, bugged, or - as of recently - retired. Why they still make multi-GPU mainboards and bundle SLI bridges is beyond me.
  • Lolimaster - Thursday, April 19, 2018 - link

Zen2 should easily surpass the 200 pt mark in CB15 ST: a minimum of 5-10% plus a minimum of 5-10% higher clocks, being extremely negative.
  • Lolimaster - Thursday, April 19, 2018 - link

    IPC and clock, no edit button gg.
