Scaled Gaming Performance: High Resolution

Civilization 6

Originally penned by Sid Meier and his team, the Civilization series of turn-based strategy games is a cult classic, and has been the excuse for many an all-nighter spent trying to get Gandhi to declare war on you due to an integer underflow. Truth be told, I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, and it is a game that is easy to pick up, but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron – for a turn-based strategy game, frame rate is not necessarily the most important metric, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity, trying to pull you into the game. As a result, Civilization can be taxing on both the GPU and the CPU as we crank up the details, especially in DirectX 12.

[Charts: GTX 1080 – Civilization VI, Average FPS and 95th Percentile. Blue is XMP; orange is JEDEC at low CL.]

Performance in Civ VI shows there is very little benefit to be had in going from DDR5-4800 to DDR5-6400. The results also show that Civ VI responds more to lower latencies than to raw frequency, with DDR5-4800 at CL32 outperforming all of the higher frequencies tested at CL36.
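
As a quick way to put frequency and CAS latency on the same footing, the cycle count can be converted into an absolute first-word latency in nanoseconds as 2000 × CL / data rate (MT/s). The short Python sketch below applies that formula to a few of the configurations mentioned in this review; it is purely illustrative, and the kit list is an assumption rather than the full tested set.

```python
# Convert a DDR data rate (MT/s) and CAS latency (clock cycles) into an
# absolute first-word latency in nanoseconds: latency_ns = 2000 * CL / rate.

def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    """Absolute CAS latency in nanoseconds for a given data rate and CL."""
    return 2000 * cl / data_rate_mts

# Illustrative configurations taken from the text of this review.
configs = [
    ("DDR5-4800 CL32", 4800, 32),
    ("DDR5-6000 CL36", 6000, 36),
    ("DDR5-6400 CL36", 6400, 36),
]

for name, rate, cl in configs:
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
# DDR5-4800 CL32: 13.33 ns
# DDR5-6000 CL36: 12.00 ns
# DDR5-6400 CL36: 11.25 ns
```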

Shadow of the Tomb Raider (DX12)

The latest installment of the Tomb Raider franchise does less rising and lurks more in the shadows with Shadow of the Tomb Raider. As expected, this action-adventure follows Lara Croft, the main protagonist of the franchise, as she muscles through Mesoamerica and South America looking to stop a Mayan apocalypse she herself unleashed. Shadow of the Tomb Raider is the direct sequel to Rise of the Tomb Raider; it was developed by Eidos Montreal and Crystal Dynamics, published by Square Enix, and hit shelves across multiple platforms in September 2018. This title effectively closes the Lara Croft origins story and received critical acclaim upon its release.

The integrated Shadow of the Tomb Raider benchmark is similar to that of the previous game, Rise of the Tomb Raider, which we used in our previous benchmarking suite. The newer Shadow of the Tomb Raider uses DirectX 11 and 12, with this particular title being touted as having one of the best DirectX 12 implementations of any game released so far.

[Charts: GTX 1080 – Shadow of the Tomb Raider, Average FPS and 95th Percentile. Blue is XMP; orange is JEDEC at low CL.]

Looking at our results in Shadow of the Tomb Raider, we did see some improvement in performance scaling from DDR5-4800 to DDR5-6400. The biggest improvement came when testing DDR5-4800 CL32, which performed similarly to DDR5-6000 CL36.

Strange Brigade (DX12)

Strange Brigade is set in 1930s Egypt and follows a story very similar to that of the Mummy film franchise. This particular third-person shooter was developed by Rebellion Developments, more widely known for the Sniper Elite and Alien vs Predator series. The game follows the hunt for Seteki the Witch Queen, who has risen once again, and the only 'troop' that can ultimately stop her. Gameplay is cooperative-centric, with a wide variety of levels and many puzzles that need solving by the British colonial Secret Service agents sent to put an end to her reign of barbarism and brutality.

The game supports both the DirectX 12 and Vulkan APIs and houses its own built-in benchmark, which offers various options for customization, including textures, anti-aliasing, reflections, and draw distance, and even allows users to enable or disable motion blur, ambient occlusion, and tessellation, among others. AMD has previously boasted that Strange Brigade's Vulkan implementation offers scalability for AMD multi-graphics card configurations. For our testing, we use the DirectX 12 benchmark.

[Charts: GTX 1080 – Strange Brigade DX12, Average FPS and 95th Percentile. Blue is XMP; orange is JEDEC at low CL.]

Performance in Strange Brigade wasn't influenced by the frequency or latency of the G.Skill Trident Z5 DDR5 memory in terms of average frame rates. We do note, however, that 95th percentile performance does, for the most part, improve as we increase the memory frequency.
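
For reference on how the two metrics in these charts are derived, both average FPS and 95th percentile FPS come from per-frame render times. The sketch below shows one common way to compute them; the exact percentile convention (95th percentile of frame times, converted back into a frame rate) and the sample data are assumptions for illustration rather than our exact pipeline.

```python
import numpy as np

# Hypothetical frame times in milliseconds from one benchmark pass;
# real runs log one value per rendered frame.
frame_times_ms = np.array([10.2, 11.0, 9.8, 10.5, 15.3, 10.1, 12.4, 10.0])

# Average FPS: total frames rendered divided by total elapsed time.
avg_fps = 1000.0 * len(frame_times_ms) / frame_times_ms.sum()

# 95th percentile FPS: take the 95th percentile of the frame-time
# distribution (the boundary of the slowest 5% of frames) and convert
# it back into a frame rate.
p95_frame_time_ms = np.percentile(frame_times_ms, 95)
p95_fps = 1000.0 / p95_frame_time_ms

print(f"Average FPS: {avg_fps:.1f}")
print(f"95th percentile FPS: {p95_fps:.1f}")
```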

Comments

  • TheinsanegamerN - Tuesday, December 28, 2021 - link

    What is the hold up on video card reviews? I know there was that cali fire last year, but that was over a year ago now.
  • gagegfg - Thursday, December 30, 2021 - link

    The chip shortage is global, not just GPUs. A hardware tester like you should not have problems acquiring a GPU to do these tests; otherwise, you have a serious public relations problem.
    And returning to the point of my criticism: those who understand why CPU scaling is tested in games know it is not only about FPS, but also about evaluating longevity when upgrading to future GPUs, or which CPU will become a bottleneck sooner than another. Today all CPUs can run games at 4K and that's not news to anyone.
    In short, if you cannot get a decent GPU to test a top-of-the-range CPU and its limitations with different hardware combinations, try to eliminate the GPU bottleneck with 720p/1080p resolutions or by dropping scene detail. That is the correct way to test those other points rather than the GPU itself.
    This criticism is constructive, and is not intended to generate repudiation.
  • TheinsanegamerN - Monday, January 3, 2022 - link

    Erm, you do know that other reviewers have been able to get ahold of GPUs for testing, right? If you don't have the budget for the stuff you need to do your job, can't get the stuff you need to do your job, and find excuses to not do the reviews your viewerbase wants to see, that says to a lot of people that AnandTech is being mismanaged into the ground.

    Come to think of it, these same excuses were used when the 3080 was never reviewed, alongside "well we have one guy who does it and he lives near the fires in cali". That was a year and a half ago.

    Perhaps AnandTech presenting excuses instead of articles is why you can't get companies to send you hardware? Just a thought.
  • Azix - Monday, January 10, 2022 - link

    I can understand manufacturers being less likely to send out a GPU if they aren't guaranteed publicity. The key is that he said it was just for testing, not necessarily for a review. Most other reviewers are given cards for marketing purposes.
  • zheega - Thursday, December 23, 2021 - link

    I didn't even notice that at first, I just assumed that they would get rid of the GPU bottleneck. How amazingly weird.
  • thestryker - Thursday, December 23, 2021 - link

    The vast majority of people play at the highest playable resolution for the hardware they have, which means they're GPU bound no matter what their GPU is. The frame rates in the review are perfectly playable and indicate the amount of variation one could expect in a mostly GPU-bound situation. None of the titles are esports/competitive where you'd need to be maxing out frame rate, so even if they were using a 3090 it'd be a pointless reflection for that.

    So while the metrics aren't perfect from a scaling-under-ideal-circumstances perspective, they're perfectly fine for practical purposes.
  • haukionkannel - Friday, December 24, 2021 - link

    True. I don't have the latest PC hardware and still play at 1440p at the highest settings. So I can confirm that is the way to test to see what we see in real-world situations...
  • Ooga Booga - Tuesday, December 28, 2021 - link

    Because they haven't done meaningful GPU stuff in years, it all goes to Tom's Hardware. Eventually the card they use will be 10 years old if this site is even still around.
  • TimeGoddess - Thursday, December 23, 2021 - link

    If you're gonna use a GTX 1080, at least try to do the gaming benchmarks in 720p so that there is actually a CPU bottleneck.
  • Ian Cutress - Thursday, December 23, 2021 - link

    You know how many people complain when I run our CPU reviews at 720p resolutions? 'You're only doing that to show a difference'.
