Gaming Performance

Ashes of the Singularity (DX12)

Seen as the holy child of DX12, Ashes of the Singularity (AoTS, or just Ashes) has been the first title to actively explore as many of DX12's features as it possibly can. Oxide Games, the developer behind the Nitrous engine which powers the game (with Stardock publishing), has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

Ashes of the Singularity on ASUS GTX 980 Strix 4GB

Performance with Ashes varied across our different memory settings. DDR4-2400 was clearly the slowest result at around 45-46 FPS, while every other setting rounded to 50 FPS or above. Depending on the configuration, that is an 8-10% difference in frame rates simply from not selecting the worst memory: 46 versus 50 FPS, for example, works out to roughly a 9% gain.

Rise of the Tomb Raider (DX12)

One of the newest games in the gaming benchmark suite is Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics and the sequel to the popular Tomb Raider reboot, which was loved for its automated benchmark mode. But don't let that fool you: the benchmark mode in RoTR is very different this time around.

Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further when it comes to graphics fidelity. This leads to an interesting set of requirements in hardware: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how the driver can translate the DirectX 12 workload.

Rise of the Tomb Raider on ASUS GTX 980 Strix 4GB

We encountered only insignificant performance differences in RoTR on the GTX 980. The 3.3 FPS increase in average frame rate from the slowest to the fastest memory does not exactly justify the price premium of DDR4-3333 over DDR4-2400 when using a GTX 980 - not in this particular game, at least.

Thief

Thief has held a long-standing place in the hearts of PC gamers since the introduction of the very first iteration back in 1998 (Thief: The Dark Project). This Thief is the latest reboot of the series, with publisher Square Enix taking over from where Eidos Interactive left off back in 2004. The game itself uses the Unreal Engine 3 and is known for its optimized and improved destructible environments, large crowd simulation, and soft body dynamics.

Thief on ASUS GTX 980 Strix 4GB

For Thief, there are some small gains to be had from moving from DDR4-2400 up to DDR4-2933, around 5% or so; beyond this, however, the performance levels out.

Total War: WARHAMMER

Not only is the Total War franchise one of the most popular real-time tactical strategy series of all time, but it has delved into multiple worlds such as the Roman Empire, the Napoleonic era, and even Attila the Hun. More recently the franchise has tackled the popular WARHAMMER universe. Developer Creative Assembly has integrated DX12 into this latest RTS battle title, aiming to take advantage of the benefits DX12 can provide. The game itself can come across as very CPU intensive, and is capable of pushing any top-end system to its limits.

Total War: WARHAMMER on ASUS GTX 980 Strix 4GB

Even though Total War: WARHAMMER is a very CPU-focused benchmark, memory speed had barely any effect on the results.

65 Comments


  • willis936 - Thursday, September 28, 2017

    QDR is the same thing as DDR with the clock running at half frequency. It's not a magical way to make your data rates higher. The same paltry MHz increase would be seen on QDR, just with tighter jitter requirements. I don't see the benefit since DDR isn't running into a power limit. (A quick sketch of the data-rate arithmetic follows the comment thread below.)
  • NeatOman - Thursday, September 28, 2017

    Now that I don't play too many games, I'm OK with my 5-year-old FX-8320 @ 4.5GHz and R9 280x. I find that it keeps up with heavy multi-tasking, like having 20-50 tabs open while playing a FHD YouTube video and working in SketchUp on a 40" 4K monitor. It also runs a file server and a media server that transcodes 1080p in real time at high quality, and I won't really notice while browsing and watching videos, other than the lights getting brighter inside the case because the fans ramp up a bit.
  • Zeed - Thursday, September 28, 2017

    Well poor test in my eyes... Gyuess You dont know that pass 3200 its TIMINGS ALL THE WAY !!!! Join us at Overclockers.net for PROPER numbers and tests with carious timings ect.
  • BrokenCrayons - Thursday, September 28, 2017

    I hope your comment isn't an example of Overclockers.net writing quality. Proper numbers and tests aren't very useful when the supporting writing is almost incoherent.
  • chikatana - Thursday, September 28, 2017

    I'm more interested in how the system will perform when all DIMM slots are fully loaded.
  • TAspect - Thursday, September 28, 2017

    All gaming tests are GPU-bound, and that is why the CPU shows little to no scaling. The GTX 980 is clearly the bottleneck here. Either test with a GTX 1080/Ti or lower the settings until the GPU is not a bottleneck.

    The tests only show average fps, which is a mistake, as faster RAM affects minimum fps more than average fps. You should add 99% and 99.9% minimum fps to the graphs; a sketch of how those figures are derived follows the comment thread below.

    You should also include G.Skill Flare X 3200 CL14 RAM with The Stilt's 3200 fast OC profile found in the Crosshair VI Hero UEFI. On other motherboards the settings are relatively simple to configure, and you only have to test stability once instead of tuning all the subtimings for days.
  • BrokenCrayons - Thursday, September 28, 2017

    Agreed on this. Game testing at more modest resolutions and settings would remove potential GPU bottlenecks from the results. Then again, there is something to be said for testing at settings closer to what an end user would realistically use on a daily basis: it at least demonstrates the lack of impact memory timings have in a real-world gaming scenario. It'd be optimal to do both, really, so readers could see results free of GPU concerns AND see how memory performance will impact their day-to-day gaming.
  • lyssword - Friday, September 29, 2017

    I think AT is one of the worst sites to get an idea of CPU gaming performance from: the tests are always GPU-limited or use a scripted part of the game with low CPU demand. Really, the biggest difference you'll ever see is about 10% between Bulldozer and an i7, whereas in the real world the difference is 40%. Most of the time AT tests show almost no difference between a Core i3 and an i7 because of that testing methodology.
  • DabuXian - Thursday, September 28, 2017

    Trying to find a CPU bottleneck while using an old GeForce 980? Seriously? I'd expect some basic hardware knowledge from Anandtech.
  • r3loaded - Friday, September 29, 2017

    I'd like to see what the effects are on Threadripper, considering that the IF spans two dies and the platform is geared towards maximising memory bandwidth.
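On willis936's QDR point above, here is a minimal sketch of the underlying arithmetic, assuming illustrative rather than measured figures; the transfer_rate_mts helper is hypothetical. The effective data rate is the base clock multiplied by the number of transfers per clock, so a quad-pumped bus at half the clock of a double-pumped bus lands on exactly the same rate:

```python
# Illustrative sketch: effective memory data rate as base clock times
# pumping factor. All figures are hypothetical examples, not results.
def transfer_rate_mts(clock_mhz: float, transfers_per_clock: int) -> float:
    """Effective rate in megatransfers per second (MT/s)."""
    return clock_mhz * transfers_per_clock

ddr = transfer_rate_mts(1600, 2)  # e.g. DDR4-3200: 1600 MHz clock, double-pumped
qdr = transfer_rate_mts(800, 4)   # a hypothetical QDR bus at half that clock

# Identical data rates: quad-pumping alone is not a bandwidth increase
# over an equivalently clocked DDR configuration.
assert ddr == qdr == 3200
print(f"DDR @ 1600 MHz: {ddr:.0f} MT/s | QDR @ 800 MHz: {qdr:.0f} MT/s")
```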
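Similarly, on TAspect's request for percentile metrics, a minimal sketch of how 99% and 99.9% "minimum" fps figures are commonly derived from per-frame render times; the frame-time samples below are made up for illustration, and NumPy is assumed to be available:

```python
# Illustrative sketch: average fps versus percentile-based "minimum"
# fps, computed from per-frame render times in ms (made-up samples).
import numpy as np

frame_times_ms = np.array([16.6, 17.1, 16.9, 33.4, 16.8,
                           17.0, 16.7, 45.2, 16.5, 16.9])

# Average fps follows from the mean frame time.
avg_fps = 1000.0 / frame_times_ms.mean()

# The 99th (or 99.9th) percentile frame time bounds the slowest 1%
# (or 0.1%) of frames; its reciprocal is the "99% minimum fps".
fps_p99 = 1000.0 / np.percentile(frame_times_ms, 99)
fps_p999 = 1000.0 / np.percentile(frame_times_ms, 99.9)

print(f"average : {avg_fps:.1f} fps")
print(f"99%     : {fps_p99:.1f} fps")
print(f"99.9%   : {fps_p999:.1f} fps")
```

With only a handful of samples the two percentiles are nearly identical; over a multi-minute benchmark run they separate sustained dips from one-off stutters.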
