Rocket League

Hilariously simple pick-up-and-play games are great fun. I'm a massive fan of the Katamari franchise for that reason: pressing start on a controller and rolling around, picking things up to get bigger, is about as simple as gaming gets. Until we get a PC version of Katamari that I can benchmark, we'll focus on Rocket League.

Rocket League embodies that pick-up-and-play spirit, letting users jump into a match with other people (or bots) to play football with cars, with essentially no rules. The title is built on Unreal Engine 3, which is somewhat old at this point, but that allows the game to run on very low-end systems while still taxing the big ones. Since its release in 2015 it has sold over 5 million copies and has become a fixture at LANs and game shows. Dedicated players get very serious, competing in teams and leagues; with very few settings to configure, everyone is on the same level. Rocket League is quickly becoming one of the favored titles for e-sports tournaments, especially as e-sports contests can be viewed directly from the game interface.

Based on these factors, plus the fact that it is an extremely fun title to load and play, we set out to find the best way to benchmark it. Unfortunately, automatic benchmark modes in games are few and far between, and Rocket League, partly for this reason and partly because it is built on Unreal Engine 3, does not have one. In this case, we have to develop a consistent run and record the frame rate ourselves.

Read our initial analysis of our Rocket League benchmark on low-end graphics here.

With Rocket League there is no benchmark mode, so we have to perform a series of automated actions, much like running a fixed number of laps in a racing game. Our approach is as follows: using Fraps to record the time taken to render each frame (and the overall frame rates), we use an automation tool to set up a consistent 4v4 bot match on easy difficulty, with the system applying a series of inputs throughout the run, such as switching camera angles and driving around.
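As a rough illustration, the time-fixed input replay described above can be sketched as follows. This is a minimal sketch and not our actual tooling: the action names and the `send_input` callback are hypothetical placeholders for whatever events the automation tool sends to the game.

```python
import time

# Hypothetical schedule: (seconds from match start, action). The actions
# and their timings are placeholders; the real run uses a fixed script of
# camera switches, boosts and steering inputs so every pass is identical.
SCHEDULE = [
    (0.0, "accelerate"),
    (2.0, "turn_left"),
    (4.5, "boost"),
    (6.0, "switch_camera"),
]

def run_schedule(schedule, send_input, clock=time.monotonic):
    """Replay each action at a fixed offset from the start of the match,
    so that every benchmark run applies identical inputs."""
    start = clock()
    for offset, action in schedule:
        # Wait (with a short sleep) until the scheduled time arrives.
        while clock() - start < offset:
            time.sleep(0.001)
        send_input(action)
```

In practice `send_input` would drive keyboard or controller events through the automation tool; here it is only a stand-in so the scheduling logic is self-contained.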

It turns out that this method is nicely indicative of a real bot match, with the car driving up walls, boosting, and even putting in the odd assist, save and/or goal, as weird as that sounds for an automated set of commands. To maintain consistency, the inputs we apply are not random but time-fixed, and we also keep the map (Aquadome, known to be tough on GPUs due to its water and transparency effects) and the car customization constant. We start recording just after a match begins and record for 4 minutes of game time (think 5 laps of a DiRT Rally benchmark), with average frame rates, 99th percentiles and frame times all provided.
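From the Fraps frame-time log, the reported metrics fall out with a few lines of arithmetic. The sketch below is illustrative, assuming frame times in milliseconds and a simple nearest-rank percentile convention; Fraps' actual CSV parsing and any post-processing are omitted.

```python
def frame_metrics(frame_times_ms):
    """Summarize a frame-time log (one entry per frame, in milliseconds).

    Returns the average FPS over the run, the 99th-percentile frame time
    in ms, and the FPS implied by that percentile (the figure commonly
    reported alongside the average)."""
    n = len(frame_times_ms)
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = n / total_s
    # 99th-percentile frame time: 99% of frames rendered at least this fast.
    ordered = sorted(frame_times_ms)
    idx = min(n - 1, int(round(0.99 * (n - 1))))
    p99_ms = ordered[idx]
    return avg_fps, p99_ms, 1000.0 / p99_ms
```

For example, a run of 98 frames at 10 ms plus 2 frames at 20 ms averages about 98 FPS, while the 99th-percentile figure is 50 FPS, reflecting the slow frames that averages hide.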

The graphics settings for Rocket League come in four broad presets: Low, Medium, High and High FXAA. There are advanced settings for shadows and detail levels; however, for these tests we keep to the presets. For both 1920x1080 and 4K resolutions, we test at the High preset with an unlimited frame cap.

All of our benchmark results can also be found in our benchmark engine, Bench.

MSI GTX 1080 Gaming 8G Performance


1080p

4K

Sapphire Nitro R9 Fury 4G Performance


1080p

4K

Sapphire Nitro RX 480 8G Performance


1080p

4K

With Ryzen, we encountered some odd performance issues when using NVIDIA-based video cards that caused those cards to significantly underperform. Equally strangely, however, the issues we saw with Ryzen on Rocket League with NVIDIA GPUs seem to almost vanish when using Threadripper. Still, there are no easy wins here, as Intel takes Rocket League in its stride, though Game Mode does still help the 1950X. The Time Under graphs give some cause for concern, with the 1950X consistently sitting at the bottom of that graph.


  • ddriver - Friday, August 18, 2017 - link

    Why not? We've had 16 core CPUs long before W10 was launched, and it has allegedly been heavily updated since then.

    But it is NOT the "coder"'s responsibility. Programmers don't get any say; they are paid workers, paid to do as they are told. Not that I don't have the impression that a lot of the code being written is below standard, but the actual decision making is not a product of software programmers but of software architects, and the latter are even more atrocious than the actual programmers.
  • HollyDOL - Friday, August 18, 2017 - link

    Sadly, the reality is much worse... those architects take orders from managers, finance people, etc. who, sadly often, don't know more about a computer than where the power button is. And they want products at minimal cost, where 'yesterday is already late'.
  • ddriver - Friday, August 18, 2017 - link

    Well, yeah, the higher you go up the ladder, the grosser the incompetence level.
  • BrokenCrayons - Thursday, August 17, 2017 - link

    Interesting test results. I think they demonstrate pretty clearly why Threadripper isn't really a very good option for pure gaming workloads. The big takeaway is that there are more affordable processors with lower TDPs that offer comparable or better performance, without adding extra settings that few people will realize exist and even fewer will fiddle with enough to determine which ones actually improve performance in their particular software library. The Ryzen 7 series is probably a much better overall choice than TR right now if you don't have specific tasks that require all those cores and threads.
  • Gothmoth - Thursday, August 17, 2017 - link

    "I think they demonstrate pretty clearly why Threadripper isn't really a very good option for pure gaming workloads."

    wow.... what a surprise.
    thanks for pointing that out mr. obvious. :-)
  • Gigaplex - Thursday, August 17, 2017 - link

    These are single GPU tests. Threadripper has enough PCIe lanes to do large multi GPU systems. More GPU usually trumps better CPU in the high end gaming scene, especially with 4k resolution.
  • BrokenCrayons - Friday, August 18, 2017 - link

    Yes, but multi-GPU setups are generally not used for gaming-centric operations. There's been tacit acknowledgement of this as the state of things by NV since the release of the 10x0 series. Features like Crossfire and SLI support are barely a bullet point in marketing materials these days, with good reason, since game support is waning as well, and DX12 is positioned to pretty much nail the multi-GPU coffin shut entirely except in corner cases where it MIGHT be possible to leverage an iGPU alongside a dGPU if a game engine developer bothers to invest time into banging out code to support it. That places TR's generous PCIe lane count and the potential multi-GPU usage in the domain of professional workloads that need GPU compute power.
  • Bullwinkle J Moose - Thursday, August 17, 2017 - link

    I agree with ddriver

    We should not have to fiddle with the settings and reboot to game mode on these things

    Windows should handle the hardware seamlessly in the background for whatever end use we put these systems to

    The problem is getting Microsoft to let the end users use the full potential of our hardware

    If the framework for the hardware is not fully implemented in the O.S., every "FIX" looks a bit like the one AMD is using here

    I think gaming on anything over 4 cores might require a "proper" update from Microsoft working with the hardware manufacturers

    Sometimes it might be nice to use the full potential of the systems we have instead of Microsoft deciding that all of our problems can be fixed with another cloud service
  • Gothmoth - Thursday, August 17, 2017 - link

    but but.. what about linux.

    i mean linux is the savior, not?
    it has not won a 2.2% marketshare on the desktop for nothing.

    sarcasm off....
  • HomeworldFound - Thursday, August 17, 2017 - link

    What can we expect Microsoft to do prior to a product like this launching? If a processor operates in a manner that requires the operating system to be adjusted, the company selling it needs to approach Microsoft and provide an implementation, and it should be ready for launch. If that isn't possible, then why manufacture something that doesn't work correctly and requires hacky fixes to run?
