Rocket League

Hilariously simple pick-up-and-play games are great fun. I'm a massive fan of the Katamari franchise for that reason — pressing start on a controller and rolling around, picking things up to get bigger, is extremely simple. Until we get a PC version of Katamari that I can benchmark, we'll focus on Rocket League.

Rocket League embodies pick-up-and-play: users can jump into a match with other people (or bots) and play football with cars, with essentially zero rules to learn. The title is built on Unreal Engine 3, which is somewhat old at this point, but that allows the game to run on super-low-end systems while still taxing the big ones. Since its release in 2015, it has sold over 5 million copies and has become a fixture at LANs and game shows. Serious players train in teams and leagues, and with very few settings to configure, everyone competes on the same level. Rocket League is quickly becoming one of the favored titles for e-sports tournaments, especially as e-sports contests can be viewed directly from the game interface.

Based on these factors, plus the fact that it is an extremely fun title to load and play, we set out to find the best way to benchmark it. Unfortunately, built-in benchmark modes in games are few and far between, and Rocket League, partly because it is built on the Unreal 3 engine, does not have one. In this case, we have to develop a consistent run and record the frame rate.

Read our initial analysis of our Rocket League benchmark on low-end graphics here.

With Rocket League, there is no benchmark mode, so we have to perform a series of automated actions, similar to a racing game having a fixed number of laps. We take the following approach: Using Fraps to record the time taken to show each frame (and the overall frame rates), we use an automation tool to set up a consistent 4v4 bot match on easy, with the system applying a series of inputs throughout the run, such as switching camera angles and driving around.
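The time-fixed input idea can be sketched as a simple replay loop. This is purely illustrative: `send_input` and the action names are hypothetical stand-ins for whatever automation tool actually drives the game, and the offsets here are compressed so the example runs instantly rather than for a full match.

```python
import time

def run_script(script, send_input, clock=time.monotonic):
    """Replay (offset_seconds, action) pairs at fixed offsets from the
    start, so every benchmark pass issues identical inputs at identical
    times -- the key to a repeatable run without a benchmark mode."""
    start = clock()
    for offset, action in script:
        # Busy-wait (with a short sleep) until this action's timestamp.
        while clock() - start < offset:
            time.sleep(0.001)
        send_input(action)

# A real schedule would span the whole run; these offsets are
# compressed so the sketch executes in a fraction of a second.
SCRIPT = [
    (0.00, "throttle_on"),
    (0.02, "camera_toggle"),
    (0.05, "boost"),
]

issued = []
run_script(SCRIPT, issued.append)  # record actions instead of sending them
```

Because the schedule is data rather than code, the same list can be replayed before every GPU swap, which is what makes run-to-run comparisons meaningful.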

It turns out that this method is nicely indicative of a real bot match, driving up walls, boosting and even putting in the odd assist, save and/or goal, as weird as that sounds for an automated set of commands. To maintain consistency, the commands we apply are not random but time-fixed, and we also keep the map the same (Aquadome, known to be a tough map for GPUs due to water/transparency) and the car customization constant. We start recording just after a match starts, and record for 4 minutes of game time (think 5 laps of a DIRT: Rally benchmark), with average frame rates, 99th percentile and frame times all provided.
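As a rough sketch of how the reported numbers fall out of the raw data, the following assumes we already have a list of per-frame render times in milliseconds (the kind of log a tool like Fraps produces); the input format and function name are our own for illustration, not Fraps' actual file layout.

```python
def frame_stats(frame_times_ms):
    """Return (average FPS, 99th percentile FPS) for one benchmark run."""
    n = len(frame_times_ms)
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = n / total_s  # frames divided by total wall time

    # The 99th percentile frame time is the render time that 99% of
    # frames beat; converting it to FPS gives the "slow frame" floor
    # that average FPS alone hides.
    ordered = sorted(frame_times_ms)
    p99_time_ms = ordered[min(n - 1, int(n * 0.99))]
    p99_fps = 1000.0 / p99_time_ms
    return avg_fps, p99_fps

# Mostly 60 FPS (16.7 ms) frames with a single 30 FPS (33.3 ms) hitch.
times = [16.7] * 99 + [33.3]
avg, p99 = frame_stats(times)
```

A run like this averages about 59 FPS, but the 99th percentile sits near 30 FPS, which is exactly why we report both figures rather than the average alone.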

The graphics settings for Rocket League come in four broad, generic presets: Low, Medium, High and High FXAA. There are advanced settings for shadows and detail; however, for these tests, we keep to the generic presets. For both 1920x1080 and 4K resolutions, we test at the High preset with the frame rate uncapped.

For all our results, we show the average frame rate at 1080p first. Mouse over the other graphs underneath to see 99th percentile frame rates and 'Time Under' graphs, as well as results for other resolutions. All of our benchmark results can also be found in our benchmark engine, Bench.
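The 'Time Under' graphs can be understood as the total time a run spends below a given FPS threshold. The sketch below is our reading of that metric, computed from per-frame times in milliseconds; the article does not spell out the exact formula, so treat the function as an assumption.

```python
def time_under(frame_times_ms, fps_threshold):
    """Total seconds of a run spent on frames slower than the threshold.

    A frame "misses" the threshold when its render time exceeds the
    frame budget (1000 ms / threshold), e.g. 33.3 ms for 30 FPS.
    """
    cutoff_ms = 1000.0 / fps_threshold
    return sum(t for t in frame_times_ms if t > cutoff_ms) / 1000.0

# 200 frames at 16.7 ms (~60 FPS) followed by ten 40 ms hitches.
run = [16.7] * 200 + [40.0] * 10
```

Note the sensitivity near the boundary: against a 30 FPS threshold only the ten hitched frames count (0.4 s), but against a 60 FPS threshold even the steady 16.7 ms frames fall fractionally short of the 16.67 ms budget, so almost the entire run registers as "under".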

MSI GTX 1080 Gaming 8G Performance (1080p, 4K)

ASUS GTX 1060 Strix 6GB Performance (1080p, 4K)

Sapphire R9 Fury 4GB Performance (1080p, 4K)

Sapphire RX 480 8GB Performance (1080p, 4K)

Rocket League Notes on GTX

The map we use in our testing, Aquadome, is known to be strenuous on a system, hence we see lower frame rates than people might expect from Rocket League - we are deliberately covering the worst-case scenario. But the results also show that AMD CPUs and NVIDIA GPUs do not seem to be playing ball with each other, which we have been told is likely related to drivers.

140 Comments

  • Ian Cutress - Thursday, July 27, 2017 - link

    AMD doesn't use the R3 / R5 / R7 nomenclature - that's for graphics.
  • Gothmoth - Thursday, July 27, 2017 - link

    i don't care about gaming or heating my house with a cpu..... so ryzen makes more sense for me. :)

    x299 was such a disappointment.
  • MrCommunistGen - Thursday, July 27, 2017 - link

    Ian, first off, thanks for the benchmark numbers! I look forward to seeing the rest once they are completed.

    As far as data is concerned, is there a chance that the DigiCortex results have the wrong numbers next to a couple CPUs?

    I'm specifically looking at the i3 7100 being the fastest Intel CPU at 0.63, compared to the rest of the offerings clustering together at 0.37-0.38. To me it looks like the 0.63 should be the i5 7400 and the 7100 should be with the other dual cores.

    On another note, it looks like the RoTR Geothermal Valley scene really HATES AMD's HyperThreading - at least on Nvidia hardware/drivers. At first I thought there might be another set of numbers transposed somewhere since the Ryzen 3 CPUs perform SO MUCH better than the 1500X. But I looked back at the 1600X review and the numbers seem consistent -- bad performance on HyperThreaded AMD on a GTX 1080. Prophet's Tomb seems to behave better. Just shows how much architecture and software optimizations for said architecture can either oppose or complement each other.

    As for small typos, there's also a couple spots where the 1200 is referred to as "1200X". There was another one I found during my initial read that I can't find now that I'm commenting.
  • MrCommunistGen - Thursday, July 27, 2017 - link

    Not the typo I was looking for, but I just noticed that the intro/description for Civ6 looks like it has a typo I've missed in previous articles:
    "...but every edition from the second to the sixth, including the fifth as voiced by the late Leonard Nimoy, it a game that is easy to pick up, but hard to master."

    "it a game" should probably be "is a game"

    Not a criticism, just trying to help out where I can. :)
  • MrCommunistGen - Thursday, July 27, 2017 - link

    Gah... brain fart this morning. Please read my references to AMD "HyperThreading" as "SMT"... smh
  • Ian Cutress - Thursday, July 27, 2017 - link

    i3 7100 should be 0.363x on DigiCortex. I've corrected three 7100 results today in our database from my personal master copy. I think I'll have to go through them all and double check.

    RoTR Geothermal on 1080p with a GTX 1080 really loves quad cores without hyperthreading, AMD or Intel. I'm not sure what it is with that test on that benchmark - in our KBL-X review, all the i5s got top results by a good margin. I think it's been optimized specifically for quad-core, or there's something iffy in the game code/drivers.

    Appreciate the typo point outs for sure. These things are always last minute and you can never have too many eyes on it. :)
  • DanGer1 - Thursday, July 27, 2017 - link

    The review is lacking, especially the value charts. Ryzens come with a cooler, their motherboards cost less and they are overclockable. Adjusting the cost for the motherboard and the cooler changes the value charts significantly in R3's favor. Overclocking on stock air makes performance and value a no contest in favor of the R3s.
  • MajGenRelativity - Thursday, July 27, 2017 - link

    Intel's processors also come with a cooler.
  • wallysb01 - Thursday, July 27, 2017 - link

    Basic 1151 boards that would go into i3/Pentium builds are really not much more, if at all, than the lower end AM4 boards. Plus, the Intel stuff has an iGPU, and if you're buying a low end desktop, you probably don't care a lot about heavily multithreaded workloads. So, I'd actually argue the i3/Pentiums are being undersold in the value charts.

    It's kinda funny how the landscape has switched, in that Intel might actually be the better low-end value winner, while AMD is the best mid/mid-high end value winner.
  • Gothmoth - Thursday, July 27, 2017 - link

    +1 for overclocking.
    the tested intel cpus are sure not k models.

    as for intel having internal GPU.. i never used them not even on my cheapest system builds.
