Rocket League

Hilariously simple pick-up-and-play games are great fun. I'm a massive fan of the Katamari franchise for that reason: pressing start on a controller and rolling around, picking things up to get bigger, is about as simple as gaming gets. Until we get a PC version of Katamari that I can benchmark, we'll focus on Rocket League.

Rocket League combines those pick-up-and-play elements, letting users jump into a match with other people (or bots) to play football with rocket-powered cars and almost no rules. The title is built on Unreal Engine 3, which is somewhat old at this point, but that allows the game to run on super-low-end systems while still taxing the big ones. Since its release in 2015 it has sold over 5 million copies and seems to be a fixture at LANs and game shows. Dedicated players get very serious, competing in teams and leagues, and with very few settings to configure, everyone is on the same level. Rocket League is quickly becoming one of the favored titles for e-sports tournaments, especially as contests can be viewed directly from the game interface.

Based on these factors, plus the fact that it is an extremely fun title to load and play, we set out to find the best way to benchmark it. Unfortunately, automatic benchmark modes in games are few and far between, and Rocket League, partly because it is built on Unreal Engine 3, does not have one. In this case, we have to develop a consistent run and record the frame rate ourselves.

Read our initial analysis of our Rocket League benchmark on low-end graphics here.

With Rocket League having no benchmark mode, we have to perform a series of automated actions, similar to a racing game running a fixed number of laps. We take the following approach: using Fraps to record the time taken to render each frame (and the overall frame rates), we use an automation tool to set up a consistent 4v4 bot match on easy difficulty, with the system applying a series of inputs throughout the run, such as switching camera angles and driving around.
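To give an idea of what "a series of inputs" means in practice, below is a minimal sketch of a time-fixed input replay in Python using pyautogui. The key bindings, timings, and actions are hypothetical stand-ins; this illustrates the approach, not our actual automation tool or run script.

```python
# sketch_run.py - illustrative only, not our actual automation tool.
# Replays a fixed, time-stamped input sequence so every benchmark run
# applies exactly the same actions at exactly the same times.
import time

import pyautogui  # sends OS-level key events to the focused game window

# (seconds_from_match_start, key, seconds_to_hold)
# Key bindings below are hypothetical stand-ins for the real run script.
SCRIPT = [
    (0.0,  "w",     3.0),   # accelerate off the kickoff
    (3.2,  "a",     0.5),   # steer left toward the ball
    (4.0,  "space", 0.1),   # jump
    (6.0,  "c",     0.1),   # switch camera angle
    (8.0,  "d",     0.8),   # steer right, up the wall
]

def press(key, hold):
    """Hold a key down for a fixed duration, then release it."""
    pyautogui.keyDown(key)
    time.sleep(hold)
    pyautogui.keyUp(key)

def run(script=SCRIPT):
    start = time.time()
    for offset, key, hold in script:
        # Sleep until the event's fixed timestamp; inputs are time-fixed,
        # not random, so repeat runs are directly comparable.
        delay = offset - (time.time() - start)
        if delay > 0:
            time.sleep(delay)
        press(key, hold)

if __name__ == "__main__":
    time.sleep(5)  # grace period to focus the game window
    run()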

It turns out that this method is nicely indicative of a real bot match, with the car driving up walls, boosting, and even putting in the odd assist, save, and/or goal, as weird as that sounds for an automated set of commands. To maintain consistency, the commands we apply are time-fixed rather than random, and we also keep the map (Aquadome, known to be a tough map for GPUs due to its water and transparency) and the car customization constant. We start recording just after a match starts and record for 4 minutes of game time (think 5 laps of a DIRT: Rally benchmark), with average frame rates, 99th percentiles, and frame times all provided.
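For reference, here is a minimal sketch of how those numbers can be derived from a Fraps frametimes log. It assumes the usual layout of the Fraps frametimes CSV (a header row, then a frame index and a cumulative timestamp in milliseconds); treat the file name and parsing details as assumptions rather than our production tooling.

```python
# metrics.py - sketch of computing average FPS, 99th percentile, and frame
# times from a Fraps "frametimes" CSV (assumed layout: "Frame, Time (ms)",
# where Time is a cumulative timestamp in milliseconds).
import csv
import statistics

def load_timestamps(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                       # skip the header row
        return [float(row[1]) for row in reader]

def summarize(path):
    ts = load_timestamps(path)
    # Per-frame render times are the deltas between consecutive timestamps.
    frametimes = [b - a for a, b in zip(ts, ts[1:])]
    total_s = (ts[-1] - ts[0]) / 1000.0
    avg_fps = len(frametimes) / total_s
    # 99th percentile frame time: only 1% of frames took longer than this.
    p99_ms = statistics.quantiles(frametimes, n=100)[98]
    return {
        "avg_fps": avg_fps,
        "99th_pct_frametime_ms": p99_ms,
        "99th_pct_fps": 1000.0 / p99_ms,   # the FPS only 1% of frames dip below
    }

if __name__ == "__main__":
    print(summarize("frametimes.csv"))  # hypothetical log file name
```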

The graphics settings for Rocket League come in four broad, generic presets: Low, Medium, High, and High FXAA. Advanced settings are available for shadows and detail; however, for these tests we keep to the generic presets. At both 1920x1080 and 4K we test at the High preset with an unlimited frame cap.
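Pinning the settings between cards and driver installs is part of the consistency story. Below is a small illustrative sketch that checks resolution keys in Rocket League's UE3 config file before a run; the path is the game's usual config location, but the key names (ResX/ResY) and expected values here are assumptions for illustration, not a documented interface.

```python
# check_settings.py - sketch: verify resolution keys in Rocket League's
# config before a run, so every card is tested at identical settings.
# The ResX/ResY key names are assumptions for illustration.
from pathlib import Path

CONFIG = (Path.home() / "Documents" / "My Games" / "Rocket League"
          / "TAGame" / "Config" / "TASystemSettings.ini")

EXPECTED = {"ResX": "1920", "ResY": "1080"}

def check(path=CONFIG, expected=EXPECTED):
    found = {}
    # Scan line by line; UE3 ini files can repeat keys, so we avoid
    # configparser and just record the values we care about.
    for line in path.read_text().splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            if key.strip() in expected:
                found[key.strip()] = value.strip()
    for key, want in expected.items():
        got = found.get(key)
        status = "OK" if got == want else f"MISMATCH (got {got})"
        print(f"{key}={want}: {status}")

if __name__ == "__main__":
    check()
```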

All of our benchmark results can also be found in our benchmark engine, Bench.

ASRock RX 580 Performance

[Benchmark charts: Rocket League (1080p, High)]

Comments

  • ipkh - Monday, June 11, 2018

    The multiplier chart doesn't make sense.
    The single-core turbo is 5 GHz, but Intel is quoting 4.7 GHz all-core, and you're showing 4.4 GHz, identical to the 8700K. I understand the base frequencies are the same, but the default multipliers for the 8086K should be higher. Is this a possible BIOS glitch, or is the multiplier table in the CPU not correct?
  • Hxx - Monday, June 11, 2018

    Boost frequencies are all the same on 5 of the cores. There is a YouTube video of somebody testing this chip on a Z370 Gaming 7, and you can clearly see in that video that the boost is the same on all cores except one. Intel = lame.
  • Ian Cutress - Monday, June 11, 2018

    Where is Intel promoting 4.7 GHz all core?
  • HStewart - Monday, June 11, 2018

    One thing that is strange is the name: the original IBM PC that started this whole PC industry used the Intel 8088 processor, not the Intel 8086. The difference is that the 8088 has an 8-bit external bus and the 8086 a 16-bit external bus, but both CPUs are 16-bit internally. No internal floating-point processor until the 386 line.

    But it's wild that it's been 40 years. I have an original IBM PC in my downstairs closet. I remember, while at Georgia Tech, putting a 2 MB RAM card into it, booting up a 1.4 MB RAM disk, and loading the Microsoft C 3.0 compiler onto it.

    As for the new one, it would be cool if they actually included an original chip as part of a collector's edition.
  • AsParallel - Monday, June 11, 2018

    The 8088 shipped in '79 and was a variant of the 8086. The 8086 was the first to address 1 MB of memory.
  • peevee - Monday, June 11, 2018

    "No internal floating-point processor until the 386 line."

    486. The 386 still used the 387, AFAIR. There was even a 487, but it was just a renamed 486 meant to be installed alongside a 486SX.
  • HStewart - Monday, June 11, 2018

    Yes, I forgot that; the 486 was the one with the integrated math coprocessor.
  • AsParallel - Monday, June 11, 2018

    Addendum: the 8087 was the floating-point coprocessor for the 8086/88.
  • 29a - Monday, June 11, 2018

    You didn't put 2 MB of RAM in an original IBM PC; it supported 256 KB max.
  • HStewart - Monday, June 11, 2018

    I had a special card in the PC with EMS memory that could also backfill main system memory up to 640 KB. Instead of the normal cache mode used by the card, I configured it as a RAM drive, since memory above 640 KB wasn't directly accessible by the system.
