Rocket League

Hilariously simple pick-up-and-play games are great fun. I'm a massive fan of the Katamari franchise for that reason: pressing Start on a controller and rolling around, picking things up to get bigger, is about as simple as it gets. Until we get a PC version of Katamari that I can benchmark, we'll focus on Rocket League.

Rocket League is exactly that sort of pick-up-and-play title: users can jump into a match with other people (or bots) and play football with cars, with essentially no rules. The game is built on Unreal Engine 3, which is somewhat old at this point, but that allows it to run on super-low-end systems while still taxing the big ones. Since its release in 2015 it has sold over 5 million copies and has become a fixture at LANs and game shows. Serious players train in teams and leagues, and with very few settings to configure, everyone competes on a level playing field. Rocket League is quickly becoming one of the favored titles for e-sports tournaments, especially as contests can be viewed directly from the game interface.

Based on these factors, plus the fact that it is an extremely fun title to load and play, we set out to find the best way to benchmark it. Unfortunately, automatic benchmark modes in games are few and far between, and Rocket League, built as it is on Unreal Engine 3, does not have one. In this case, we have to develop a consistent run and record the frame rate ourselves.

Read our initial analysis of our Rocket League benchmark on low-end graphics here.

With no benchmark mode available, we have to perform a series of automated actions instead, similar to a racing game running a fixed number of laps. We take the following approach: using Fraps to record the time taken to render each frame (and the overall frame rates), we use an automation tool to set up a consistent 4v4 bot match on Easy, with the system applying a series of scripted inputs throughout the run, such as switching camera angles and driving around.
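
For illustration, the time-fixed input replay can be sketched as a simple scheduler. Everything here (the action names, the `dispatch` hook, the 10 ms polling interval) is a hypothetical stand-in for whatever the actual automation tool does, not our real script:

```python
import time

# Hypothetical time-fixed input script: (seconds into the run, action).
# Action names are illustrative, not Rocket League's actual bindings.
INPUT_SCRIPT = [
    (0.0, "throttle_on"),
    (5.0, "boost"),
    (12.0, "switch_camera"),
    (20.0, "steer_left"),
]

def due_actions(script, last_t, now_t):
    """Return the actions whose timestamps fall in (last_t, now_t]."""
    return [action for t, action in script if last_t < t <= now_t]

def run(script, dispatch, duration_s):
    """Replay the script in real time, calling dispatch(action) per input."""
    start = time.monotonic()
    last = -1.0
    while (elapsed := time.monotonic() - start) < duration_s:
        for action in due_actions(script, last, elapsed):
            dispatch(action)
        last = elapsed
        time.sleep(0.01)  # 10 ms polling keeps the timing repeatable enough
```

Because the timestamps are fixed rather than random, every run issues the same inputs at the same points in the match, which is what makes back-to-back runs comparable.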

It turns out that this method is nicely indicative of a real bot match, driving up walls, boosting and even putting in the odd assist, save and/or goal, as weird as that sounds for an automated set of commands. To maintain consistency, the commands we apply are not random but time-fixed, and we also keep the map the same (Aquadome, known to be a tough map for GPUs due to water/transparency) and the car customization constant. We start recording just after a match starts, and record for 4 minutes of game time (think 5 laps of a DIRT: Rally benchmark), with average frame rates, 99th percentile and frame times all provided.
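
The summary numbers we report (average frame rate, 99th percentile, frame times) all fall out of the per-frame render times that Fraps logs. A minimal sketch of the arithmetic, assuming a list of frame times in milliseconds; the percentile convention used here is one reasonable choice, not necessarily the exact one in our tooling:

```python
import math

def benchmark_stats(frametimes_ms):
    """From per-frame render times (ms), compute average FPS,
    99th-percentile FPS, and the mean frame time in ms."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    # 99th-percentile frame time: the FPS sustained 99% of the time
    ordered = sorted(frametimes_ms)
    idx = min(len(ordered) - 1, math.ceil(0.99 * len(ordered)) - 1)
    p99_fps = 1000.0 / ordered[idx]
    mean_ms = sum(frametimes_ms) / len(frametimes_ms)
    return avg_fps, p99_fps, mean_ms

# Example: a 4-minute run at a perfectly steady 60 fps
frames = [1000.0 / 60.0] * (60 * 240)
avg, p99, mean = benchmark_stats(frames)
```

On a steady run like the example, all three numbers agree; on a real run, the gap between the average and the 99th percentile is what reveals stutter.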

The graphics settings for Rocket League come via four broad presets: Low, Medium, High and High FXAA. There are advanced settings for shadows and detail; however, for these tests we keep to the generic presets. For both 1920x1080 and 4K resolutions, we test at the High preset with the frame cap set to unlimited.

All of our benchmark results can also be found in our benchmark engine, Bench.

ASRock RX 580 Performance

Rocket League (1080p, Ultra)


112 Comments


  • wr3zzz - Monday, June 11, 2018 - link

    K-series CPUs don't come with coolers.
  • rocky12345 - Monday, June 11, 2018 - link

    They used to, but Intel coolers are so bad that no one used them, so instead of making one that was usable for the K CPUs they just stopped including them. At least the other guys still include them, and 2 of the 3 are actually usable as coolers. Personally I would rather have some sort of cooler included so I'd at least be up and running if the high-end air or water cooler was on back order or waiting on shipping; at least I could get the system built and running.
  • Flunk - Monday, June 11, 2018 - link

    The ones they sent out with the older -K series processors were a joke. My i5-2500K came with a cooler that couldn't even cool it within Intel's specs running stock in a cold room.
  • mkaibear - Tuesday, June 12, 2018 - link

    I'm still using the one which came with my 4790K and it works fine, and the one my 2500K came with also worked fine when I had it, even at 30C ambient temps in the middle of summer.

    Probably an installation error there, Flunk.

    (yes, I bought K series processors and never overclocked them, for both of these my intention was to downclock them for reduced heat and noise but never got round to it with the 2500K and the 4790K didn't really downclock very well so I couldn't be bothered!)
  • jimmysmitty - Friday, June 15, 2018 - link

    Absolutely incorrect. I installed tons of the stock Intel coolers on i5s and i7s and they work as specified at the CPUs' stock settings, plus they were normally very quiet.
  • SirMaster - Monday, June 11, 2018 - link

    "K" CPUs don't come with heatsinks or fans... Neither does the 8700K or 8600K or 7700K, etc.
  • Matthmaroo - Monday, June 11, 2018 - link

    It’s been a while for you, I see - K-series CPUs have no cooler.
  • Memo.Ray - Monday, June 11, 2018 - link

    As I mentioned in my comment in the other article a couple of days ago:

    Intel managed to give away 8086 "binned" 8700K (AKA 8086K) and still make some money on top of it. win-win situation :D

    https://www.anandtech.com/comments/12940/intels-co...
  • jimmysmitty - Friday, June 15, 2018 - link

    And you miscalculated because you used the i7 8700 cost not the 8700K cost. They made maybe $300K on them.

    You know I have never seen anyone complain about say a 40th anniversary version of a car.
  • just4U - Wednesday, June 13, 2018 - link

    If it were more similar to the 4790K with a better thermal design (think Devil's Canyon...) it's something I'd be interested in over the 8700K. It's not tho… and doesn't even come with a specialty cooler that might pique interest… but rather "NO COOLER" at all… I dunno…

    I think Intel missed the boat with this one.
