iGPU Gaming Performance, Continued

Rise of the Tomb Raider

One of the most comprehensive games in the gaming benchmark suite is Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics and the sequel to the popular Tomb Raider, which was loved for its automated benchmark mode. But don't let that fool you: the benchmark mode in RoTR is very different this time around.

Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further when it comes to graphics fidelity. This leads to an interesting set of hardware requirements: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how well the driver translates the DirectX 12 workload.

[Charts: Rise of the Tomb Raider at 1080p, three scenes (1-Valley, 2-Prophets, 3-Mountain), each reporting Average Frame Rate, 99th Percentile, and Time Under 30 FPS]

The GT 1030 sweeps the top spot against AMD here, though only by small margins most of the time. The AMD APUs still offer a commanding 2-3x performance jump over Intel's product line, and even more when price is factored into the equation.

Rocket League

Hilariously simple and embodying the elements of pick-up-and-play, Rocket League lets users jump into a match with other people (or bots) to play football with cars, with zero rules. The title is built on Unreal Engine 3, which is somewhat old at this point, but that allows the game to run on super-low-end systems while still taxing the big ones. Since its release in 2015, it has sold over five million copies and has become a fixture at LANs and game shows. Serious players compete in teams and leagues, and with very few settings to configure, everyone is on the same level. Rocket League is quickly becoming one of the favored titles for e-sports tournaments, especially since matches can be viewed directly from the game interface.

With Rocket League, there is no benchmark mode, so we have to perform a series of automated actions, similar to a racing game having a fixed number of laps. We take the following approach: an automation tool sets up a consistent 4v4 bot match on easy difficulty and applies a series of inputs throughout the run, such as switching camera angles and driving around, while Fraps records the time taken to render each frame (and the overall frame rates).
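The article does not name the automation tool it uses, so purely as an illustration, here is a hypothetical Python sketch of how repeatable inputs could be scripted during a run using the third-party pyautogui library. The key bindings, timings, and run length are all made up for the example and are not the values used in testing.

```python
import time
import pyautogui  # third-party: pip install pyautogui

# Hypothetical input script: the actual tool, key bindings, and timings used
# for the article's runs are not disclosed, so every value here is illustrative.
RUN_SECONDS = 240          # assumed length of the recorded benchmark run
CAMERA_SWAP_INTERVAL = 30  # assumed interval for switching camera angles

def benchmark_run():
    start = time.time()
    next_camera_swap = start + CAMERA_SWAP_INTERVAL
    while time.time() - start < RUN_SECONDS:
        # Drive forward for a couple of seconds, then make a small steering
        # input, so the car keeps moving and runs stay consistent.
        pyautogui.keyDown("w")
        time.sleep(2.0)
        pyautogui.keyUp("w")
        pyautogui.press("a")

        # Periodically switch the camera view (example binding only).
        if time.time() >= next_camera_swap:
            pyautogui.press("y")
            next_camera_swap += CAMERA_SWAP_INTERVAL

if __name__ == "__main__":
    time.sleep(5)  # short pause to bring the game window into focus
    benchmark_run()
```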

[Charts: Rocket League at 1080p, reporting Average Frame Rate, 99th Percentile, and Time Under 30 FPS]

As the most eSports-oriented title in our testing, Rocket League is less graphically intense than the others, and being built on DX9 it also tends to benefit from good single-threaded performance. The GT 1030 wins again here, most noticeably in the 99th percentile numbers, but the AMD chips are hitting 30 FPS in that percentile graph, whereas last generation they were getting 30 FPS on average. That is a reasonable step up in performance, aided by both the improved graphics and the high-performance x86 cores. It will be interesting to see how memory speed changes the results here.
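For reference, all three numbers in the charts (the average frame rate, the 99th percentile figure, and the time spent under 30 FPS) can be derived from a per-frame timing log like the one Fraps produces. Below is a minimal sketch, assuming a Fraps-style frametimes CSV with a header row and cumulative timestamps in milliseconds in the second column; the file name and column layout are assumptions, not the exact processing pipeline used for the article.

```python
import csv

def load_frame_times(path):
    """Read a Fraps-style frametimes CSV (frame index, cumulative ms)
    and return per-frame durations in milliseconds."""
    timestamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps.append(float(row[1]))
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def summarise(frame_times_ms):
    """Compute average FPS, 99th percentile FPS, and time under 30 FPS."""
    total_ms = sum(frame_times_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / total_ms

    # 99th percentile: take the frame time that only 1% of frames exceed,
    # then convert it back to an instantaneous frame rate.
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, int(len(ordered) * 0.99))
    p99_fps = 1000.0 / ordered[index]

    # Time under 30 FPS: one common definition is the total time spent on
    # frames that took longer than 1000/30 ms (33.3 ms) to render.
    under_30_ms = sum(t for t in frame_times_ms if t > 1000.0 / 30.0)
    return avg_fps, p99_fps, under_30_ms

if __name__ == "__main__":
    avg, p99, under30 = summarise(load_frame_times("frametimes.csv"))
    print(f"Average: {avg:.1f} FPS, 99th percentile: {p99:.1f} FPS, "
          f"time under 30 FPS: {under30 / 1000.0:.1f} s")
```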

177 Comments

  • Lolimaster - Monday, February 12, 2018 - link

    I would get the Asus X370 pro and the G.Skill Flare X 3200 CL14 (ram is expensive no matter how "cheap" you wanna go)
  • coolhardware - Monday, February 12, 2018 - link

    Thank you for the recommendation!!! :-)
  • kaidenshi - Tuesday, February 13, 2018 - link

    I'm using the ASRock AB350M Pro4 with a Ryzen 3 1300X, 16GB Crucial Ballistix 2400MHz DDR4 memory, and a GTX 1060 SC. It's been a rock solid board so far, and it has two PCI-E storage slots (one is NVMe, the other is SATA) so you can use it comfortably in a case with limited storage options.

    I was nervous about it after I read some reviews on Newegg talking about stability issues, but it turned out pretty much all of those people were trying to overclock it far beyond its rated capabilities. It's perfectly stable if you don't try to burn it up on purpose.
  • Samus - Monday, February 12, 2018 - link

    Seriously. It's now obvious why Intel is using AMD graphics. Considering that it's mostly on par (sometimes faster, sometimes slower) with a GT 1030, a $100 GPU that uses 30 watts on its own, Intel made the right choice using VEGA.
  • Flunk - Monday, February 12, 2018 - link

    Wow, those are some impressive numbers for the price point (either of them). I think the R5 2400G would cover the vast majority of users' CPU and GPU needs to the point where they wouldn't notice a difference from anything more expensive. Anyone short of a power user or hardcore gamer could buy one of these and feel like they'd bought a real high-end system, with a $169.99 CPU. That's real value. I kinda want one to play around with; I don't know how I'll justify that to myself... Maybe I'll give it to my father next Christmas.
  • jjj - Monday, February 12, 2018 - link

    Was hoping to see GPU OC performance and power; it won't scale great unless the memory controller can take faster sticks (than Summit Ridge), but we still need to figure it all out.
  • iter - Monday, February 12, 2018 - link

    Most other sites' reviews feature overclocking and power.
  • Ian Cutress - Monday, February 12, 2018 - link

    I started an initial run with higher speed memory, but nothing substantial enough to put in the article just yet. I'm planning some follow ups.
  • jjj - Monday, February 12, 2018 - link

    Looking forward to all of that.

    Anyway, they do deliver here for folks who can't afford a discrete card or have other reasons to go with integrated. Even the 2400G is OK if one needs 8 threads.
  • Kamgusta - Monday, February 12, 2018 - link

    Where is the i5-8400, which has the same price as the 2400G?
    Oh, yeah, they totally left it out of the benchmarks since it would have proved the absolute supremacy of the Intel offering.
    Oops.
