Ryzen 5 2400G Integrated Graphics Gaming Performance

As mentioned in the introduction, we overclocked both APUs from 1100 MHz to 1600 MHz in 50 MHz increments. For our gaming tests, we are primarily concerned with the most common settings associated with gaming, including resolution: while the Ryzen APUs are marketed for 720p gaming, and resolutions such as 1440p and 2160p are out of reach purely for performance reasons, we have opted to use moderate settings at 1080p. As also stated, frequencies between 1400 MHz and 1500 MHz have been omitted due to instability; we have reached out to AMD about the potential issue and will update this piece when we hear back.
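For reference, the frequency points in the sweep can be written out programmatically. The short Python sketch below is purely illustrative; treating the skipped range as inclusive of 1400 MHz and 1500 MHz is our reading of the description above.

```python
# Illustrative sketch: enumerate the iGPU frequency steps used in this article,
# 1100-1600 MHz in 50 MHz increments, omitting the unstable 1400-1500 MHz range.
def igpu_sweep(start_mhz=1100, stop_mhz=1600, step_mhz=50, skip=(1400, 1500)):
    return [f for f in range(start_mhz, stop_mhz + step_mhz, step_mhz)
            if not (skip[0] <= f <= skip[1])]

print(igpu_sweep())
# [1100, 1150, 1200, 1250, 1300, 1350, 1550, 1600]
```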

Civilization 6

First up in our APU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civilization series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer underflow.

Truth be told, I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy. It is a game that is easy to pick up, but hard to master.

Civilization 6 on iGPU, Average Frames Per Second - 2400G

Civilization 6 on iGPU, 99th Percentile - 2400G

The 2400G saw a modest bump of around 8% in average frame rates going from 1100 MHz to 1600 MHz. The 99th percentile frame rates are also consistent, but both metrics threw up an anomaly at 1250 MHz, which is actually the default iGPU frequency of the 2400G.
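To put that figure in perspective, the clock uplift itself is far larger than the frame rate gain. A back-of-the-envelope calculation using only the numbers quoted above (the 'scaling efficiency' label is our own shorthand, not a formal metric):

```python
# Compare the iGPU clock increase with the observed average-FPS gain in Civ 6.
base_mhz, oc_mhz = 1100, 1600
clock_gain = (oc_mhz - base_mhz) / base_mhz    # ~45.5% higher iGPU clock
fps_gain = 0.08                                # ~8% average FPS gain reported above

scaling_efficiency = fps_gain / clock_gain     # share of the clock gain realised as FPS
print(f"Clock gain {clock_gain:.1%}, FPS gain {fps_gain:.0%}, "
      f"efficiency {scaling_efficiency:.0%}")
# Clock gain 45.5%, FPS gain 8%, efficiency 18%
```

In other words, only a small fraction of the extra iGPU frequency is converted into extra frames here, suggesting the bottleneck lies elsewhere in the system.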

Ashes of the Singularity (DX12)

Seen as the holy child of DX12, Ashes of the Singularity (AoTS, or just Ashes) has been the first title to actively go and explore as many of the DX12 features as it possibly can. Stardock, the developer behind the Nitrous engine which powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

AoTS on iGPU, Average Frames Per Second - 2400G

AoTS on iGPU, 99th Percentile - 2400G

Ashes of the Singularity gave an improvement of 8.5% in average frame rates when going from 1100 MHz to 1600 MHz on the Vega 11 iGPU. The 99th percentile figures also increased by around 8% from bottom to top, albeit a little shaky at the default iGPU clock of the 2400G.

Rise Of The Tomb Raider (DX12)

Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics, is the sequel to the popular Tomb Raider reboot, which was loved for its automated benchmark mode. But don’t let that fool you: the benchmark mode in RoTR is very much different this time around. Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further when it comes to graphics fidelity.

This leads to an interesting set of requirements in hardware: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how the driver can translate the DirectX 12 workload.

RoTR on iGPU, Average Frames Per Second - 2400G

RoTR on iGPU, 99th Percentile - 2400G

The 2400G experienced a consistent improvement as the iGPU was overclocked at each 50 MHz increment, with a 13.6% gain in average frame rates and a 12.2% gain in the 99th percentile from 1100 MHz to 1600 MHz.
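Pulling the three titles together, the same back-of-the-envelope comparison (again, our own shorthand rather than anything formal) shows how differently each game responds to the extra iGPU frequency:

```python
# Average-FPS gains quoted in this article for the 2400G, 1100 MHz -> 1600 MHz.
clock_gain = (1600 - 1100) / 1100   # ~45.5% iGPU clock increase
avg_fps_gains = {
    "Civilization 6": 0.080,
    "Ashes of the Singularity": 0.085,
    "Rise of the Tomb Raider": 0.136,
}

for title, gain in avg_fps_gains.items():
    print(f"{title}: {gain:.1%} FPS gain, "
          f"{gain / clock_gain:.0%} of the clock increase realised")
```

Of the three titles tested here, Rise of the Tomb Raider clearly makes the best use of the overclock.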


49 Comments


  • PeachNCream - Friday, September 28, 2018

    Interesting analysis, though it's a bit of a foregone conclusion these days to expect a GPU overclock to improve performance in games more than a CPU overclock since the central processor, after a point, has very little role in increasing framerates.

    This one struck me as odd though - "...Ryzen APUs are marketed for 720p gaming, and while resolutions such as 2160p and 1440p are out of reach purely for performance reasons, we have opted to use moderate settings at 1080p for our testing."

    Were the tests executed at 1080p so they would align better in the Bench? It seems more reasonable to test at 720p given the various limits associated with iGPUs in general and the use of 1080p just comes across as lazy in the same way Anandtech tests CPU performance in games at resolutions so high that GPU performance masks the differences in various processors. Tom's Hardware, back when the good doctor actually ran it, yanked resolution down as low as possible to eliminate the GPU as a variable in CPU tests and it was a good thing.
  • stuffwhy - Friday, September 28, 2018

    Just purely speculating, is it possible that 720p results are just great (60+ fps) and need no testing? One could hope.
  • gavbon - Friday, September 28, 2018

    My reasoning for selecting 1080p gaming tests over 720p was mainly because the other scaling pieces were running at the same resolution. Not just the iGPU tests, but the dGPU testing with the GTX 1060 too. It wasn't a case of being 'lazy', but the majority of gamers who currently use Steam use 1080p, and as it's the most popular resolution for gamers, I figured that's where I would lay it down.
  • neblogai - Friday, September 28, 2018

    Even if the monitor is 1080p, a lot of 2200G users may want to run games at 1080p with resolution scaling, for better fps. In effect, at 720p or 900p. Most games support it these days. So the popularity of 1080p monitors does not really make 720p tests less useful for this level of GPU performance.
  • V900 - Friday, September 28, 2018

    Would be great if you had tested just one game at 720p.

    I know this is what I would be interested in knowing/reading if I was a possible customer.
  • usernametaken76 - Sunday, September 30, 2018

    I honestly think this "majority of gamers who currently use Steam use 1080p" argument is skewed by a) laptop users (see the high number of 1366x768 users), who game at whatever resolution their laptop panel is set to...
    Which leads one to ask what the point of testing desktop parts is when you use that as a basis for what and how to test.
  • TheJian - Friday, October 5, 2018

    agree 100%. They do a lot of dumb testing here. Ryan has been claiming 1440p was the "enthusiast resolution" since Geforce 660ti. I don't even think you can say that TODAY as I'm staring at my two monitors (have a 3rd also 1080p), both of which are 1200p/1080p.

    For me, I need everything turned on, and you need to show where it hits 30fps at that level. Why? Because the designers of the games didn't want you to play their game at "MODERATE SETTINGS"...ROFL. Just ask them. They design EXACTLY WHAT THEY WANT YOU TO SEE. Then for some reason, reviewers ignore this, and benchmark the crap out of situations I'd avoid at all costs. I don't play a game until I can max it on one of my two monitors with my current card. If I want to play something that badly early, I'll buy a new card to do it. All tested resolutions should be MAXED OUT settings wise. Why would I even care how something runs being degraded? Show me the best, or drop dead. This is why I come to anandtech ONLY if I haven't had my fill from everywhere else.

    One more point, they also turn cards, etc., down. Run as sold, PERIOD. If it's an OC card, show it running with a simple checkbox that OC's it the max the card allows as their defaults. IE most have game mode, etc. Choose the fastest settings their software allows out of their usually 3 or 4 default settings. I'm not talking messing with OCing yourself, I mean their chosen 3-4 they give in the software for defaults. Meaning a monkey could do this, so why pretend it isn't shipped to be used like this? What user comes home with an OC card and reverts to NV/AMD default ref card speeds? ROFL. Again, why I dropped this site for the most part. Would you test cars with 3 tires? Nope. They come with 4...LOL. I could go on, but you should get the point. Irrelevant tests are just that.
  • flyingpants265 - Tuesday, March 5, 2019

    Hate to tell you this, but 4k is the enthusiast resolution now.
  • 808Hilo - Saturday, October 13, 2018

    Is it just me or are these tests just for borderline ill people?

    Playing 4k with a 1080/1800/32/S970. Works reasonably well. I also do everything else in 4k. Would I go back to lower res? No way. Artificial benchmarking is one thing, real world is 4k. Test this res and we get a mixed GPU, APU, CPU bench. Build meaningful systems instead of artificially pushing single building blocks. Push for advancements.
  • Targon - Friday, September 28, 2018

    The big problem with these APUs is that they limit the number of PCI Express channels, so if you DO decide to add a video card, the APU in this case will reduce performance, compared to a normal CPU without the graphics.
