Discrete Graphics Performance

As stated on the first page, here we take both APUs from 3.5 GHz to 4.0 GHz in 100 MHz increments and run our testing suite at each stage. This is a 14.3% increase in clock speed; however, when it comes to gaming, it can be unpredictable where those gains will come from.
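To put that arithmetic in one place, here is a quick Python sketch (values from the text, nothing measured) walking the same 100 MHz steps:

```python
# Relative clock-speed gain at each tested step (3.5 GHz base, 100 MHz increments).
base_clock = 3.5  # GHz

for step in range(6):  # 3.5, 3.6, ... 4.0 GHz
    clock = base_clock + 0.1 * step
    gain = (clock - base_clock) / base_clock * 100
    print(f"{clock:.1f} GHz: +{gain:.1f}% over base")

# The final step prints +14.3%, the figure quoted above.
```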

For our gaming tests, we are only concerned with real-world resolutions and settings for these games. It would be fairly easy to adjust the settings in each game to a CPU-limited scenario; however, the results from such a test are, in our view, mostly pointless and non-transferable to the real world. Scaling takes many forms, based on GPU, resolution, detail levels, and settings, so we want to make sure the results correlate to what users will see day-to-day.

Civilization 6

First up in our CPU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer overflow. Truth be told, I never actually played the first version, but having played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, it is a game that is easy to pick up but hard to master.

[Graphs: Civilization 6 on ASUS GTX 1060 Strix 6GB - Average Frames Per Second / 99th Percentile]

Despite showing no great scaling on integrated graphics, Civilization gets a good bump from frequency scaling the minute we move up to our discrete GPU. Average frame rates climb by +7.0% on the 2400G and +9.7% on the 2200G. The percentile numbers vary on the 2400G, but the 2200G sees a distinct +10.6% gain.
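For anyone unfamiliar with the two metrics in these charts: average FPS is derived from the mean frame time, while the 99th-percentile figure converts a near-worst-case frame time into an FPS value, so it captures stutter that an average hides. A minimal sketch with made-up frame times (the real numbers come from each game's frame-time log):

```python
import numpy as np

# Hypothetical per-frame render times in milliseconds, for illustration only.
frame_times_ms = np.array([14.2, 15.1, 13.8, 22.5, 14.9, 16.0, 14.4, 30.1])

avg_fps = 1000.0 / frame_times_ms.mean()      # throughput metric
p99_ms = np.percentile(frame_times_ms, 99)    # near-worst frame time
p99_fps = 1000.0 / p99_ms                     # smoothness metric

print(f"Average: {avg_fps:.1f} FPS, 99th percentile: {p99_fps:.1f} FPS")
```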

Ashes of the Singularity (DX12)

Seen as the poster child of DX12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DX12 features as it possibly could. Oxide Games, the developer behind the Nitrous engine that powers the game (published by Stardock), has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

[Graphs: Ashes of the Singularity on ASUS GTX 1060 Strix 6GB - Average Frames Per Second / 99th Percentile]

AoTS is again a little all over the place: technically there's an 8% gain in frame rates for the 2400G, however the 2200G seems to fluctuate a bit more. The better scaling on the 2200G seems a bit startling too: with only four threads, each thread gets more memory bandwidth and more core resources than when eight threads share the chip. This might improve certain latencies in the instruction stream, although it is surprising to see such a big change.
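To put the per-thread bandwidth point in rough numbers, a back-of-the-envelope sketch (assuming dual-channel DDR4-2933, the officially supported speed on these APUs; purely illustrative):

```python
# Peak theoretical memory bandwidth: channels x bus width (bytes) x transfer rate.
channels, bus_bytes, transfers = 2, 8, 2933e6
total_gb_s = channels * bus_bytes * transfers / 1e9  # ~46.9 GB/s

for threads in (4, 8):  # 2200G (4C/4T) vs 2400G (4C/8T)
    print(f"{threads} threads: ~{total_gb_s / threads:.1f} GB/s per thread")
```

With eight threads active, each thread's share of peak bandwidth is roughly half that of the four-thread part, which is the effect hinted at above.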

Rise Of The Tomb Raider (DX12)

One of the newest games in the gaming benchmark suite is Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics, the sequel to the popular Tomb Raider, which was loved for its automated benchmark mode. But don’t let that fool you: the benchmark mode in RoTR is very different this time around.

Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further when it comes to graphics fidelity. This leads to an interesting set of requirements in hardware: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how the driver can translate the DirectX 12 workload.

[Graphs: Rise of the Tomb Raider on ASUS GTX 1060 Strix 6GB - Average Frames Per Second / 99th Percentile]

RoTR sees a small +3% gain in average frame rates going up to 4.0 GHz, but it is the percentiles that get the biggest boost, showing +17.9% on the 2400G.

Comments (29)

  • eastcoast_pete - Thursday, June 21, 2018 - link

    I hear your point. What worries me about buying a second-hand GPU, especially nowadays, is that there is no way to know whether it was used to mine crypto 24/7 for the last 2-3 years. Semiconductors can wear out when run for thousands of hours both overvolted and above normal temps, and that can really affect not just the GPU, but especially also the memory.
    The downside of a 980 or 970 (which weren't as much at risk for cryptomining) is the now outdated HDMI standard. But yes, just for gaming, they can do the job.
  • Lolimaster - Friday, June 22, 2018 - link

    A CM Hyper 212X is cheap and it's one of the best bang-for-buck coolers. 16GB of RAM is expensive if you want 2400 or 3000 CL15. 8GB is just too low: the iGPU needs some of it, and many games (2015+) already need 6GB+ of system memory.
  • eastcoast_pete - Thursday, June 21, 2018 - link

    Thanks for the link! Yes, those results are REALLY interesting. They used stock 2200G and 2400G, no delidding, no undervolting of the CPU, and on stock heatsinks, and got quite an increase, especially when they also used faster memory (to OC memory speed also). The downside was a notable increase in power draw and the stock cooler's fan running at full tilt.
    So, Gavin's delidded APUs with their better heatsinks should do even better. The most notable thing in that German article was that the way to the overclock mountain (stable at 1600 MHz on the stock cooler etc.) led through a valley of tears, i.e. the APUs crashed reliably when the iGPU was mildly overclocked, but then became stable again at higher iGPU clock speeds and voltage. They actually got a statement from AMD that AMD knows about that strange behavior, but apparently has no explanation for it. But then - running more stable if I run it even faster - bring it!
  • 808Hilo - Friday, June 22, 2018 - link

    An R3 is not really an amazing feat. It's a defective R7 with core, lane, fabric, or pinout defects. The rest of the chip is run at low speed because the integrity is affected. Not sure anyone is getting their money's worth here.
  • Lolimaster - Friday, June 22, 2018 - link

    I don't get these nonsense articles on an APU where the MAIN STAR IS THE IGPU. On some builds there are mixed results when the GPU frequency jumped around 200-1200 MHz (hence some funny low 0.1-1% lows in benchmarks).

    It's all about OCing the iGPU, forgetting about the CPU part, and addressing/fixing the iGPU clock rubber-band effect: sometimes disabling boost for the CPU, increasing SoC voltage, etc.
  • Galatian - Friday, June 22, 2018 - link

    I'm going to question the results a little bit. To me it looks like the only "jump" in performance you get in games occurs whenever you hit an OC over the standard boost clock, e.g. 3700 MHz on the 2400G. I would suspect that you are simply preventing some core parking or some other aggressive power-management feature while applying the OC. That would explain the odd numbers when you increase the OC.

    That being said, I would say a CPU OC doesn't really make sense. An undervolting test to see where the sweet spot lies would be nice, though.
  • melgross - Monday, June 25, 2018 - link

    Frankly, the result of all these tests seems to be that overclocking isn’t doing much of anything useful, at least not at the small amounts we see here with AMD.

    5% is never going to be noticed. Several studies done a number of years ago showed that you need at least an overall 10% improvement in speed for it to even be noticeable. 15% would be barely noticeable.

    For heavy database workloads that take place over hours, or long rendering tasks, it will make a difference, but for gaming, which this article is overly interested in, nada!
  • Allan_Hundeboll - Monday, July 2, 2018 - link

    Benchmarks @46W cTDP would be interesting
  • V900 - Friday, September 28, 2018 - link

    The 2200G makes sense for an absolute budget system.

    (Though if you're starting from rock bottom and also need to buy a cabinet, motherboard, RAM, etc., you'll probably be better off taking that money and buying a used computer. You can get some really good deals for less than $500.)

    The 2400G however? Not so much. The price is too high and the performance too low to compete with an Intel Pentium/Nvidia 1030 solution.

    Or if you want to spend a few dollars more and find a good deal: An Intel Pentium/Nvidia 1050.
