Gaming Benchmarks: Integrated Graphics Overclocked

Given the disappointing results on the Intel HD 530 graphics when the processor alone was overclocked, we turned the tables and designed a matrix of combined CPU and IGP overclocks to run through our graphics suite. For this we again take the i7-6700K from 4.2 GHz to 4.8 GHz, but also adjust the integrated graphics from 'Auto' to 1250, 1300, 1350 and 1400 MHz, using the automatic overclock options found on the ASRock Z170 Extreme7+.
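
As a quick illustration of the scale of this test matrix, the combinations enumerate as below. This is a minimal sketch in Python: the 100 MHz CPU steps between the 4.2 and 4.8 GHz endpoints are an assumption on our part, and the loop body merely stands in for an actual benchmark run.

```python
# Minimal sketch of the CPU x IGP overclock matrix (illustrative only).
from itertools import product

cpu_freqs_ghz = [4.2, 4.3, 4.4, 4.5, 4.6, 4.7, 4.8]  # assumed 100 MHz steps
igp_settings = ["Auto", 1250, 1300, 1350, 1400]       # ASRock presets, MHz

for cpu, igp in product(cpu_freqs_ghz, igp_settings):
    label = igp if igp == "Auto" else f"{igp} MHz"
    print(f"Run gaming suite at CPU {cpu:.1f} GHz, IGP {label}")
```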

Technically, Auto should default to 1150 MHz in line with the maximum speed Intel has published, but the results on the previous page show that this is more of a see-saw operation when it comes to the power distribution of the processor. With any luck, explicitly setting the integrated graphics frequency should maintain that frequency throughout the benchmarks. With the CPU overclocks applied as well, we can see how performance scales with added CPU frequency.

Results for these benchmarks are provided in matrix form, both as absolute numbers and as a percentage relative to the 4.2 GHz CPU + Auto IGP reference value. Test settings are the same as for the previous set of IGP data.
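
For clarity, the percentage matrix is simply each cell expressed relative to that reference cell. A minimal sketch of the arithmetic, using placeholder FPS values rather than our actual results:

```python
# Minimal sketch: converting absolute FPS cells into percentage deviation
# from the 4.2 GHz CPU + Auto IGP reference. Values are placeholders.
fps = {
    (4.2, "Auto"): 30.0,   # reference cell (hypothetical number)
    (4.2, 1250):   33.1,
    (4.8, 1400):   36.9,
}
ref = fps[(4.2, "Auto")]
for cell, value in fps.items():
    pct = 100.0 * (value - ref) / ref
    print(f"{cell}: {pct:+.1f}% vs 4.2 GHz / Auto")
```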

Absolute numbers

Percentage Deviation from 4.2 GHz / Auto

Conclusions on Overclocking the IGP

It becomes pretty clear that by fixing the frequency of the integrated graphics, overclocking the processor no longer has a detrimental effect for the most part, or at least any reduction in performance falls within standard error. In three of the games, fixing the integrated graphics at 1250 MHz nets a ~10% boost in performance, and for titles like Attila this extends to 23% at 1400 MHz. By contrast, GTA V shows only a small gain, indicating that we are perhaps limited in other ways.

Comments

  • kmmatney - Friday, August 28, 2015

    The G3258 is fun to overclock. The overclock on my Devil's Canyon i5 made a difference on my server, which runs 3 Minecraft servers at the same time. I needed the overclock to make up for the lousy optimization of Java on the WHS 2011 OS.
  • StrangerGuy - Saturday, August 29, 2015

    Yeah, spend $340 on a 6700K, $200 on a mobo, $100 on a cooler for a measly 15% CPU OC, all for next to zero real-world benefit in single-GPU gaming loads compared to a $250 locked i5 / budget mobo.

    Who cares how easily you can perform the OC when the value for money is rubbish.
  • hasseb64 - Saturday, August 29, 2015

    You nailed it stranger!
  • Beaver M. - Saturday, August 29, 2015

    You should have known beforehand that your CPU is already so fast at stock that you won't see any difference in most games from overclocking.
  • Deders - Saturday, August 29, 2015

    If you intend to keep the hardware for a long period of time it can help. My previous i5-750's 3800MHz overclock made it viable as a gaming processor for the 5 or so years I was using it.

    For example, it allowed me to play Arkham City with full PhysX on a single 560 Ti; at stock speeds it wasn't playable with those settings. The most recent Batman game was no problem for it even though many people were having issues, and the same goes for Watch Dogs.
  • qasdfdsaq - Wednesday, September 2, 2015

    Sure, and my 50% overclock on my i7 920 made it viable for my main gaming PC for a few years longer than it otherwise would have been, but a 10-15% overclock? With a <1% gaming performance increase? Meh.
  • Impulses - Saturday, August 29, 2015

    You're exaggerating the basic requirements tho, I'm sure some do that, but I've never paid $200 for a mobo or $100 for a cooler ($160/$65 tops)... And if you spent more on the i7 it damn well better have been for HT or you've no clue what you're doing...
  • Xenonite - Saturday, August 29, 2015

    @V900: "Today, processors have gotten so fast, that even the cheap 200$ CPUs are "fast enough" for most tasks."

    For almost all non-gaming tasks (except realtime frame interpolation) this is most certainly true. The thing is, CPUs are NOT even nearly fast enough to game at 140+ fps with the 99% frame latency at a suitable <8 ms value.

    I realize that no one cares about >30fps gaming and that most people even condemn it as "looking too real" (in the same sentence as "your eyes can't even see a difference anyway"), so games aren't being optimised for low 99% frame latencies, and neither are modern CPUs.

    But for the few of us who are sensitive to 1ms frametime variances in sub-10ms average frame latency streams, overclocking to high clock speeds is the only way to approach relatively smooth frame delivery.
    On the same note, I would really have loved to see an FCAT or at least a FRAPS comparison of 99% frametimes between the different overclocked states, with a suitably overclocked GTX 980ti and some high-speed DDR4 memory along with the in-game settings being dialed back a bit (at least for the 1080p test).
  • EMM81 - Saturday, August 29, 2015

    "CPUs are NOT even nearly fast enough to game at 140+ fps" "But for the few of us who are sensitive to 1ms frametime variances in sub-10ms average frame latency streams"

    1) People may care about 30-60+fps but where do you get 140 from???
    2) You are not sensitive to 1 ms frametime variations... going from 33.3 ms to 16.7 ms (30-60 fps) makes only a very subtle difference to most people, and that is a 16.6 ms (0.5x) delta. There is zero possible way you can perceive anywhere near that level of difference. Even if we were talking about running at 60 fps with a variation of 2 ms, and you could somehow stare at a side-by-side comparison until you maybe were able to pick out the difference, why does it matter??? You care more about what the numbers say and how cool you think it is...
  • Communism - Saturday, August 29, 2015

    Take your pseudo-intellectualism elsewhere.
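
The 99th-percentile frametime debated in the comments above comes straight from a per-frame log of the kind FRAPS or FCAT exports. A minimal sketch of the calculation, with a made-up sample list standing in for a real capture:

```python
# Minimal sketch: average and 99th-percentile frametime from a per-frame
# log in milliseconds. The sample list is invented; a FRAPS/FCAT capture
# file would supply the real values.
frametimes_ms = [6.9, 7.1, 7.0, 7.3, 6.8, 7.2, 15.4, 7.0, 7.1, 6.9]

ordered = sorted(frametimes_ms)
idx = max(0, int(round(0.99 * len(ordered))) - 1)  # nearest-rank percentile
p99 = ordered[idx]
avg = sum(frametimes_ms) / len(frametimes_ms)
print(f"average: {avg:.2f} ms, 99th percentile: {p99:.1f} ms")
```

Note how the single 15.4 ms spike barely moves the average but dominates the 99th-percentile figure, which is why it is treated as the better smoothness metric in the discussion above.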
