Gaming: Integrated Graphics

Despite being the ultimate joke at any bring-your-own-computer event, gaming on integrated graphics can be as rewarding as gaming on the latest mega-rig that costs the same as a car. The desire for strong integrated graphics in various shapes and sizes has waxed and waned over the years, with Intel relying on its latest ‘Gen’ graphics architecture while AMD happily puts its Vega architecture into the market to swallow up all the low-end graphics card sales. With Intel poised to make an attack on graphics in the next few years, it will be interesting to see how the graphics market develops, especially integrated graphics.

For our integrated graphics testing, we take our ‘IGP’ category settings for each game and run its benchmark on a loop for five minutes apiece, collecting as much data as we can from our automated setup.
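
For those curious what that kind of automation looks like in practice, here is a minimal sketch of such a loop in Python. It is not our actual harness: the benchmark executables, the `igp_720p` preset, the `frametimes.csv` log, and the helper names are all placeholders for illustration.

```python
import csv
import statistics
import subprocess
import time

# Hypothetical IGP test list: (game name, benchmark command). The executables,
# flags, and log file below are placeholders, not the tools we actually script.
GAMES = [
    ("World of Tanks", ["encore_bench.exe", "--preset", "igp_720p"]),
    ("Final Fantasy XV", ["ffxv_bench.exe", "--preset", "igp_720p"]),
]

RUN_SECONDS = 5 * 60  # loop each benchmark for roughly five minutes


def average_fps(frametime_log):
    """Average FPS from a one-column CSV of frame times in milliseconds."""
    with open(frametime_log, newline="") as f:
        frame_ms = [float(row[0]) for row in csv.reader(f) if row]
    return 1000.0 / statistics.mean(frame_ms)


def run_igp_suite():
    results = {}
    for name, cmd in GAMES:
        passes = []
        start = time.monotonic()
        while time.monotonic() - start < RUN_SECONDS:
            subprocess.run(cmd, check=True)           # one benchmark pass
            passes.append(average_fps("frametimes.csv"))
        results[name] = statistics.mean(passes)       # mean of all passes
    return results
```

Averaging across repeated passes rather than taking a single run helps smooth out run-to-run variance, which is the point of the five-minute window.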

[Chart: IGP - World of Tanks, Average FPS]
[Chart: IGP - Final Fantasy XV, Average FPS]

Finally, looking at integrated graphics performance, I don’t believe anyone should be surprised here. Intel has not meaningfully changed their iGPU since Kaby Lake – the microarchitecture is the same and the peak GPU frequency has risen by all of 50MHz to 1200MHz – so Intel’s iGPU results have essentially been stagnant for the last couple of years at the top desktop segment.

To that end I don’t think there’s much new to say. Intel’s GT2 iGPU struggles even at 720p in some of these games; it’s not an incapable iGPU, but there’s sometimes a large gulf between it and what these games (which are multi-platform console ports) expect for minimum GPU performance. The end result is that if you’re serious about iGPU performance in your desktop CPU, then AMD’s APUs provide much better performance. That said, if you are forced to game on the 9900K’s iGPU, then at least the staples of the eSports world such as World of Tanks will run quite well.

274 Comments

  • 0ldman79 - Friday, October 19, 2018 - link

    There are certainly occasions where more cores are better than clock speed.

    Just look at certain mining apps. You can drop the power usage by half and only lose a little processing speed, but drop them to 2 cores at full power instead of 4 and it is a *huge* drop.

    Been playing with the CPU max speed in Windows power management on my various laptops. The Skylake i5 6300HQ can go down to some seriously low power levels if you play with it a bit.

    The recent Windows updates have lost a lot of the Intel Dynamic Thermal control though. That's a shame.
  • Makaveli - Friday, October 19, 2018 - link

    Power consumption rules on mobile parts, so why would they release an 8-core model?
  • notashill - Friday, October 19, 2018 - link

    Because you get more performance at the same power level by using more cores at lower clocks (a rough numbers sketch of this trade-off follows at the end of the comments). The additional cores are power gated when not in use.
  • evernessince - Saturday, October 20, 2018 - link

    Not judging by the power consumption and heat output displayed here.
  • mkaibear - Friday, October 19, 2018 - link

    9700K is definitely the way to go on the non-HEDT. 9900K is technically impressive but the heat? Gosh.

    It's definitely made me consider waiting for the 9800X though - if the 7820X full load power is 145W ("TDP" 140W) at 3.6/4.3, then the 9800X isn't likely to be too much higher than that at 3.8/4.5.

    Hrm.
  • Cooe - Friday, October 19, 2018 - link

    "9700K is definitely the way to go on the non-HEDT."

    I think you meant to say "Ryzen 5 2600 unless your GPU's so fast it'll leave you HEAVILY CPU-bound in gaming" but spelt it wrong ;). The 9700K is a very good CPU, no doubt, but to call it the undisputed mainstream champ at its currently mediocre bang/$ value (so important for the mainstream market) doesn't make any sense, or accurately represent what people in the mainstream are ACTUALLY buying (lots of Ryzen 5 2600s & i5-8400s; both with a MUCH saner claim to the "best overall mainstream CPU" title).
  • mkaibear - Saturday, October 20, 2018 - link

    No, I meant to say "9700K is definitely the way to go on the non-HEDT".

    Don't put words in people's mouths. I don't just game. The video encoding tests in particular are telling - I can get almost a third better performance with the 9700K than with the R5 2600X.

    >"best overall mainstream CPU" title

    Please don't straw man either. Nowhere did I say that it was the best overall mainstream CPU (that's the R7 2700X in my opinion), but for my particular use case the 9700K or the 9800X are better suited at present.
  • koaschten - Friday, October 19, 2018 - link

    Uhm yeah... so where are the 9900k overclocking results the article claims are currently being uploaded? :)
  • watzupken - Friday, October 19, 2018 - link

    The i9 processor is expected to be quite impressive in performance. However, this review also reveals that Intel is struggling to pull more tricks out of their current 14nm process and Skylake architecture. The lack of IPC improvement over the last few generations is forcing them to keep pushing clock speed to cling to their edge. Considering that they are launching the new series this late in the year, they are at risk of AMD springing a surprise with their 7nm Zen 2 slated to launch next year.
  • SquarePeg - Friday, October 19, 2018 - link

    If the rumored 13% IPC gain and minimum 500MHz uplift are for real with Zen 2, then AMD would take the performance crown. I'm not expecting very high clocks from Intel's relaxed 10nm process, so it remains to be seen what kind of IPC gain they can pull with Ice Lake. It wouldn't surprise me if they had a mild performance regression because of how long they had to optimize 14nm for clock speed. Either way I'm all in on a new Ryzen 3 build next year.
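
As a footnote to the cores-versus-clocks discussion in the thread above: the intuition follows from the usual first-order dynamic-power approximation, P ≈ cores × C × V² × f, with voltage itself falling as clocks drop. The sketch below just plugs illustrative numbers into that model; the coefficients are made up for the example and are not measurements from the 9900K or any other part.

```python
# First-order dynamic power model: P ≈ cores * C_eff * V^2 * f, with V assumed
# to scale roughly linearly with frequency across the DVFS range. All constants
# here are illustrative, not measured values for any real CPU.

def power_watts(cores, freq_ghz, c_eff=2.0, v_per_ghz=0.28):
    voltage = v_per_ghz * freq_ghz          # crude linear V-f relationship
    return cores * c_eff * voltage ** 2 * freq_ghz

def throughput(cores, freq_ghz):
    return cores * freq_ghz                 # ideal scaling for a parallel workload

# 2 cores at full clock vs 4 cores at a reduced clock:
two_fast = (throughput(2, 4.0), power_watts(2, 4.0))
four_slow = (throughput(4, 3.0), power_watts(4, 3.0))

print(f"2 cores @ 4.0 GHz: perf {two_fast[0]:.1f}, power {two_fast[1]:.1f} W")
print(f"4 cores @ 3.0 GHz: perf {four_slow[0]:.1f}, power {four_slow[1]:.1f} W")
```

With these made-up coefficients, four cores at 3.0 GHz deliver about 50% more throughput for slightly less power than two cores at 4.0 GHz, which is the same trade-off the mining example above describes, though real workloads rarely scale perfectly across cores.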
