Gaming: Integrated Graphics

Despite being the butt of every joke at a bring-your-own-computer event, gaming on integrated graphics can be just as rewarding as playing on the latest mega-rig that costs as much as a car. The demand for strong integrated graphics in various shapes and sizes has waxed and waned over the years, with Intel relying on its latest ‘Gen’ graphics architecture while AMD happily puts its Vega architecture into the market to swallow up the low-end graphics card sales. With Intel poised to mount an attack on the graphics market in the next few years, it will be interesting to see how that market develops, especially integrated graphics.


[Image: An AMD APU base layout]

The two processors on test today take very different approaches to integrated graphics. The AMD Athlon 200GE uses the latest Vega architecture, designed for high performance, even if AMD only enables 192 streaming processors in this design. Intel, on the other hand, uses its older Gen 9 graphics architecture, originally built for mobile processors, in a baseline GT1 configuration when most Intel desktop processors get GT2.

AMD vs Intel at ~$60

                     AMD Athlon 200GE       Intel Pentium Gold G5400
Cores / Threads      2 / 4                  2 / 4
Microarchitecture    Zen                    Coffee Lake
Motherboards         X470, X370, B450,      Z390, Z370, Q370,
                     B350, A320, A300       H370, B360, H310
CPU Frequency        3.2 GHz                3.7 GHz
L2 Cache             512 KB/core            256 KB/core
L3 Cache             2 MB/core              2 MB/core
Integrated Graphics  Vega 3 (192 SPs)       UHD 610 (12 EUs, 96 ALUs)
DDR4 Support         DDR4-2666              DDR4-2400
GPU Frequency        Up to 1000 MHz         350-1050 MHz
TDP                  35 W                   54 W (2-core die version)
                                            58 W (4-core die version)*
Price                $55 (SRP)              $64 (1k/u)

* Intel harvests both 2+2 and 4+2 dies to make G5400 parts. It's impossible to know which one you have without removing the lid and measuring the die area.
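
The raw shader counts translate into a sizeable on-paper gap. As a rough sketch (assuming both GPUs sustain their listed peak clocks, and counting a fused multiply-add as two FLOPs, per the usual convention):

    # Peak FP32 throughput: shaders x 2 FLOPs per cycle (FMA) x clock in GHz = GFLOPS
    def peak_gflops(shaders, clock_ghz):
        return shaders * 2 * clock_ghz

    # Shader counts and peak clocks from the spec table above
    athlon_200ge = peak_gflops(192, 1.000)    # Vega 3: 192 SPs at up to 1000 MHz
    pentium_g5400 = peak_gflops(96, 1.050)    # UHD 610: 96 ALUs at up to 1050 MHz

    print(f"Athlon 200GE (Vega 3):   {athlon_200ge:.1f} GFLOPS")    # 384.0
    print(f"Pentium G5400 (UHD 610): {pentium_g5400:.1f} GFLOPS")   # 201.6

By that back-of-the-envelope measure, the 200GE starts with nearly double the raw shader throughput before memory bandwidth, drivers, or thermals enter the picture.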

Intel does have a small ray of hope here – caches matter when it comes to integrated graphics. While the 200GE has the bigger L2 cache (512 KB vs 256 KB per core) and faster main memory (DDR4-2666 vs DDR4-2400), AMD's L3 cache is a victim cache, whereas Intel's L3 cache is fully inclusive and works with the prefetchers to pull data in ahead of time. It’s a slim chance, but Intel should take what it can.
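
For readers unfamiliar with the distinction, here is a toy model of the two L3 fill policies, in Python. It is deliberately not cycle-accurate (it ignores back-invalidation, coherence probes, and prefetchers); it only illustrates when a line lands in the L3:

    from collections import OrderedDict

    class LRUCache:
        """Toy cache tracking line addresses only, with LRU replacement."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.lines = OrderedDict()

        def hit(self, addr):
            if addr in self.lines:
                self.lines.move_to_end(addr)      # refresh LRU position
                return True
            return False

        def insert(self, addr):
            self.lines[addr] = True
            self.lines.move_to_end(addr)
            if len(self.lines) > self.capacity:   # evict the least-recently-used line
                return self.lines.popitem(last=False)[0]
            return None

    def access(addr, l2, l3, l3_is_victim):
        """One memory access through a two-level hierarchy."""
        if l2.hit(addr):
            return "L2 hit"
        # (a real exclusive design would also remove the line from L3 on a hit;
        #  omitted here for brevity)
        served_from = "L3 hit" if l3.hit(addr) else "DRAM"
        castout = l2.insert(addr)                 # fill the line into L2
        if l3_is_victim:
            if castout is not None:
                l3.insert(castout)                # victim L3: filled only by L2 evictions
        else:
            l3.insert(addr)                       # inclusive L3: shadows every L2 fill
        return served_from

In the inclusive model, a freshly fetched line is guaranteed to be in the L3 immediately, so any other agent probing the L3 can find it; in the victim model, it only appears in the L3 after the L2 has cast it out.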

For our integrated graphics testing, we take our ‘IGP’ category settings for each game and loop the benchmark for five minutes apiece, taking as much data as we can from our automated setup.
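
Our harness is an in-house tool, but conceptually each title boils down to something like the following sketch. The benchmark command and the FPS log format here are hypothetical placeholders, not our actual setup:

    import re
    import subprocess
    import time

    # Hypothetical placeholders -- our real harness differs, but the shape is the same
    BENCHMARK_CMD = ["./game_benchmark", "--preset", "igp", "--log", "run.log"]
    RUN_SECONDS = 5 * 60                       # loop each title for five minutes

    def run_igp_pass():
        """Run one benchmark pass and return the FPS samples it logged."""
        subprocess.run(BENCHMARK_CMD, check=True)
        with open("run.log") as log:           # assume lines like "fps: 43.2"
            return [float(m) for m in re.findall(r"fps:\s*([\d.]+)", log.read())]

    def run_igp_benchmark():
        samples = []
        deadline = time.monotonic() + RUN_SECONDS
        while time.monotonic() < deadline:     # keep looping until the time is up
            samples += run_igp_pass()
        return sum(samples) / len(samples)     # average FPS across every loop

    if __name__ == "__main__":
        print(f"Average FPS: {run_igp_benchmark():.1f}")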

IGP: World of Tanks, Average FPS

IGP: Final Fantasy XV, Average FPS

IGP: Shadow of War, Average FPS

IGP: Civilization 6, Average FPS

IGP: Car Mechanic Simulator 2018, Average FPS

IGP: Ashes Classic, Average FPS

IGP: Grand Theft Auto V, Average FPS

IGP: Far Cry 5, Average FPS

IGP: F1 2018, Average FPS

That was a whitewash. AMD’s smallest win was 48%, in both Ashes and F1 2018, while its biggest wins came in Far Cry 5 at 122.2% and Civilization 6 at 112.1%.
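
For anyone checking the math, those win percentages are simple relative FPS deltas. A minimal example with hypothetical framerates (the real per-game numbers are in the charts above):

    def advantage_pct(amd_fps, intel_fps):
        """AMD's relative FPS advantage over Intel, in percent."""
        return (amd_fps / intel_fps - 1) * 100

    # Hypothetical FPS figures, purely to illustrate the arithmetic:
    # a 122.2% 'win' means AMD renders just over 2.2x the frames per second
    print(f"{advantage_pct(40.0, 18.0):.1f}%")   # -> 122.2%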

Comments

  • kkilobyte - Monday, January 14, 2019 - link

    s/i3/Pentium. Obviously :)
  • freedom4556 - Monday, January 14, 2019 - link

    I think you messed up your charts for Civ 6's IGP testing. That, or why are you testing the IGP at 1080p Ultra when all the other IGP tests are at 720p Low?
  • freedom4556 - Monday, January 14, 2019 - link

    Also, the 8k and 16k tests are pointless wastes of time. Especially in this review, but also in the others. Your low/med/high/ultra should be 720p/1080p/1440p/4k if you want to actually represent the displays people are purchasing.
  • nevcairiel - Monday, January 14, 2019 - link

    The Civ6 tests are like that because that's when it really starts to scale like the other games. Look at its IGP vs Low, which is 1080p vs 4K. The values are almost identical (and still pretty solid). Only if you move to 8K and then 16K do you see the usual performance degradation you would see with other games.
  • AnnoyedGrunt - Tuesday, January 15, 2019 - link

    I second this motion. Please have settings to cover the various common monitor choices. 1080P is an obvious choice, but 1440P should be there too, along with 4K. I don't think you need to run two 4K versions, or two 1080P versions, or whatever. I have a 1440P monitor so it would be nice to see where I become GPU limited as opposed to CPU limited. Maybe Civ6 could use some extra high resolutions in the name of science, but to be useful, you should at least include the 1440P on all games.

    Thanks.

    -AG
  • eddieobscurant - Monday, January 14, 2019 - link

    Another pro-Intel article from Ian, who hopes that someday Intel will hire him.
  • PeachNCream - Monday, January 14, 2019 - link

    The numbers in the chart speak for themselves. You don't have to acknowledge the conclusion text. It's only a recommendation anyway. Even though I'd personally purchase a 200GE if I were in the market, I don't think there is any sort of individual bias coming into play. Where the 200GE is relevant, gaming on the IGP, Ian recommended it. In other cases the G5400 did come out ahead by enough of a margin to make it worth consideration. The only flaw I could tease out of this is the fact that the recommendation is based on MSRP and, as others have noted, the G5400 is significantly above MSRP right now. It may have been good to acknowledge that in the intro and conclusion in a stronger manner, but doing so means the article may not stand up as well to the test of time for someone who finds this content six months later while searching for advice on the relevant CPUs via Google.
  • kkilobyte - Monday, January 14, 2019 - link

    Acknowledge "in a stronger manner"? Well, it is actually not acknowledged in the conclusion at all!

    The title of the article is: "The $60 CPU question". One of those CPUs is clearly not being sold at $60 on average, but is priced significantly higher. I think the article should have compared CPUs that are really available at (around) $60.

    So maybe there is no personal bias - but there is clearly ignorance of the market state. And that's surprising, since the G5400 price has been above its MSRP for several months already; how could a professional journalist in the field ignore that?

    I guess it could be objected that "MSRP was always used in the past as the reference price". Granted - but it made sense while the MSRP was close to the real market price. It doesn't anymore once the gap gets big, which is the case for the G5400. Nobody gives a damn about the theoretical price if it is applied nowhere on the market.

    And the 'numbers in the chart' don't 'speak for themselves' - they are basically comparing CPUs whose retail prices, depending on where you get them, show a 20-40% gap. What's the point? Why isn't there a price/performance graph, as there was in past reviews? The graphs could just as well include high-end CPUs, and would be just as useless.

    If I want to invest ~$60 in a CPU, I'm not interested in knowing how a ~$90 one performs!
  • sonny73n - Tuesday, January 15, 2019 - link

    +1

    I couldn’t have said it better myself.
  • cheshirster - Wednesday, January 23, 2019 - link

    Yes, the 5400 is priced nowhere near $60, and the reviewer definitely knows it, but fails to mention this in the conclusion.
