Conclusion: Raising the Bar for Integrated Graphics

The march of integrated graphics has come in rapid spurts: the initial goal of providing enough performance for general office work has bifurcated into one that also aims to give a good gaming experience. Despite AMD and NVIDIA being the traditional gaming graphics companies, competing in this low-end space requires both x86 CPUs and compatible graphics IP, meaning AMD and Intel. The two went toe-to-toe for a number of years, with Intel at various points dedicating over half of its silicon area to graphics, but the battle has become one-sided: Intel in the end only produced its higher-performance solutions for specific customers willing to pay for them, while AMD marched performance upward by offering a lower-cost alternative to discrete graphics cards that served little purpose beyond monitor output. This has come to a head with a clear winner: AMD's graphics is the choice for an integrated solution, so much so that Intel is buying a custom version of AMD's Vega silicon for its own mid-range integrated graphics. For AMD, that's a win. Now, with the new Ryzen APUs, AMD has raised that low-end bar again.

If there was any doubt that AMD holds the integrated graphics crown, comparing the new Ryzen APUs against Intel's latest graphics solutions shows a clear winner. In almost all of our 1080p benchmarks, the Ryzen APUs are 2-3x better in every metric. We can conclude that Intel has effectively ceded this integrated graphics space to AMD, deciding to focus on its encode/decode engines rather than raw gaming and 3D performance. With DDR4-2933 as the supported memory frequency on the APUs, and assuming memory can be found at a reasonable price, the gaming performance at this price point is genuinely impressive.

When we compare the Ryzen 5 2400G with any CPU paired with the NVIDIA GT 1030, both solutions are within a few percent of each other in all of our 1080p benchmarks. The NVIDIA GT 1030 is a $90 graphics card, which when paired with a CPU leaves two options: either match the combined price of the Ryzen 5 2400G, which leaves $80 for a CPU (a Pentium that loses to AMD in anything multi-threaded), or increase the cost of the system to get a CPU of equivalent performance. Except for chipset IO, the Intel + GT 1030 route offers no benefit over the AMD solution: it costs more, in a budget-constrained market, and draws more power overall. There's also the fact that the AMD APUs come with a Wraith Stealth 65W cooler, which adds value to the package that Intel doesn't seem to want to match.

For the compute benchmarks, Intel is still a clear winner in single-threaded tests, with a higher IPC and a higher turbo frequency. That is something AMD may start to catch up on with 12nm Zen+ later this year, which should offer higher frequencies, but Zen 2 will be the next real chance to bridge the gap. In the multi-threaded tests, AMD's 4C/8T and Intel's 6C/6T parts battle it out depending on whether a test can use multiple threads effectively, but against Kaby Lake 4C/4T or 2C/4T offerings, AMD comes out ahead.

With the Ryzen 5 2400G, AMD has completely shut down the sub-$100 graphics card market. As a choice for gamers on a budget, those building systems in the region of $500, it becomes the processor to pick.

For the Ryzen 3 2200G, we want to spend more time analyzing the effect of a $99 quad-core APU on the market, as well as looking at how memory speed affects performance, especially with integrated graphics. There's also the angle of overclocking: with AMD showing a 20-25% frequency increase on the integrated graphics, we want to delve into potential bottlenecks and how to unlock them in a future article.

177 Comments

  • Cooe - Monday, February 12, 2018 - link

    Here's an article with a bunch of graphs that include the i7-5775C if you'd prefer to peep this instead of that vid.
    https://hothardware.com/reviews/amd-raven-ridge-ry...
  • Cooe - Monday, February 12, 2018 - link

    Your i7-5775C isn't even as fast as an old Kaveri A10 w/ 512 GCN2 SPs (it's close, but no cigar), so vs Vega 8 & 11 it gets its ass absolutely handed to it... like by a lot - https://youtu.be/sCWOfwcYmHI
  • jrs77 - Monday, February 12, 2018 - link

    When I look at all the available benchmarks so far, there's nothing this chip can play that I can't already play with my 5775C. 1080p with medium settings is no problem for most games like Overwatch, Borderlands, WoW, Diablo, etc. So if the 2400G can't run them at high settings, as it looks like it can't, then I see no reason to call it the King of integrated graphics really.
  • Holliday75 - Monday, February 12, 2018 - link

    How on God's green Earth can you compare a $600+ CPU versus the 2400g? The whole point of an iGPU is to be cheap. The 2400g outperforms a CPU that costs over 3x as much in the exact area this chip was built for: low-end gaming.
  • jrs77 - Monday, February 12, 2018 - link

    $600 ?!? I paid €400 for my 5775C incl 24% VAT. So that would be $300 then.

    And again. I can play games in 1080p with low to medium settings just fine, so I don't see a reason to upgrade.
  • acidtech - Monday, February 12, 2018 - link

    Need to check your math. €400 = $491.
  • jrs77 - Tuesday, February 13, 2018 - link

    Back when I bought it, the Euro and the Dollar were almost 1:1, and to get the Dollar price you need to subtract the 24% VAT I pay over here, so yeah, back then it was around $300. Hell, the Intel list price was $328.
  • SaturnusDK - Wednesday, February 14, 2018 - link

    So what you're saying is that you paid twice the money for under half the graphics performance and 20% lower CPU performance than a 2400G.

    Graphics-wise the 5775C was pretty bad and got beaten by ALL AMD APUs at the time. It was close but it was never very good. Time has not been kind to it.
  • SSNSeawolf - Monday, February 12, 2018 - link

    I noticed with some sadness that there are no DOTA 2 benchmarks. Was this due to time constraints or unforeseen issues? I'm crossing my fingers that DOTA 2 hasn't been dropped for good, as it's a great benchmark for silicon such as this, though the other benchmarks do let us ballpark where it would land.
  • Ian Cutress - Monday, February 12, 2018 - link

    That's in our GPU reviews; different editors with different benchmark sets. We're looking at unifying the two.
