Conclusion: Raising the Bar for Integrated Graphics

The march of integrated graphics has come in rapid spurts: the initial goal of providing enough performance for general office work has bifurcated into something that also aims to give a good gaming experience. Despite AMD and NVIDIA being the traditional gaming graphics companies, competing in this low-end space requires both x86 CPUs and compatible graphics IP, meaning AMD and Intel. After going toe-to-toe for a number of years, with Intel dedicating over half of its silicon area to graphics at various points, the battle has become one-sided: Intel in the end only produced its higher-performance solutions for specific customers willing to pay for them, while AMD marched up the performance ladder by offering a lower-cost alternative to discrete graphics cards that served little purpose beyond driving a monitor. This has come to a head with a clear winner: AMD's graphics is the choice for an integrated solution, so much so that Intel is buying a custom version of AMD's Vega silicon for its own mid-range integrated graphics. For AMD, that is a win. Now, with the new Ryzen APUs, AMD has raised that low-end bar again.

If there was any doubt that AMD holds the integrated graphics crown, comparing the new Ryzen APUs against Intel's latest graphics solutions settles it. In almost all of the 1080p benchmarks, the Ryzen APUs are 2-3x better in every metric. We can conclude that Intel has effectively ceded this integrated graphics space to AMD, deciding to focus on its encode/decode engines rather than raw gaming and 3D performance. With DDR4-2933 as the supported memory frequency on the APUs, and assuming memory can be found at a reasonable price, the gaming performance at this price point is impressive.

When we compare the Ryzen 5 2400G against any CPU paired with an NVIDIA GT 1030, both solutions are within a few percent of each other in all of our 1080p benchmarks. The GT 1030 is a $90 graphics card, which, when paired with a CPU, leaves two options: either match the combined price of the Ryzen 5 2400G, which leaves $80 for a CPU and buys a Pentium that loses to AMD in anything multi-threaded; or increase the cost of the system to get a CPU of equivalent performance. Except for chipset IO, the Intel + GT 1030 route offers no benefits over the AMD solution: it costs more in a budget-constrained market, and it draws more power overall. There is also the fact that the AMD APUs ship with a Wraith Stealth 65W cooler, adding value to the package that Intel does not seem to want to match.
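The price-matching arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming the review's launch prices ($169 for the Ryzen 5 2400G, $90 for the GT 1030); the helper function name is ours, and all other build components are assumed identical between the two configurations.

```python
# Hypothetical build-cost comparison using the prices cited in the review.
RYZEN_5_2400G = 169   # APU with Vega 11 graphics and Wraith Stealth cooler
GT_1030 = 90          # discrete card required by the Intel route

def cpu_budget_to_match(apu_price: int, gpu_price: int) -> int:
    """CPU money left over if the dGPU build must match the APU's price."""
    return apu_price - gpu_price

# Matching the 2400G's total leaves roughly $80 for the CPU:
print(cpu_budget_to_match(RYZEN_5_2400G, GT_1030))
```

Anything beyond that budget pushes the dGPU build above the APU's price, which is the second option described above.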

For the compute benchmarks, Intel is still the clear winner in single-threaded tests, with higher IPC and a higher turbo frequency. AMD may close some of that gap with 12nm Zen+ later this year, which should offer higher frequencies, but Zen 2 is the next real chance to bridge it. In the multi-threaded tests, AMD's 4C/8T and Intel's 6C/6T parts trade blows depending on how well a test scales across threads, but against Kaby Lake's 4C/4T and 2C/4T offerings, AMD comes out ahead.

With the Ryzen 5 2400G, AMD has completely shut down the sub-$100 graphics card market. As a choice for gamers on a budget, those building systems in the region of $500, it becomes the processor to pick.

For the Ryzen 3 2200G, we want to spend more time analyzing the effect of a $99 quad-core APU on the market, as well as looking at how memory speed affects performance, especially with integrated graphics. There is also the overclocking angle: with AMD showing a 20-25% frequency increase on the integrated graphics, we want to delve into potential bottlenecks in a future article.


177 Comments


  • Lolimaster - Monday, February 12, 2018 - link

    You don't need another model, just disable the high-clocked pstates until you get the power consumption you want.

    I can lock my Athlon II X4 to 800Mhz if I desire.
  • Lolimaster - Monday, February 12, 2018 - link

    You can simply set a pstate for a lower base clock and also undervolt if you want to reduce power consumption even more.

    Or the lazy way, cTDP in bios to 45w.
  • Manch - Tuesday, February 13, 2018 - link

    Ask and ye shall receive

    https://www.anandtech.com/show/12428/amd-readies-r...
  • Cryio - Monday, February 12, 2018 - link

    This review kind of confused me?

    It mentioned it's going to compare the A12 9800, but this APU is nowhere to be seen in benchmarks.
    Then out of nowhere comes the A10 7870K, which is fine I guess, but then there's the A10 8750, which doesn't exist. I can assume it's meant to be the 7850, yet a 7850 non-K APU doesn't exist either, so what's happening here?
  • Simon_Says - Monday, February 12, 2018 - link

    Will there be any analysis on current and potential future HTPC performance? While it won't support Netflix 4k or UHDBR (yet, thanks Playready 3.0) I for one would still like to know how it handles HDR for local media playback and Youtube, and if it will have the CPU grunt to software decode AV1.
  • Drazick - Monday, February 12, 2018 - link

    Does the Ryzen have any hardware-based unit for video transcoding?
    Could you test that as well (speed and quality)?

    It will be interesting as this CPU can be heaven for HTPC and for NAS with Multimedia capabilities.

    Thank You.
  • GreenReaper - Wednesday, February 14, 2018 - link

    It is meant to support up to 4K H.264/5 at 30/60/120FPS for 4K/1440p/1080p resolutions. Obviously it'd be nice to see people testing this out, and the quality of the resulting video.
  • gerz1219 - Monday, February 12, 2018 - link

    Still not quite getting the point of this product. Back when it made sense to build an HTPC, I liked the idea of the Bulldozer-era APU, so that I could play games on the TV without having a noisy gaming rig in the living room. But the performance is just never quite there, and it looks like it will be some time before you can spend ~$400 and get 4K gaming in the living room. So why not just buy an Xbox One X or PS4? I also bought a Shield TV recently for $200 and that streams games from my VR/4K rig just fine onto the TV. I'm just not seeing the need for a budget product that's struggling at 1080p and costs about the same as a 4K console.
  • jjj - Monday, February 12, 2018 - link

    There are 7+ billion people on this planet and the vast majority of them will never be able to afford a console or to pay a single cent for software - consoles are cheap because they screw you on the software side.
    Vs the global average you are swimming in money.
    And ofc the majority of the PC market is commercial as consumer has been declining hard this decade.
    Most humans can barely put food on the table, if that, and even a $200 TV is a huge investment they can afford once every 15 years.
  • Pinn - Monday, February 12, 2018 - link

    But $10 per day on cigarettes is fine?
