A Great Alternative to Regular Ryzen

I’m a big fan of a cheap, efficient PC that performs well and just works. When building these systems, fewer parts is often better, as it means less can go wrong, but there is also a desire to ensure the system will last and remain fast during its tenure.

Historically, a processor with integrated graphics fit the bill. They used to be a dime a dozen (or up to $150), and when paired with a good small motherboard, a couple of memory modules, an SSD, and a good stock cooler, someone like my father can browse the web and do office work on his 32-inch display and join the weekly family Zoom call without having to wait for the system to respond.

What AMD has here with the new Ryzen 5000G desktop APUs is exactly that sort of system. Up to eight Zen 3 cores at around 4.0 GHz should cater to almost* everyone’s compute needs. The new 5000G APUs are a really nice generational improvement in raw compute performance over 4000G, but because 4000G never really made it to retail, the practical comparison is 3000G, and the new hardware wipes the floor with it. The only downside is that AMD didn’t release the cheapest offering.

Only the Ryzen 7 5700G ($359) and Ryzen 5 5600G ($259) are coming to market. Both of these processors offer around the same performance as their desktop counterparts (the 5800X and 5600X), so it probably won’t be much of a surprise to see these parts in our CPU guides going forward where we would normally recommend the X processors. The frustration is that the Ryzen 3 5300G, at something like $159, isn’t coming to market at all.

The Ryzen 3 5300G has been a fun processor to test. In every test it surpasses the previous retail APU flagship (the R5 3400G), and even if we compare it to the OEM flagship the R7 4750G, in a few tests it beats it there as well, both in regular performance and in gaming.

So why isn’t AMD selling the Ryzen 3 5300G at retail? Perhaps because it doesn’t need to. Demand for AMD’s regular processor lineup has been strong, and only recently have we seen processors like the 5600X and 5800X come back down to MSRP and stay in stock regularly. AMD (probably) makes more profit on those processors, so it would rather sell those. By holding the 5300G back, it drives users up to the mid- and high-tier 5000G parts, increasing the average selling price and revenue. And in a market where seemingly every piece of silicon finds a buyer, it’s a clever productization tactic. A 5300G at a $159 price point would have a special place in many builds. Until that time, users will have to make do with the 5600G.

The Ryzen 7 5700G and Ryzen 5 5600G go on sale tomorrow, August 5th, 2021.

Do APUs Make Sense For Gaming Yet?

Ever since graphics hardware was first attached to CPU cores, we’ve wondered: at what point does it become powerful enough to consume the entry-level market? Theoretically, year on year, the graphics capabilities of the silicon improve, and we get speed increases in the same way we see CPU core performance increases. However, it is not only the silicon that matters.

Games are also getting more complex. In the same way that we get more performance every year, the required specifications for modern games go up every year as well. Developers get ambitious, wanting to translate their artistic vision onto a system, and there are usually two main targets for those efforts: playable on consoles, and the best experience on a super-expensive PC. This creates a problem for the lowly integrated graphics solution, because it ends up specified well below the consoles.


A PS5 Processor with Integrated Graphics

Consoles have the added benefit of being a hyper-optimized and well-defined platform: the software stack is built around gaming performance and developers can cater to it. Because integrated graphics can come in many different shapes and sizes, we’re relying on some of those optimizations to filter down. Not only that, but new technologies such as AMD’s FidelityFX Super Resolution are aimed at delivering a better experience with less compute power. While game requirements are getting higher, at least the tricks to get better performance are also coming along.

So why not build a big integrated graphics solution for desktops, like a console? First is market optics – AMD is not only the main vendor in the console game but also the only vendor taking integrated graphics solutions seriously, so there’s no desire to cross-contaminate those market segments. Second is market size – a discrete GPU, even one at 75 W, doesn’t have to share a power budget with a CPU, whereas an integrated solution does, and how many users really want a joint power budget for a main gaming system? One could easily argue that APUs make sense for those on a budget, or for someone wanting a smaller system without a discrete card, and not a lot else.

On a budget, you could easily build a Ryzen 5 5600G gaming system with good recommended components for $621, providing almost the best integrated gaming experience available while matching or beating last generation’s flagship APU in day-to-day tasks. Moving to Zen 3 with a larger L3 cache has really unlocked more of the performance in these cores and in the graphics.

One of the big questions on the horizon is how AMD might use its 3D V-Cache technology in the future. The current implementation is a 64 MB die that sits on top of the cache in a regular CPU chiplet. That same chiplet won’t work on an APU, but AMD could very much design one in a similar fashion for its integrated graphics – perhaps adding another 32 MB of L3 cache. The question then becomes how much extra it would cost, and whether the trade-off is worth it. At a time when discrete graphics solutions are still crazy expensive, it is perhaps not as far-fetched as you might think. However, based on AMD’s disclosures, don’t expect a chip like this anytime soon.


135 Comments


  • abufrejoval - Thursday, August 5, 2021 - link

    There are indeed so many variables and at least as many shortages these days. And it's becoming a playground for speculators, who are just looking for such fragilities in the supply chain to extort money.

    I remember some Kaveri type chips being sold by AMD, which had the GPU parts chopped off by virtue of being "borderline dies" on a round 300mm wafer. Eventually they also had enough of these chips with the CPU (and SoC) portion intact, to sell them as a "GPU-less APU".

    Don't know if the general layout of the dies allows for such "halflings" on the left or right of a wafer...
  • mode_13h - Wednesday, August 4, 2021 - link

    Ian, please publish the source of 3DPM, preferably to github, gitlab, etc.
  • mode_13h - Wednesday, August 4, 2021 - link

    For me, the fact that 5600X always beats 5600G is proof that the non-APUs' lack of an on-die memory controller is no real deficiency (nor is the fact that the I/O die is fabbed on an older process node).
  • GeoffreyA - Thursday, August 5, 2021 - link

    The 5600X's bigger cache and boost could be helping it in that regard. But, yes, I don't think the on-die memory controller makes that much of a difference compared to the on-package one.
  • mode_13h - Friday, August 6, 2021 - link

    I wrote that knowing about the cache difference, but it's not going to help in all cases. If the on-die memory controller were a real benefit over having it on the I/O die, I'd expect to see at least a couple benchmarks where the 5600G outperformed the 5600X. However, they didn't switch places, even once!

    I know the 5600X has a higher boost clock, but they're both 65W and the G has a higher base frequency. So, even on well-threaded, non-graphical benchmarks, it's quite telling that the G can never pass the X.
  • GeoffreyA - Friday, August 6, 2021 - link

    Remember how the Core 2 Duo left the Athlon 64 dead on the floor? And that was without an on-die MC.
  • mode_13h - Saturday, August 7, 2021 - link

    That's not relevant, since there were incredible differences in their uArch and fab nodes.

    In this case, we get to see Zen 3 cores on the same manufacturing process. So, it should be a very well-controlled comparison. Still not perfect, but about as close as we're going to get.

    Also, the memory controller is in-package, in both cases. The main difference of concern is whether or not it's integrated into the 7 nm compute die.
  • GeoffreyA - Saturday, August 7, 2021 - link

    In agreement with what you are saying, even in my first comment. I think Cezanne shows that having the memory controller on the package gets the critical gains (vs. the old northbridge), and going onto the main die doesn't add much more.

    As for K8 and Conroe, I always felt it was notable in that C2D was able to do such damage, even without an IMC. Back when K8 was the top dog, the tech press used to make a big deal about its IMC, as if there were no other improvements besides that.
  • mode_13h - Sunday, August 8, 2021 - link

    One bad thing about moving it on-die is that this gave Intel an excuse to tie ECC memory support to the CPU, rather than just the motherboard. I had a regular Pentium 4 with ECC memory, and all it required was getting a motherboard that supported it.

    As I recall, the main reason Intel lagged in moving it on-die is that they were still flirting with RAMBUS, which eventually went pretty much nowhere. At work, we built one dual-CPU machine that required RAMBUS memory, but that was about the only time I touched the stuff.

    As for the benefits of moving it on-die, it was seen as one of the reasons Opteron was able to pull ahead of Pentium 4. Then, when Nehalem eventually did it, it was seen as one of the reasons for its dominance over Core 2.
  • GeoffreyA - Sunday, August 8, 2021 - link

    Intel has a fondness for technologies that go nowhere. RAMBUS was supposed to unlock the true power of the Pentium 4, whatever that meant. Well, the Willamette I used for a decade had plain SDRAM, not even DDR. But that was a downgrade, after my Athlon 64 3000+ gave up the ghost (cheapline PSU). That was DDR400. Incidentally, when the problems began, they were RAM related. Oh, those beeps!
