Overclocking the Athlon 200GE

In recent weeks, motherboard manufacturers have been releasing BIOS firmware that enables overclocking on the Athlon 200GE. It appears that this has come through an oversight in one of the base AMD firmware revisions that motherboard vendors are now incorporating into their firmware bundles. This is obviously not what AMD intended; the Athlon is the solitary consumer desktop chip on AMD's AM4 platform that is not overclockable. Since MSI first started going public with new firmware revisions, others have followed suit, including ASRock and GIGABYTE. There is no word on whether this change will be permanent: AMD might patch it out in future revisions it sends to the motherboard vendors, or those vendors might continue to work around it. As it stands, however, a good number of motherboards can now offer this functionality.

The question does arise as to whether there is even a point to overclocking these chips. They are very cheap, they usually go into cheap motherboards that might not even allow overclocking, and they are usually paired with cheaper coolers. The extra money spent on an overclocking-capable motherboard, or even $20 on a better cooler, might as well be put toward upgrading the CPU to a Ryzen 3 2200G: it has four cores and better integrated graphics, comes with a better stock cooler, stomps all over Intel's Pentium line, and is also overclockable without special firmware. The standard response to 'why overclock' is 'because we can', which, if you've lived in that part of the industry, is more than enough justification.

Given that our resident motherboard editor, Gavin, has been on a crusade through 2018 looking at the scaling performance of the AMD APUs, I asked if he could do a few overclocking tests for us.

Overclocking the 200GE

Enabling our MSI motherboard with the latest overclocking BIOS was no different to any other BIOS flash, and with it, the multiplier options opened up for the chip. Even though AMD's chips normally support quarter-multiplier steps, we could only push this processor in full-multiplier jumps of 100 MHz; with a little bit of extra voltage and our usual overclocking methodology, we managed to hit 3.9 GHz without any trouble.
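
For anyone unfamiliar with how the final frequency is actually set, the sketch below is a hypothetical illustration of the multiplier arithmetic, assuming the AM4 platform's 100 MHz reference clock and the 200GE's 32x (3.2 GHz) stock multiplier; it is not code from the BIOS, just the numbers behind those 100 MHz jumps.

    # Minimal sketch of multiplier-based overclocking arithmetic (illustrative,
    # assuming a 100 MHz reference clock and a 32x stock multiplier).
    BCLK_MHZ = 100  # reference clock; locked on this chip

    def core_frequency_mhz(multiplier: float) -> float:
        """Core frequency is simply the reference clock times the multiplier."""
        return BCLK_MHZ * multiplier

    # The silicon nominally supports 0.25x steps, but the unlocked firmware only
    # exposed whole multipliers here, i.e. 100 MHz jumps from stock up to 3.9 GHz:
    for multiplier in range(32, 40):
        print(f"{multiplier}.00x -> {core_frequency_mhz(multiplier):.0f} MHz")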

To be fair, we are using a good cooler here, but in truth the thermals were not much of a problem. Our practical limit ended up being the voltage/frequency response of the chip, and our 3.9 GHz matches what other people have seen. The base clock is locked, so there is little room for fine adjustments on that front.

At each stage of the overclock, we ran our Blender test. Performance scaled almost linearly, giving a 20% throughput increase from the stock frequency to the best frequency.
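
As a rough sanity check on that scaling claim, the snippet below is our own back-of-the-envelope arithmetic (assuming the 200GE's 3.2 GHz stock clock) comparing the raw frequency uplift to the Blender throughput gain quoted above.

    # Back-of-the-envelope scaling check (illustrative, assuming a 3.2 GHz stock clock).
    stock_ghz, overclock_ghz = 3.2, 3.9

    frequency_uplift = overclock_ghz / stock_ghz - 1   # ~0.22, i.e. ~22% more clock
    blender_gain = 0.20                                # throughput gain reported above

    # Perfect frequency scaling would mean throughput rises exactly with clock speed;
    # the ratio below shows how close the Blender result gets to that ideal.
    scaling_efficiency = blender_gain / frequency_uplift
    print(f"Frequency uplift:   {frequency_uplift:.1%}")   # ~21.9%
    print(f"Scaling efficiency: {scaling_efficiency:.0%}") # ~91%

That near-unity ratio is what 'almost linear' scaling looks like in practice for a CPU-bound render workload like Blender.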

Thoughts

A 21 percent performance increase across the range of benchmarks would put the 200GE either on par with Intel on most tests or even further ahead on the tests it already wins. This now changes our conclusion somewhat, as explained on the next page.

If you want to see a full suite test at the overclocked speed, leave a comment below and we'll set something up in January. 

95 Comments

  • kkilobyte - Monday, January 14, 2019 - link

    s/i3/Pentium. Obviously :)
  • freedom4556 - Monday, January 14, 2019 - link

    I think you messed up your charts for Civ 6's IGP testing. That or why are you testing the IGP at 1080p Ultra when all the other IGP tests are at 720p Low?
  • freedom4556 - Monday, January 14, 2019 - link

    Also, the 8k and 16k tests are pointless wastes of time. Especially in this review, but also in the others. Your low/med/high/ultra should be 720p/1080p/1440p/4k if you want to actually represent the displays people are purchasing.
  • nevcairiel - Monday, January 14, 2019 - link

    The Civ6 tests are like that because that's when it really starts to scale like the other games. Look at its IGP vs Low, which is 1080p vs 4K. The values are almost identical (and still pretty solid). Only if you move to 8K and then 16K do you see the usual performance degradation you would see with other games.
  • AnnoyedGrunt - Tuesday, January 15, 2019 - link

    I second this motion. Please have settings to cover the various common monitor choices. 1080P is an obvious choice, but 1440P should be there too, along with 4K. I don't think you need to run two 4K versions, or two 1080P versions, or whatever. I have a 1440P monitor so it would be nice to see where I become GPU limited as opposed to CPU limited. Maybe Civ6 could use some extra high resolutions in the name of science, but to be useful, you should at least include the 1440P on all games.

    Thanks.

    -AG
  • eddieobscurant - Monday, January 14, 2019 - link

    Another pro intel article from Ian, who hopes that someday intel will hire him
  • PeachNCream - Monday, January 14, 2019 - link

    The numbers in the chart speak for themselves. You don't have to acknowledge the conclusion text. It's only a recommendation anyway. Even though I'd personally purchase a 200GE if I were in the market, I don't think there is any sort of individual bias coming into play. Where the 200GE is relevant, gaming on the IGP, Ian recommended it. In other cases the G5400 did come out ahead by enough of a margin to make it worth consideration. The only flaw I could tease out of this is the fact that the recommendation is based on MSRP and as others have noted, the G5400 is significantly above MSRP right now. It may have been good to acknowledge that in the intro and conclusion in a stronger manner, but that means the article may not stand up as well to the test of time for someone browsing this content six months later after searching for advice on the relevant CPUs via Google.
  • kkilobyte - Monday, January 14, 2019 - link

    Acknowledge "in a stronger manner"? Well, it is actually not acknowledged in the conclusion at all!

    The title of the article is: "The $60 CPU question". One of those CPUs is clearly not being sold at $60 on average, but is priced significantly higher. I think the article should have compared CPUs that are really available at (around) $60.

    So maybe there is no personal bias - but there is clearly ignorance of the market state. And that's surprising, since the G5400's price has been above its MSRP for several months already; how could a professional journalist in the field ignore that?

    I guess it could be objected that "MSRP always was used in the past as the reference price". Granted - but it made sense while the MSRP was close to the real market price. It doesn't anymore once the gap gets big, which is the case for the G5400. Nobody gives a damn about the theoretical price if it is applied nowhere on the market.

    And the 'numbers in the chart' don't 'speak for themselves' - they are basically comparing CPUs whose retail prices, depending on where you get them, show a 20-40% gap. What's the point? Why isn't there a price/performance graph, as there was in past reviews? The graphs could just as well include high-end CPUs, and would be just as useless.

    If I want to invest ~$60 in a CPU, I'm not interested in how a ~$90 one performs!
  • sonny73n - Tuesday, January 15, 2019 - link

    +1

    I couldn’t have said it better myself.
  • cheshirster - Wednesday, January 23, 2019 - link

    Yes, the 5400 is priced nowhere near $60 and the reviewer definitely knows it, but fails to mention this in the conclusion.
