Ryzen 9 4900HS with DDR4-2666 and DDR4-3000

In our ASUS Zephyrus G14, we have a total of 16 GB of DDR4. This is split between a single 8 GB SO-DIMM module and 8 GB of memory soldered onto the board. ASUS will offer a 16 + 16 GB version, although this might come at a later date.

This memory is running at AMD’s recommended speed for these processors, DDR4-3200. Through our inspection tools, we can tell that it is running with subtimings of 22-22-22 and a command rate of 1T. The command rate is certainly good, however 22-22-22 is a little slower than what we typically see on a desktop system at this speed, because here we have a system that conforms to JEDEC’s subtiming requirements.
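
To put those numbers in perspective, the sketch below is a back-of-the-envelope Python calculation (not part of our test suite) that converts CAS latency cycles into absolute nanoseconds; the CL16 figure used for a typical desktop XMP kit is an assumption for comparison.

```python
# Convert CAS latency (in clock cycles) to absolute time in nanoseconds.
# A DDR4-XXXX rating is the data rate in MT/s; the memory clock runs at half
# that, so one clock period is 2000 / data_rate nanoseconds.

def cas_latency_ns(data_rate_mts: int, cas_cycles: int) -> float:
    """Absolute CAS latency in ns for a given DDR4 data rate and CL."""
    clock_period_ns = 2000 / data_rate_mts
    return cas_cycles * clock_period_ns

configs = {
    "DDR4-3200 CL22 (JEDEC, this laptop)": (3200, 22),
    "DDR4-3200 CL16 (typical desktop XMP kit, assumed)": (3200, 16),
    "DDR4-2666 CL20 (the 32 GB module tested below)": (2666, 20),
}

for name, (rate, cl) in configs.items():
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")

# DDR4-3200 CL22 works out to ~13.75 ns, against ~10 ns for a CL16 desktop
# kit, which is why JEDEC-compliant timings feel a little slower at the
# same data rate.
```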

For our memory testing we wanted to see what speeds and capacities we could achieve. Corsair very kindly sent us a 16 GB DDR4-3000 module and a 32 GB DDR4-2666 module, which would give our system a total of 24 GB or 40 GB respectively. For a machine designed for heavier-duty workloads, having more than 16 GB is certainly welcome, as long as the performance hit isn’t too severe.

I installed the 32 GB module, and the system booted first time with no fuss. A quick check confirmed that all of the capacity was detected, giving us a total of 40 GB. The speed was also as expected, at DDR4-2666, but with subtimings of 20-19-19 1T.

However, when we put in the 16 GB DDR4-3000 module, to get a total of 24 GB, the detected speed inside the system was only DDR4-2666. Looking at the module settings, this was because the DDR4-3000 speed is actually an XMP profile, and ASUS has not enabled the ability to set XMP profiles here.

We were able to get DDR4-2666 on the 32 GB module because that is the module’s base frequency and settings. The same goes for the 8 GB module that came with the system – it was flashed so that the base SPD setting was DDR4-3200. If users want high-capacity modules at the faster DRAM speeds on this system, they will have to reprogram the primary SPD profile of their modules, which isn’t an easy thing to do.
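
For the curious, the base speed the firmware falls back to lives in the module’s SPD EEPROM itself, not in the XMP extension. Below is a minimal, hedged Python sketch of how that field decodes, assuming the standard JEDEC DDR4 SPD layout in which byte 18 stores the minimum cycle time in 125 ps medium-timebase units; the values shown are illustrative rather than dumps from our modules.

```python
# Decode the base (JEDEC) data rate from a DDR4 SPD image.
# Assumption: standard DDR4 SPD layout, where byte 18 holds tCKmin
# (minimum SDRAM cycle time) in medium-timebase units of 125 ps.

MTB_PS = 125  # medium timebase, in picoseconds

def base_data_rate_mts(tck_min_mtb: int) -> float:
    """Return the base data rate in MT/s from the tCKmin SPD value."""
    tck_ns = tck_min_mtb * MTB_PS / 1000   # cycle time in nanoseconds
    return 2000 / tck_ns                   # DDR: two transfers per clock

# Illustrative values, not read from our modules:
print(base_data_rate_mts(5))  # 0.625 ns -> 3200 MT/s (like the stock 8 GB module)
print(base_data_rate_mts(6))  # 0.750 ns -> ~2667 MT/s (like the Corsair DIMMs)

# A DDR4-3000 rating exists only in the module's XMP block, so with XMP
# unavailable the firmware trains to the DDR4-2666 base profile instead.
```

(Real SPDs also carry a signed fine-offset byte for sub-125 ps precision, which we omit here for brevity.)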

As a result, our tests come down to comparing the 8 GB DDR4-3200 module that came with the system against the 32 GB DDR4-2666 module. Note that the latter is an 8 + 32 GB configuration, which is expected to run in dual channel for the first 16 GB and then single channel for the remaining 24 GB.
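
To put the channel arrangement in perspective, here is a minimal sketch of the theoretical peak bandwidth involved, assuming standard 64-bit DDR4 channels and that the asymmetric 8 + 32 GB configuration only interleaves the matching capacity across both channels.

```python
# Theoretical peak DDR4 bandwidth: data rate (MT/s) x 8 bytes per 64-bit channel.

BYTES_PER_TRANSFER = 8  # one 64-bit DDR4 channel moves 8 bytes per transfer

def peak_bandwidth_gbs(data_rate_mts: int, channels: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given data rate and channel count."""
    return data_rate_mts * BYTES_PER_TRANSFER * channels / 1000

# Stock 8 + 8 GB at DDR4-3200: dual channel across all 16 GB
print(peak_bandwidth_gbs(3200, 2))   # ~51.2 GB/s

# 8 + 32 GB at DDR4-2666: dual channel for the first 16 GB...
print(peak_bandwidth_gbs(2666, 2))   # ~42.7 GB/s
# ...and single channel for the remaining 24 GB
print(peak_bandwidth_gbs(2666, 1))   # ~21.3 GB/s
```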

Civilization 6 AI Test

Civilization 6 AI Test (Low Power)

With our AI test, there’s a ~20% benefit from having the faster memory, which decreases slightly when moved to a limited power budget.

Cinebench R20

We didn’t see any difference in a render test like Cinebench, which is far less sensitive to memory speed.

PCMark10 Overall Score

There was more of a difference in PCMark 10, however PCMark 10 isn't that great at showing where the bottlenecks are.

 

Integrated Graphics Tests

Civilization 6 (1080p Max, No MSAA)
Civilization 6 (1080p Max, 8x MSAA)

For the Civilization 6 graphics test, the difference in performance between the two memory settings is significant. A turn-based game like this cares less about frame rate, however going down to 22 FPS at 1080p Max with no MSAA means that the user probably has to dial the settings back a bit to get something more reasonable.

Borderlands 3 (1080p Medium)
Borderlands 3 (1080p Medium, Low Power)

Going from plugged in to not plugged in, we didn’t see much of a change with the slower memory; however, the DDR4-3200 configuration still gets a serious benefit over the DDR4-2666 arrangement.

Final Fantasy XV (1080p Standard)

For Final Fantasy, there is a significant change: moving up from DDR4-2666 to DDR4-3200 affords a +30% improvement.

Discrete Graphics Tests

Civilization 6 (1080p Max, No MSAA)
Civilization 6 (1080p Max, 8x MSAA)
Borderlands 3 (1080p Medium)
Final Fantasy XV (1080p Standard)

In each case, the faster DRAM actually improves discrete graphics performance.

Quick Thoughts

Overall, 16 GB of memory in a system like this isn't the best configuration: people who need the power are likely going to want 32 GB. However, users putting in their own fast module when buying the 16 GB version are going to have to be careful about the performance. Both the integrated graphics and the discrete graphics take a performance hit when going down from DDR4-3200 to DDR4-2666.

Comments

  • Zingam - Saturday, April 11, 2020 - link

    How is the triple monitor 4K support? Is a triple 4K monitor setup viable? Is it smooth? Does it lag? Does it overheat? Does it make the fans howl all the time even while idling? Does it have driver issues?
    Can I connect this laptop or any other modern laptop to two (or more if supported) 4K external monitors for a three monitor setup and type, edit text, compile code and never ever experience overheating, fan noise, lag and stuttering? Is this APU a good daily driver for work? These are the types of tests I am interested in!
    And I would also like to know how it compares to older CPUs, not just the current ones. I would like to know how it compares to Sandy Bridge, to Skylake, to the Kaby Lake i7-7700HQ, etc., with and without a discrete GPU (1050 Ti), to know if an upgrade is worth it.

    I don't care about battery life very much but I care about performance, heat and fan noise and how portable that setup is. I don't work in coffee shops but I need to carry my laptop from my office to my home and back on my back - so I care about it being light with a small power brick too.

    It is very rare that reviews provide that information - it is all about gaming and flashiness.
  • Zingam - Saturday, April 11, 2020 - link

    @Ian it would be great if you compared these new CPUs to older ones for real-world professional use, and even with other small form factor PCs like the NUCs and the Mac Minis.
  • Zingam - Saturday, April 11, 2020 - link

    Can it run a 2-3 hour compilation or static analysis without throttling, while watching YouTube, running an emulator and browsing the web? That is just as important as not throttling while gaming, or while running a game in the background and debugging it in the summer season. :)
  • Viilutaja - Saturday, April 11, 2020 - link

    I have not used any of my laptops' webcams, ever! And I have used work laptops (Lenovo ThinkPads) for some time now. Right now I have a special cover on my laptop webcam and have not opened it since installation. I have weekly meetings with colleagues and many other meetings with clients, and never once was the webcam on. It is an overrated part in laptops. Whoever needs to do video conferences buys a separate 4K60fps external webcam for it anyway.
  • nils_ - Saturday, April 11, 2020 - link

    I would like this very much in a mobile workstation, but I do need Thunderbolt 3 at least for my Docking Station. I can do without the dGPU.
  • dk404 - Saturday, April 11, 2020 - link

    +1 to AMD for their focus on perf per watt, more designs per socket, plus perf per $ (value). They are definitely leading the innovation in the laptop, desktop and also server markets...
    Now it's time for blue to wake up, even though it's too late...
  • SeanFL - Saturday, April 11, 2020 - link

    Wondering how long before we see some ultra tiny desktops using the new AMD laptop APUs, similar to the NUC, Lenovo ThinkCentre, or the HP EliteDesk. The aforementioned systems are great for almost anything except video editing. The new 4000 series chips would be fantastic in a tiny desktop. Please AMD.
  • realbabilu - Saturday, April 11, 2020 - link

    Is it hackintoshable?
    I wish Apple would also see AMD as a switch option too.
  • Dodozoid - Saturday, April 11, 2020 - link

    Awesome review, Dr. Cutress. There are two points that interest me. First: GPU-Z shows the iGPU connected via PCIe 4.0 x16. Are there any power/performance implications to that, or is it simply a misinterpretation of the Infinity Fabric?
    The other one is regarding the on-battery performance. There is an important piece of information missing (I am aware you show part of it in the battery life section): how long does it maintain that performance?
    (And maybe how much influence do the various power/performance settings have on the framerate/endurance tradeoffs?)
  • phoenix_rizzen - Sunday, April 12, 2020 - link

    Renoir doesn't have PCIe 4. That should say PCIe 3 x8 for the GPU.
