Ryzen 9 4900HS with DDR4-2666 and DDR4-3000

In our ASUS Zephyrus G14 we have a total of 16 GB of DDR4, split between a single 8 GB SO-DIMM module and 8 GB of memory soldered onto the board. ASUS will also offer a 16 + 16 version, although that configuration may come at a later date.

This memory is running at DDR4-3200, the speed AMD recommends for these processors. Through our inspection tools, we can see that it is running with subtimings of 22-22-22 and a command rate of 1T. The command rate is certainly good, however 22-22-22 is a little slower than what we would see on a desktop system at this speed, because this system conforms to JEDEC's subtiming requirements.
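For readers who want to double-check what their own memory is actually running at, a quick software sanity check is possible. Below is a minimal sketch (not the inspection tool we used) that queries WMI on Windows through the third-party wmi Python package; it reports each module's capacity, rated speed, and configured speed, but it cannot show subtimings or command rate:

```python
# Minimal sketch: list each memory module's rated vs. configured speed on
# Windows via WMI (requires the third-party "wmi" package: pip install wmi).
# Reports MT/s per module; subtimings and command rate need a vendor tool.
import wmi

def list_memory_modules():
    for dimm in wmi.WMI().Win32_PhysicalMemory():
        cap_gb = int(dimm.Capacity) / 2**30
        print(f"{dimm.DeviceLocator}: {cap_gb:.0f} GB, "
              f"rated {dimm.Speed} MT/s, running at {dimm.ConfiguredClockSpeed} MT/s")

if __name__ == "__main__":
    list_memory_modules()
```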

For our memory testing we wanted to see what speeds and capacities we could achieve. Corsair very kindly sent us a 16 GB DDR4-3000 module and a 32 GB DDR4-2666 module, which would give our system a total of 24 GB or 40 GB respectively. For a machine designed for heavier-duty workloads, having more than 16 GB is certainly welcome, as long as the performance hit isn't too severe.

I installed the 32 GB module, and the system booted first time with no fuss. A quick check confirmed that all of the capacity was detected, for a total of 40 GB. The speed was also as expected, at DDR4-2666 with subtimings of 20-19-19 and a 1T command rate.

However, when we put in the module of 16 GB DDR4-3000, to get a total of 24 GB, the detected speed inside the system was only DDR4-2666. Looking at the module settings, this was because the DDR4-3000 speed was actually an XMP profile, and ASUS has not enabled the ability to set XMP profiles here.

We were able to get DDR4-2666 on the 32 GB module because that is the module's base frequency and timing set. The same goes for the 8 GB module that came with the system – it was programmed so that its base SPD setting was DDR4-3200. If users want high-capacity modules at the faster DRAM speeds on this system, they will have to reprogram the primary SPD profile of their modules, which isn't an easy thing to do.
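For those curious what a module's SPD actually advertises before buying, it can be inspected read-only under Linux. The sketch below simply wraps the decode-dimms utility from i2c-tools; it assumes the ee1004 kernel module is loaded so the DDR4 SPD EEPROMs are visible, and the filter strings match typical decode-dimms output, so they may need adjusting for other versions:

```python
# Read-only peek at what a DIMM's SPD is programmed with, via the decode-dimms
# utility from i2c-tools (Linux). Assumes the ee1004 module is loaded so DDR4
# SPD EEPROMs appear on the SMBus; may need to be run as root.
import subprocess

KEYS = ("Fundamental Memory type", "Maximum module speed", "Part Number", "XMP")

def spd_summary():
    out = subprocess.run(["decode-dimms"], capture_output=True,
                         text=True, check=True).stdout
    for line in out.splitlines():
        if any(key in line for key in KEYS):
            print(line.strip())

if __name__ == "__main__":
    spd_summary()
```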

As a result, our tests come down to the 8 GB DDR4-3200 module that came with the system, compared against the 32 GB DDR4-2666 module. Note that the latter is an 8+32 configuration, which is expected to run in dual channel for the first 16 GB and then in single channel for the remaining 24 GB.
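One way to visualize why that asymmetric arrangement matters is a crude bandwidth probe: copy progressively larger buffers and watch the throughput. The snippet below is only an illustrative sketch (it is not part of our benchmark suite) and is approximate because the operating system decides where pages land, but buffers that spill past the dual-channel-interleaved region should trend toward lower throughput:

```python
# Crude memory-bandwidth probe: time copies of increasingly large buffers.
# Illustrative only; OS page placement makes results approximate, and the
# largest sizes assume a machine with plenty of free RAM (e.g. the 40 GB setup).
import time
import numpy as np

def copy_bandwidth_gbs(size_bytes, repeats=5):
    src = np.zeros(size_bytes, dtype=np.uint8)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        np.copyto(dst, src)
        best = min(best, time.perf_counter() - start)
    return (2 * size_bytes) / best / 1e9  # each pass reads and writes the buffer

if __name__ == "__main__":
    for gib in (1, 2, 4, 6, 8):
        print(f"{gib:>2} GiB buffer: {copy_bandwidth_gbs(gib * 2**30):6.1f} GB/s")
```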

Civilization 6 AI Test

[Charts: Civilization 6 AI Test, Civilization 6 AI Test (Low Power)]

With our AI test, there’s a ~20% benefit from having the faster memory, which decreases slightly when moved to a limited power budget.

Cinebench R20

We didn’t see any difference in something like Cinebench.

PCMark 10 Overall Score

There was more of a difference in PCMark 10, although PCMark 10 isn't that great at showing where the bottlenecks are.


Integrated Graphics Tests

[Charts: Civilization 6 (1080p Max, No MSAA), Civilization 6 (1080p Max, 8x MSAA)]

For the Civ 6 graphics test, the difference in performance between the two memory settings is really significant. This sort of game cares less about frame rate than most, however dropping to 22 FPS at 1080p Max with no MSAA means the user will probably have to dial the settings back a bit to get something more reasonable.

[Charts: Borderlands 3 (1080p Medium), Borderlands 3 (1080p Medium) Low Power]

Going from plugged in to not plugged in, we didn’t see much of a change with the slower memory, however the DDR4-3200 setting still gets a serious benefit over the DDR4-2666 arrangement.

[Chart: Final Fantasy XV (1080p Standard)]

For Final Fantasy, there is a significant change: moving up from DDR4-2666 to DDR4-3200 affords a +30% improvement.

Discrete Graphics Tests

[Charts: Civilization 6 (1080p Max, No MSAA), Civilization 6 (1080p Max, 8x MSAA), Borderlands 3 (1080p Medium), Final Fantasy XV (1080p Standard)]

In each case, the faster DRAM actually improves discrete graphics performance.

Quick Thoughts

Overall, 16 GB of memory isn't the best configuration for a system like this; people who need the power are likely going to want 32 GB. However, users adding their own fast module to the 16 GB version will have to be careful about performance: both the integrated graphics and the discrete graphics take a knock going down from DDR4-3200 to DDR4-2666.

