Ryzen 9 4900HS with DDR4-2666 and DDR4-3000

In our ASUS Zephyrus G14, we have a total of 16 GB of DDR4. This is split between a single 8 GB SO-DIMM module and 8 GB of memory soldered onto the board. ASUS will offer a version with 16 GB + 16 GB, though this might come at a later date.

This memory is running at AMD's recommended speed for these processors, DDR4-3200. Through our inspection tools, we can tell that it is running with subtimings of 22-22-22 and a command rate of 1T. The command rate is certainly good, however 22-22-22 is a little slower than what we see on a desktop system running at this speed, because here we have a system that conforms to JEDEC's subtiming requirements.
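To put the subtimings in context, absolute latency is just cycles multiplied by the clock period. Below is a minimal sketch of that arithmetic; the desktop CL16 kit is an assumed comparison point rather than something we tested here.

```python
# Quick sketch: convert DRAM data rate + CAS latency into absolute latency.
# The memory clock runs at half the data rate (DDR), so the clock period
# in nanoseconds is 2000 / data_rate_in_MT_s.

def cas_latency_ns(data_rate_mt_s: float, cas_cycles: int) -> float:
    """Absolute CAS latency in nanoseconds."""
    return cas_cycles * 2000.0 / data_rate_mt_s

for name, rate, cl in [
    ("DDR4-3200 CL22 (JEDEC, as in the G14)", 3200, 22),
    ("DDR4-3200 CL16 (typical desktop kit, assumed)", 3200, 16),
    ("DDR4-2666 CL20 (the 32 GB SO-DIMM discussed below)", 2666, 20),
]:
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
```

At DDR4-3200, CL22 works out to roughly 13.8 ns against 10 ns for a CL16 desktop kit, which is the gap being referred to above.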

For our memory testing we wanted to see what speeds and capacities we could achieve. Corsair very kindly sent us 16 GB DDR4-3000 and 32 GB DDR4-2666 SO-DIMMs. These would give our system a total of 24 GB or 40 GB respectively, and for a machine designed for heavier-duty workloads, having more than 16 GB is certainly welcome, as long as the performance hit isn't too severe.

I installed the 32 GB module, and the system booted first time with no fuss. A quick check confirmed that all of the capacity was detected, giving a total of 40 GB. The speed was also as expected, at DDR4-2666 but with subtimings of 20-19-19 1T.

However, when we installed the 16 GB DDR4-3000 module, for a total of 24 GB, the detected speed inside the system was only DDR4-2666. Looking at the module settings, this was because the DDR4-3000 speed is actually an XMP profile, and ASUS has not enabled the ability to apply XMP profiles here.

We were able to get DDR4-2666 on the 32 GB module because this is the module's base frequency and settings. The same goes for the 8 GB module that came with the system: it was flashed so that the base SPD setting was DDR4-3200. If users want high-capacity modules at the faster DRAM speeds on this system, they will have to reprogram the primary SPD profile of their modules, which isn't an easy thing to do.
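For readers curious what the primary SPD profile encodes, below is a minimal sketch of how the base (JEDEC) data rate is derived from a DDR4 SPD dump. It assumes the standard DDR4 SPD layout, with tCKmin stored at byte 18 in 125 ps units and a signed 1 ps fine offset at byte 125; the example bytes are illustrative, not read from our modules.

```python
# Minimal sketch: derive the base (JEDEC) data rate from a DDR4 SPD dump.
# XMP ratings live in a separate, vendor-defined region of the SPD, so a
# firmware that applies only the base profile (as here) ignores a module's
# DDR4-3000 rating.

def base_data_rate(spd: bytes) -> int:
    """Return the base data rate in MT/s encoded in a DDR4 SPD dump."""
    mtb_ps = spd[18] * 125                                    # tCKmin in 125 ps units
    fine_ps = spd[125] - 256 if spd[125] > 127 else spd[125]  # signed 1 ps offset
    tck_ps = mtb_ps + fine_ps                                 # minimum clock period
    return int(2_000_000 / tck_ps)                            # DDR: rate = 2 / tCK

# Illustrative dump: tCKmin = 6 * 125 ps = 750 ps, i.e. DDR4-2666
spd = bytearray(256)
spd[18], spd[125] = 6, 0
print(base_data_rate(bytes(spd)))   # 2666
```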

As a result, our tests come down to comparing the 8 GB DDR4-3200 module that came with the system against the 32 GB DDR4-2666 module. Note that the latter is an 8+32 configuration, which is expected to run in dual channel for the first 16 GB and then single channel for the remaining 24 GB.
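To put that asymmetric arrangement in perspective, here is a back-of-the-envelope sketch of peak theoretical bandwidth in each mode, assuming the standard 64-bit (8-byte) DDR4 channel width.

```python
# Peak theoretical DRAM bandwidth: data rate (MT/s) * 8 bytes per channel.

def peak_bandwidth_gb_s(data_rate_mt_s: int, channels: int) -> float:
    return data_rate_mt_s * 8 * channels / 1000  # GB/s

print(peak_bandwidth_gb_s(3200, 2))  # 8+8 GB DDR4-3200, dual channel:   51.2 GB/s
print(peak_bandwidth_gb_s(2666, 2))  # 8+32 GB DDR4-2666, first 16 GB:  ~42.7 GB/s
print(peak_bandwidth_gb_s(2666, 1))  # 8+32 GB DDR4-2666, upper 24 GB:  ~21.3 GB/s
```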

Civilization 6 AI Test

[Graph: Civilization 6 AI Test]
[Graph: Civilization 6 AI Test (Low Power)]

With our AI test, there’s a ~20% benefit from having the faster memory, which decreases slightly when moved to a limited power budget.

Cinebench R20

We didn’t see any difference in something like Cinebench.

PCMark 10 Overall Score

There was more of a difference in PCMark 10, although PCMark 10 isn't that great at showing where the bottlenecks are.

 

Integrated Graphics Tests

[Graph: Civilization 6 (1080p Max, No MSAA)]
[Graph: Civilization 6 (1080p Max, 8x MSAA)]

For the Civ 6 graphics test, the difference in performance between the two memory settings is really significant. This sort of game cares less about frame rate than most, but dropping to 22 FPS at 1080p Max with no MSAA means the user will probably have to dial the settings back a bit to get something more reasonable.

[Graph: Borderlands 3 (1080p Medium)]
[Graph: Borderlands 3 (1080p Medium, Low Power)]

Going from plugged in to unplugged, we didn't see much of a change with the slower memory; however, the DDR4-3200 setting still gets a serious benefit over the DDR4-2666 arrangement.

Final Fantasy XV (1080p Standard)

For Final Fantasy XV there is a significant change: moving up from DDR4-2666 to DDR4-3200 affords a +30% improvement.

Discrete Graphics Tests

[Graph: Civilization 6 (1080p Max, No MSAA)]
[Graph: Civilization 6 (1080p Max, 8x MSAA)]
[Graph: Borderlands 3 (1080p Medium)]
[Graph: Final Fantasy XV (1080p Standard)]

In each case, the faster DRAM actually improves discrete graphics performance.

Quick Thoughts

Overall, 16 GB of memory isn't the best configuration for a system like this; people who need the power are likely going to want 32 GB. However, users who buy the 16 GB version and add their own module will have to be careful about performance: both the integrated graphics and the discrete graphics take a knock going down from DDR4-3200 to DDR4-2666.

Comments

  • Qasar - Thursday, April 9, 2020 - link

    peachncream, I actually have opened the notebooks up after I have had them for a while, and just unplugged the ribbon cable, and then removed the tape I put over it :-)
    "If the Armoury Crate option is enabled in the BIOS it will ask to install it." Problem solved, just disable it in the BIOS. Did you even read this part? :-) :-) I bet the finger scanner could be disabled as well... but you would probably stick with the ancient notebook you have anyway, so no difference :-)
  • PeachNCream - Thursday, April 9, 2020 - link

    I did miss the part where the installer could be disabled. Thanks for catching that. As for disabling finger readers, that's a setting I don't really trust to work. A physical barrier is really the only sure way to keep yourself safe.

    In the end, you are right. I will likely use older hardware; however, as time moves forward that older hardware ends up being pretty useless, so I get newer older hardware. Security holes like these tend to percolate down to the secondary market over time, so I hope that the integration of print scanners remains niche, but with falling costs and slow yet steady spread it may one day be hard even for information security professionals like us to avoid this sort of hole.
  • eastcoast_pete - Thursday, April 9, 2020 - link

    I actually share your dislike for a built-in webcam that doesn't have a slider integrated into it. Unfortunately, that seems to be the last thing on the mind of many laptop designers. I would like a webcam in my laptop, as I often have to videoconference with clients, even when we're not under a "shelter in place" order.
  • Fataliity - Friday, April 10, 2020 - link

    Just use a phone or buy a decent webcam for 20-50 bucks. The quality of built-in laptop cameras is horrendous. Literally anything is better.
  • Kamen Rider Blade - Thursday, April 9, 2020 - link

    Dr. Ian Cutress, your Inter-Core Latency table might have a few mistakes on it!!!!

    How is it that a 3900X can have consistent latency when it crosses CCX/CCD boundaries:
    https://i.redd.it/mvo9nk2r94931.png

    Yet your 3950X has Zen+ like latency when it crosses CCX/CCD boundaries?

    Did you screw up your table when you copied & pasted?
  • mattkiss - Thursday, April 9, 2020 - link

    For Zen 2 desktop CPUs, the CCD/IOD link for memory operations is 32B/cycle for reading and 16B/cycle for writing. I am curious what the values are for the Renoir architecture. Also, I would be interested in seeing an AIDA64 memory benchmark run for the review system at both DDR4-3200 and DDR4-2666.
  • Khato - Thursday, April 9, 2020 - link

    The investigation regarding the low web browsing battery life result on the Zephyrus G14 is quite interesting. One question though, was the following statement confirmed? "With the Razer Blade, it was clear that the system was forced into a 60 Hz only mode, the discrete GPU gets shut down, and power is saved."

    Few reasons for that question. The numbers and analysis in this article piqued my curiosity due to how close the Razer Blade and 120Hz Asus Zephyrus numbers were. Deriving average power use from those run times plus battery capacity arrives at 16.3W for the 120Hz Asus Zephyrus, 14W for the Razer Blade, and 6.1W for the 60Hz Asus Zephyrus. So roughly a 10W delta for increased refresh rate plus discrete graphics. Performing the same exercise on the recent Surface Laptop 3 review yields 6.1W for the R7 3780U variant and 4.5W for the i7 1065G7 variant. Note that the R7 3780U variant shows same average power consumption as the 60Hz Asus Zephyrus, while the Razer Blade is 9.5W/3x higher than the i7 1065G7 variant. It makes no sense for Intel efficiency to be that much worse... unless the discrete graphics is still at play.

    The above conclusion matches up with the only laptop I have access to with discrete graphics, an HP zbook G6 with the Quadro RTX 3000. On battery with just normal web browser and office applications open the discrete graphics is still active with hwinfo reporting a constant 8W of power usage.
  • Fataliity - Friday, April 10, 2020 - link

    That's partly because a Zen 2 core uses much less power to ramp up to turbo. Intel can hit their 35W budget from one core going up to 4.5-4.8 GHz. Ryzen can hit 4.4 GHz at about 12W. And it turbos faster too. So it uses less power, finishes the job quicker, is more responsive, etc.

    For an example, look at the new Galaxy S20 review on here with the 120 Hz screen. When it's turned on, it shaves off over 50% of the battery life.
  • Khato - Friday, April 10, 2020 - link

    Those arguments could have some merit if the results were particular to the web browsing battery life tests. However, the exact same trend exists for both web browsing and video playback, and h.264 playback doesn't require a system to leave minimum CPU frequency. This is clear evidence that the difference in power consumption has nothing to do with compute efficiency of the CPU, but rather the platform.

    Regarding the comparison to the S20. Performing the same exercise of dividing battery Wh by hours of web browsing battery life run time for the S20 Ultra with Snapdragon 865 arrives at 1.37W at 60Hz and 1.7W at 120Hz. Even if you assumed multiplicative scaling that would only increase the 6.1W figure for the 60Hz Asus Zephyrus up to 7.6W... and it's not multiplicative scaling.

    As far as I can tell from my own limited testing, Optimus simply isn't working like it should. It's frequently activating the discrete GPU on trivial windows workloads which could easily be handled by the integrated graphics. My guess is that this is the normal state for Intel based windows laptops with discrete NVIDIA graphics. Wouldn't necessarily affect AMD as driver setup is different, which is definitely a selling point for AMD unless Intel/NVIDIA take notice and fix their driver to match.
  • eastcoast_pete - Thursday, April 9, 2020 - link

    Thanks Ian, glad I waited with my overdue laptop refresh! Yes, it'll be Intel outside this time, unless the i7 plus dGPU prices come down a lot; the Ryzen 4800/4900 are the price/performance champs in that segment for now.
    The one fly in the ointment is the omission of a webcam in the Zephyrus. I can do without (and prefer to skip) the LED bling on the lid cover, but I really need a webcam, especially right now with "shelter in place" due to Covid. However, I don't think ASUS designed the Zephyrus with someone like me in mind. Too bad; maybe another Ryzen 4800 laptop will fit the bill.
