Testing the Ryzen 9 4900HS Integrated Graphics

Under the hood of the Ryzen 9 4900HS, aside from the eight Zen 2 cores, is an enhanced Vega 8 graphics solution. For this generation of mobile processors, AMD has capped the number of compute units at eight, whereas the previous generation went up to Vega 11. From the name alone, one might assume that AMD has lowered the performance of the integrated graphics. This is not the case.

For the new Ryzen Mobile 4000 processors, the Vega graphics are enhanced in three main ways over the previous generation. First, the GPU is built on the 7nm process node, and AMD put a lot of effort into physical design, allowing for a more optimized implementation with a wider voltage/frequency window than before. Second, and somewhat connected, is frequency: the new processors top out at 1750 MHz rather than 1400 MHz, which alone would give a 25% boost with all other things being equal. Third is memory: the new platform supports up to DDR4-3200 rather than DDR4-2400, providing an immediate bandwidth boost, which is exactly what integrated graphics craves. There is also the nature of the CPU cores themselves, with larger L3 caches, which often helps integrated graphics workloads that interact heavily with the CPU.
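For anyone who wants to put rough numbers on those two uplifts, here is a minimal back-of-the-envelope sketch in Python. It assumes a standard dual-channel, 64-bit-per-channel DDR4 configuration and uses the clocks and memory speeds quoted above; real games will not scale linearly with either figure.

```python
# Rough uplift estimates for Renoir's enhanced Vega 8 versus the previous
# generation. Assumes dual-channel DDR4 with a 64-bit bus per channel.

def peak_bandwidth_gbps(mt_per_s: int, channels: int = 2, bus_bits: int = 64) -> float:
    """Theoretical peak DRAM bandwidth in GB/s for a given transfer rate (MT/s)."""
    return mt_per_s * channels * (bus_bits / 8) / 1000

old_clock, new_clock = 1400, 1750   # MHz, peak GPU frequency (previous gen vs Renoir)
old_mem, new_mem = 2400, 3200       # MT/s, supported DDR4 speed

print(f"GPU clock uplift:   {new_clock / old_clock - 1:.0%}")           # 25%
print(f"DDR4-2400 peak BW:  {peak_bandwidth_gbps(old_mem):.1f} GB/s")   # 38.4 GB/s
print(f"DDR4-3200 peak BW:  {peak_bandwidth_gbps(new_mem):.1f} GB/s")   # 51.2 GB/s
print(f"Bandwidth uplift:   {new_mem / old_mem - 1:.0%}")               # 33%
```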

Normally, with the ASUS Zephyrus G14, switching between the integrated graphics and the discrete graphics should be automatic. There is a setting in the NVIDIA Control Panel to let the system auto-switch between integrated and discrete, and we would expect the system to be on the IGP when off wall power, but on the discrete card when gaming (note that we had issues in our battery life test where the discrete card stayed on, although ASUS could not reproduce the issue). Because the NVIDIA Control Panel setting did not seem to catch all of our tests and force them onto the integrated graphics, we went into Device Manager and disabled the NVIDIA graphics adapter outright.
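For anyone who would rather script that last step than click through Device Manager, the sketch below shows one way to do it from Python by shelling out to Windows' standard PnpDevice PowerShell cmdlets. This is purely illustrative of our manual process, not an ASUS or NVIDIA tool: the adapter instance ID is a placeholder you would fill in from the first command's output, and an elevated (administrator) prompt is required.

```python
# Illustrative sketch: disable the discrete NVIDIA GPU so benchmarks fall back
# to the integrated Vega 8, using Windows' PnpDevice cmdlets via PowerShell.
# Run from an elevated prompt; re-enable later with Enable-PnpDevice.
import subprocess

def run_ps(command: str) -> str:
    """Run a PowerShell command and return its standard output."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Step 1: list the display adapters to find the NVIDIA adapter's InstanceId.
print(run_ps("Get-PnpDevice -Class Display | Format-List FriendlyName, InstanceId"))

# Step 2: disable the discrete GPU (placeholder: paste the InstanceId from step 1).
nvidia_instance_id = "<NVIDIA adapter InstanceId from step 1>"
run_ps(f"Disable-PnpDevice -InstanceId '{nvidia_instance_id}' -Confirm:$false")
```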

This left us with AMD’s best integrated graphics in its Ryzen Mobile 4000 series: an enhanced Vega 8 at 1750 MHz, paired with DDR4-3200.


Renoir with Vega 8 – updated to 20.4 after this screenshot was taken

Our comparison point here is actually a fairly tricky one to set up. Unfortunately we do not have a Ryzen 7 3750H from the previous generation for comparison, but we do have an Honor MagicBook 14, which has a Ryzen 5 3500U.


Picasso with Vega 8

This is a 15 W processor, running its Vega 8 at 1200 MHz with DDR4-2400 memory, which again makes the comparison a little tricky, but it is better than comparing against the Intel UHD 630 graphics in the Razer Blade.

We also re-ran the benchmarks on the latest drivers with AMD's 65 W desktop APUs, the Ryzen 5 3400G (with Vega 11) and the Ryzen 3 3200G (with Vega 8). These are running at DDR4-2933, the maximum officially supported by these APUs (which means anything above this is memory overclocking).

Benchmark charts: Civilization 6 (1080p Max, No MSAA); Civilization 6 (1080p Max, 8x MSAA)

The gap in the Civilization 6 results is substantial, no joke.

Benchmark charts: Borderlands 3 (1080p Medium); Final Fantasy XV (1080p Standard); Counter-Strike: Source (1080p Max)

Hopefully we will get more variants of the Ryzen integrated graphics to test, along with an Ice Lake system.

Comments

  • Deicidium369 - Sunday, April 12, 2020 - link

    Nah you and I know it's not about hiding something - I have those Samsung TVs and they get the same round piece of black tape that all of my Webcams get unless they have a shutter.
  • shady28 - Saturday, April 11, 2020 - link

    ^^
    That. Outside of playtime, nobody uses webcams.
    That's the moment you realize 90% of the people here haven't worked much beyond burger flipping. Neither where I work nor any of our vendors use them when doing presentations or meetings. It is all screen sharing, file and folder sharing, Teams or Skype chat, etc. In fact, most developers I know actually put a piece of tape over their webcam, just in case.
  • Icehawk - Sunday, April 12, 2020 - link

    With our push to WFH due to C-19, everyone and their mother is asking us to enable their cameras, but I agree, in actual meetings it’s maybe 10% at best that use it. Hell, 1/3rd of the people don’t even log in properly so you can see who they are.

    If you are going to include a camera at least integrate a shutter.
  • shady28 - Sunday, April 12, 2020 - link

    I don't even see 10%. I see near 0%; when someone turns on a webcam they instantly get messages saying 'Hey, you know your webcam is on?' - because it serves no purpose in most settings except to distract, annoy, disrupt, and lag.
    The only place I see it having value is for people who want 'face chat' / 'facetime' type scenarios with friends / family. Those are entirely social, and I would say 90% of people using a laptop for personal use don't care about that either (I don't, nor do many others I know). That's what the 2nd cam on your phone / tablet is for - that's where I see it being used, for family and friends.
  • erple2 - Sunday, April 26, 2020 - link

    Unless there's more than about 15 people in a telecon, we _usually_ always turn on the webcam. In a WFH situation where there's really only 1-3 other people that you see on a daily basis, I find it helpful to continue to "see" my coworkers. I'm older than most of my coworkers (I work in software), and I prefer in-person talking to slack or text-based communication, and I find that webcams help keep me more engaged in the particular meeting. Note - most of what we do with a "telecon" includes screensharing, too, but I find it much easier to gauge reactions and the other non-verbal communication if you can also see each other. So I would agree that a webcam is important.

    That having been said, I find basically all of the webcams that exist on laptops to be pretty crummy, and thus I use a separate webcam from the one that comes on my work computer (Macbook Pro). Though that doesn't stop my coworkers from using the terribad one that comes on their laptops.
  • Irata - Friday, April 10, 2020 - link

    Even on laptops that have one built in, I use a USB webcam. Much better quality, and I know when I can be filmed and when not.

    The downside is that you are losing a USB port.
  • Qasar - Friday, April 10, 2020 - link

    Which could be gained back, plus a few more, by getting a small USB hub like I did from Kensington: 4 ports, 1" x 3", with a short 6" cable. A nice little portable USB 3 hub, and it works just fine :-)
  • yeeeeman - Thursday, April 9, 2020 - link

    How is the fan noise situation?
  • 1_rick - Thursday, April 9, 2020 - link

    According to a review elsewhere, it's noticeable but not obnoxious (no coil whine etc., just air whooshing). YMMV of course.
  • rrinker - Thursday, April 9, 2020 - link

    Impressive machine - but I have to laugh. OK, they get the (S)pecial 10 watt lower CPU - and then shove about 10 watts of LEDs on the cover....
