Display Evaluation & Power

The displays of the Galaxy S9 don’t change much compared to the Galaxy S8’s. The S9 uses an upgraded AMOLED DDIC, the S6E3HA8, instead of last year’s S6E3HA6. The new panels are the AMB577PX01 on the Galaxy S9 and the AMB622NP01 on the Galaxy S9+.

A nice addition with Samsung’s Android 8.0 OS on the Galaxy S8 and Galaxy S9 is the ability to have fine-grained control over colour temperature – although it’s disappointing to see that this is limited to the Adaptive Display colour mode. The remaining colour modes continue Samsung’s tradition of providing different colour space targets. Adaptive Display is an intentionally wide gamut mode that doesn’t correspond to any standard. AMOLED Cinema targets the DCI P3 colour space, AMOLED Photo targets Adobe RGB, and the simple “Basic” mode targets sRGB accuracy.

One big introduction of Android 8.0 was supposed to be the inclusion of wide colour gamut colour management support. This was enabled on Google’s Pixel 2 devices. To find out how the Galaxy S9 behaves I wrote a quick app which checks Android’s APIs. Unfortunately the Galaxy S9 doesn’t have any colour management support, and switching colour modes through the APIs does nothing. There are still a lot of questions remaining in terms of wide gamut support on Android, particularly for Samsung devices, which make extensive use of colour management and display modes through the mDNIe solutions on their AMOLED devices.
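For those curious, below is a minimal sketch of the sort of check such an app can perform using the wide-gamut APIs that Android 8.0 exposes; the class name and log tag are illustrative, and on a device without colour management the colour mode request is expected to be a no-op:

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.os.Build
import android.os.Bundle
import android.util.Log

class GamutCheckActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            // Does the display itself advertise wide colour gamut support?
            val displayWide = windowManager.defaultDisplay.isWideColorGamut
            // Is the current configuration actually rendering in wide gamut?
            val configWide = resources.configuration.isScreenWideColorGamut
            Log.d("GamutCheck", "display wide gamut: $displayWide, config wide gamut: $configWide")

            // Request the wide-gamut colour mode for this window; on a device
            // without colour management this should simply have no effect.
            window.colorMode = ActivityInfo.COLOR_MODE_WIDE_COLOR_GAMUT
        }
    }
}
```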

There are plenty of reasons why Samsung could have decided not to enable support, some of which are hardware requirements on the display pipeline. The Exynos 9810 and Snapdragon 845 should both support 10-bit display pipelines (not necessarily a requirement, but it simplifies things), but the S6E3HA6, for example, was still an 8-bit DDIC, which complicates things and requires tone remapping and possibly dithering techniques. The situation is a bit of a conundrum, and it’ll probably take some time before Samsung introduces a full 10-bit wide gamut software-to-display device. The Galaxy S9 for now remains the same as the previous generation in terms of colour gamuts and colour depths.
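As a rough illustration of what that remapping and dithering step can involve (this is a generic sketch, not Samsung’s actual pipeline), quantising a 10-bit code value down to the 8 bits an older DDIC accepts might look like this:

```kotlin
// Simplified sketch only. Quantises a 10-bit code value (0..1023) down to
// 8 bits (0..255) using a 2x2 ordered-dither threshold, so that the average
// over a small pixel neighbourhood still approximates the original level.
fun dither10to8(value10: Int, x: Int, y: Int): Int {
    val bayer2x2 = intArrayOf(0, 2, 3, 1)              // spatial threshold pattern
    val threshold = bayer2x2[(y % 2) * 2 + (x % 2)]    // 0..3 depending on pixel position
    val truncated = value10 shr 2                      // drop the two least significant bits
    val remainder = value10 and 0x3                    // the precision that was lost (0..3)
    val bump = if (remainder > threshold) 1 else 0     // occasionally round up instead of down
    return (truncated + bump).coerceAtMost(255)
}
```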

And as always, measurements are performed with an X-Rite i1Pro 2 spectrophotometer, with the exception of black levels which are measured with an i1Display Pro colorimeter to achieve the most accurate results possible in an area where the i1Pro 2 can be somewhat unreliable. Data is collected and examined using SpectraCal's CalMAN software.

 
[Charts: sRGB (Basic mode) – SpectraCal CalMAN]

The Galaxy S9 and S9+ display generally the same characteristics as the Galaxy S8. Minimum brightness goes down to 1.5 cd/m², while manual maximum brightness tops out at 300-320 cd/m² depending on colour mode. Auto-brightness boost in bright conditions will overdrive the panel to up to 625 cd/m² at 100% APL (full white), which is the same brightness as the S8 in both manual and auto modes.

The Galaxy S9 and S9+ units I have here still suffer from slightly too low colour temperatures in both sRGB and DCI P3 modes, coming in at around 6250K, slightly better than the Galaxy S8 unit I have, which was also too red at 6150K. Samsung’s Adaptive Display mode defaults to a higher colour temperature of 7000K; however, in that mode it’s a non-issue as you’re able to adjust the colour balance to your preference. The Galaxy S9 and S9+ showcase better total gamma than the S8 units I have here, coming in at 2.23 on the S9s versus 2.13 on the S8.
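As a quick worked example of what that gamma difference means near black (assuming an idealised power-law response, which real panels only approximate), a 10% grey input ends up noticeably dimmer on the S9:

```kotlin
import kotlin.math.pow

// Relative luminance of a grey level under an ideal power-law (gamma) response.
fun relativeLuminance(signal: Double, gamma: Double): Double = signal.pow(gamma)

fun main() {
    val input = 0.10                                   // a 10% grey input signal
    val s9 = relativeLuminance(input, 2.23)            // ≈ 0.0059 → ~0.59% of peak white
    val s8 = relativeLuminance(input, 2.13)            // ≈ 0.0074 → ~0.74% of peak white
    println("10%% grey: S9 ≈ %.2f%%, S8 ≈ %.2f%% of peak white".format(s9 * 100, s8 * 100))
}
```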

Greyscale accuracy is good, even though we’re veering slightly too much into the reds because of the lower-than-target colour temperature.

In direct sunlight the S9s retain fantastic readability thanks to the high-brightness mode that the phone switches to. In this mode the display ignores the selected display mode and goes into a special, very saturated, very low gamma mode to improve legibility.


[Charts: sRGB (Basic mode) & DCI P3 (AMOLED Cinema) saturation sweeps – SpectraCal CalMAN]

In terms of gamut and saturation accuracy the S9+ behaves excellently in sRGB and DCI P3 modes. However, I did see that the mid-level red saturation points were too high, and this prevented the S9s from reaching lower overall dE2000 figures. Because I measured the exact same deviation on both the S9 and S9+, I believe this to be a calibration issue of the mDNIe profiles rather than an issue of the panels, one which Samsung could theoretically fix through software if they wanted to (along with the colour temperature being too red).

 
[Charts: sRGB (Basic mode) & DCI P3 (AMOLED Cinema) GretagMacbeth ColorChecker (GMB) – SpectraCal CalMAN]

In the GMB charts the Galaxy S9 and S9+ again posted identical figures, with overall good accuracy in sRGB and DCI P3 modes. Again, the biggest misalignments here happen in the red tones, as they are more saturated than they should be.

Something we’ve never covered before is a certain behaviour of AMOLED screens at low brightness and with dark content. Samsung has for generations had issues with transitions between complete black (pixels off) and the lowest-level colours. Now that there’s more competition in the OLED scene, both from LG in terms of panels and from Apple with the iPhone X, I found it interesting to compare how the different devices behave.

This sort of evaluation is extremely hard to capture, as it can’t really be measured with tools and distilled into a single quantified figure. I resorted to simply capturing the phones’ screens with a DSLR at long exposure times. Alongside the long exposures, which exaggerate the brightness of the scene compared to the reference image, I also included the same image captures with shadow clipping highlighted, showcasing the areas of the screens that are completely black. The phones had all been calibrated to a fixed 20 cd/m² brightness to allow an apples-to-apples comparison.

The Galaxy S9 and S9+ are both darker and more even in brightness than the Galaxy S8. This matches our measurements, which showed the S9s have higher gamma than the S8 (for our units). The Galaxy S9s provided a better representation of the source material than the S8. There is a difference between the S9 and S9+, as the S9+ seemed to have a higher gamma or more clipping between black and the darkest areas. The problem here is that this clipping gradient isn’t smooth enough, and in motion this results in very noticeable moving artefacts.

The iPhone X behaved very differently from any of the Samsung devices and provided a significant image quality advantage in dark scenes. Looking at the shadow clipping highlighting, we can see that Apple is doing some very fine dithering between fully dark areas and the next-highest brightness levels. When in motion the iPhone X just provides an extremely good experience in dark scenes, with little to no visible artefacts.

The Pixel 2 XL comes with an LG panel and DDIC. The results here are a complete mess: not only does the Pixel 2 XL have issues with the dark areas, but the gamma curve at low levels is far too high, and this clips actual detail of the image into complete black. The LG V30 has the same issues, and I hear this is a hardware limitation in the way LG handles brightness control through PWM – it’s not able to retain sufficient ADC bit-depth resolution at low brightness, which causes a more compressed image.

Apple’s screen also doesn’t suffer from the “purple smudging” seen on Samsung’s panels when transitioning between black areas. This seems to be caused by a lag in the response time of the blue subpixels, which aren’t able to shut off quickly enough. The point here is that if Apple can handle dark scenes at low brightness levels with good quality, then so should Samsung, so here’s hoping Samsung’s engineers can focus on this issue and improve it in future generations.

Screen Luminance Power Efficiency (100% APL / White)

Device           Luminance Power at 200 cd/m²   Efficiency: Power (mW) / Area (cm²)
Galaxy Note 5    504 mW                          5.64
Galaxy S6        442 mW                          5.99
Galaxy S9        563 mW                          6.69
Galaxy S8        590 mW                          7.01
Galaxy S5        532 mW                          7.21
Galaxy Note 4    665 mW                          7.22
Galaxy S5 LTEA   605 mW                          8.20
Galaxy S4        653 mW                          9.22
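For reference, the efficiency column is simply the measured luminance power divided by the emitting screen area. The sketch below shows the arithmetic; the 5.8-inch, 18.5:9 panel dimensions are my own approximation for the Galaxy S9 and won’t exactly reproduce the table’s figure:

```kotlin
import kotlin.math.sqrt

// Screen area from diagonal and aspect ratio; values below are assumptions.
fun screenAreaCm2(diagonalInches: Double, aspectW: Double, aspectH: Double): Double {
    val diagonalCm = diagonalInches * 2.54
    val unit = diagonalCm / sqrt(aspectW * aspectW + aspectH * aspectH)
    return (aspectW * unit) * (aspectH * unit)
}

fun main() {
    val areaCm2 = screenAreaCm2(5.8, 9.0, 18.5)        // ≈ 85 cm²
    val efficiency = 563.0 / areaCm2                   // ≈ 6.6 mW/cm², vs 6.69 in the table
    println("area ≈ %.1f cm², efficiency ≈ %.2f mW/cm²".format(areaCm2, efficiency))
}
```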

A big question I wanted to see answered is whether the Galaxy S9 had improved in terms of power consumption and efficiency. As it stands, power on the S8 and S9 was nearly identical, and the measured difference was within 5%. We haven’t seen an improvement in AMOLED emission power efficiency in a few generations now, so I do wonder whether my projection from three years ago of AMOLEDs surpassing LCDs in overall efficiency has actually come to pass. I didn’t have time to go in-depth into other current generation devices for this article, but I’ll make sure to give an update in a separate piece in the near future.

Overall the Galaxy S9’s screens behave mostly the same as the ones on the Galaxy S8. The only visible difference between the screens is the higher gamma at low brightness levels, which slightly improves quality. The Galaxy S9’s screens are still among the best on the market, and I don’t really see any deal-breaking issues with the phones in that regard.

Comments

  • id4andrei - Tuesday, March 27, 2018

    All reviewers go gaga for geekbench scores with iphones/ipads as well. In this case the GB scores prove that at least in chip design Samsung has made a huge leap. As the review has outlined, the problem lies with the scheduler and DVFS which Samsung can and should address.

    If "Samdung" is so bad at hardware design, how do you call Apple's high priced iphones of the last 3 years that could not sustain chip performance and had to be throttled so as to not crap out. All initial reviews were glowing but they were all impervious to the impeding throttling.
  • name99 - Tuesday, March 27, 2018

    Dude, you really do yourself no favors by struggling so hard to criticize Apple.
    Apple's throttling has NOTHING to do with the CPU per se (i.e. the CPU is not generating excessive heat beyond spec, or being throttled because it has been running too fast for too long); it has to do with the BATTERY and with a concern that, if CPU performance were to spike, the battery could not supply enough current.

    Very different problem, nothing to do with the CPU design. A real problem yes but totally irrelevant to the issues being discussed here.
  • Matt Humrick - Wednesday, March 28, 2018

    Apple's big CPU and GPU are susceptible to thermal throttling when running sustained workloads too.

    Also, having to throttle a processor within a year of sale because its transient current requirements overwhelm the power delivery system is most definitely a design flaw.
  • Icehawk - Friday, April 6, 2018

    My wife’s 6S is still working at 100% after several years; I get the feeling the number of people affected is overblown, as pretty much anything anti-Apple is. I do think Apple needs to look at a better way of dealing with this, but it’s also not the armageddon some make it out to be. I am far from an Apple fanboy, but I do like their iOS products - I am sure someone will make a retort of that nature. I’d say the same thing about the Samsung chip: not great, but it is performant. Perhaps if we stop thinking each year's new phone should blow us away, it would help us be more realistic.
  • Lavkesh - Tuesday, March 27, 2018

    "In this case the GB scores prove that at least in chip design Samsung has made a huge leap" - Please explain huge leap here? The new chip barely outperforms the older SOC.
  • ZolaIII - Monday, March 26, 2018

    I am very disappointed with both SoC's. Qualcomm wasted so much space on a bad L4 cache which only added to latency & generally wasted more. The 30% is enormous even if the new A75 cores are 35% bigger (it would be 50% with ARM's reference L2 cache size). I don't know about the A630 vs A540 size, but if it grew by let's say 10%, the cores & GPU together would account for around 15~20%, leaving the L3 & L4 responsible for the rest. It would have been much better if they had used that area for the GPU, as it could have been 2x the size then. I am also very disappointed with the new cache hierarchy, as it turns out to be stupid and a waste of silicon.

    Seems to me neither SoC uses a good scheduler or good scheduling settings. By the looks of things, Samsung used the CAF HMP sched settings for the Snapdragon SoC – a very aggressively patched interactive governor without any restraints whatsoever & no hotplugging whatsoever, which is very far south of optimal; the reference QC platform seems to have at least used hotplugging (as there is no other way to explain the difference of almost 1W in GPU testing, with two vs four A75's active). On the other hand, Samsung seems to have used the power-aware scheduler instead of HMP, together with very granular hotplugging, producing very bad results, as those two things directly conflict & when mashed together can only end in catastrophe.

    I prefer HMP configured with limited task packing and high-priority tasks enabled, with a significant increase of the time interval for it (so that it can skip the CPU sched limit); for the CPU governor, traditional unpatched interactive with three-step load limitations (idle, so that it doesn't jump erratically on any background task; ideal, which is considered the best sustainable leakage for the given lithography; & max sustainable for two cores [only on the big cores]). I also use boost enabled & set to the ideal frequency (same as in interactive). I prefer core_ctl hotplugging disabled for two little & two big cores so that they never get switched off by it. I won't go further into details about it here as it's pointless. I find this approach balanced between always-available/needed/total performance, as most of the time two of each core are enough for most tasks, & if not it's not a biggie to wait for the other two to kick in. There is a minor drawback in responsiveness on light tasks, but it actually works as fast as possible on hard ones flagged as heavy tasks, like for instance Chrome rendering. It's also very beneficial to GPU workloads, where even switching off two little cores and giving even 100~150mV of headroom to the GPU means a lot.

    Sorry for getting a bit deep into how the complete scheduling mechanism should be done, but I had an urge to explain it as it's so terribly done in both of the cases examined here.
  • tuxRoller - Wednesday, March 28, 2018

    It's not at all clear that HMP is meaningfully better (much faster or much more power efficient) than a proper schedtune + energy model implementation.
    Scheduling is just ridiculously hard. Add the constraints of soft-realtime requirements, minimal battery usage, AND an aSMP design, and you've got the current situation where there's not yet a consensus design. We are, however, starting to see signs of convergence, imho.
  • zeeBomb - Monday, March 26, 2018

    I came...and I finally saw
  • phoenix_rizzen - Monday, March 26, 2018

    Ouch. The Exynos S9 is just barely better than the Exynos S7. :( And that's what Canada's going to get.

    Here's hoping they can improve things via software updates. Was considering the S9 to replace the wife's now-dead S6. She's been using my S7 for the past two months while I limp along with a cracked-screen Note 4. Other than the camera and screen, this isn't looking like much of an upgrade for being two generations newer.

    Maybe we'll give the ZTE, Huawei, and Xiaomi phones another look ...
  • mlauzon76 - Monday, March 26, 2018

    Samsung Exynos 9810 (Europe & Rest of World)

    Canada is the 'rest of [the] world', but we don't get that version; we never get anything with an Exynos processor. We get the following one:

    Qualcomm Snapdragon 845 (US, China, Japan)
