Display

As always, the display of any mobile device is a critical part of the overall user experience. A display that falls short in any way will often sour the entire experience.

On a personal note, a number of the mobile devices I’ve used over the past year frankly just didn’t have a display good enough for me to use them as a daily driver. My laptop is quite closely calibrated to sRGB and is used to edit all of my device photos, so I’ve really come to appreciate a device with color accurate enough that I can actually use a phone or tablet as a reference monitor of sorts to verify that images look the way I want them to.

In order to test this critical portion of the user experience, we turn to our standard test suite, which uses SpectraCal’s CalMAN 5 with a custom workflow for testing basic metrics like brightness, contrast, and calibration accuracy, along with X-Rite’s i1Pro2 and i1DisplayPro.

Starting off with a look at the Galaxy S7's display under the microscope, it appears that Samsung has elected to keep most aspects of the display constant between the Galaxy S6 and S7. At a high level, the panel is the same 5.1” size that we’ve seen for a few generations now, and the 1440p resolution is shared with previous devices. Samsung continues to use their diamond PenTile layout, but it’s hard for me to say whether there’s been an adjustment to the size of the emitters, as the microscope I have on hand isn’t quite sufficient for making such measurements. It’s likely that under the hood there are changes to the display driver IC to enable features like Always-On Display, but as we’ll soon see, it’s rather unlikely that there are any generational changes in things like the emitter material or TFT backplane.
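For reference, the density figures work out as follows. This is a quick back-of-the-envelope sketch of the pixel and subpixel math for a 1440p, 5.1” diamond PenTile panel (green subpixels at full pixel density, red and blue at half); the numbers are simple arithmetic, not measurements of the actual emitters.

```python
import math

# Back-of-the-envelope density math for the Galaxy S7 panel:
# 2560x1440 pixels on a 5.1" diagonal.
width_px, height_px, diagonal_in = 2560, 1440, 5.1

diagonal_px = math.sqrt(width_px**2 + height_px**2)
ppi = diagonal_px / diagonal_in
print(f"Logical pixel density: {ppi:.0f} PPI")  # ~576 PPI

# Diamond PenTile: one green subpixel per logical pixel, while red and
# blue are shared between neighboring pixels, so they render at half
# density. That works out to two subpixels per pixel on average.
pixels = width_px * height_px
subpixels = pixels * 2  # G at full density, R and B at half each
print(f"Subpixels: {subpixels / 1e6:.1f}M vs {pixels * 3 / 1e6:.1f}M for an RGB stripe")
```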

Display - Max Brightness

One of our first tests here is a pretty standard test of maximum luminance. Here, we see that the Galaxy S7 and S7 edge are both in the same general ballpark as the Galaxy Note5, which suggests that all of these devices use the same generation of AMOLED panel. This brightness was achieved using the auto-brightness mode, so it’s important to note that the max luminance in manual mode will be much lower. Of course, this brightness figure was determined with a full white display, so reducing the APL will result in a higher maximum luminance, as the power budget can be spent on fewer pixels, which means that a higher duty cycle can be achieved in each pixel.
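To illustrate that power budget tradeoff, here's a minimal sketch of how peak luminance scales with APL (average picture level). The budget and per-pixel ceiling figures are hypothetical values chosen purely for illustration, not measured characteristics of the S7's panel or its brightness controller.

```python
# Illustrative model of AMOLED peak luminance vs. APL. A fixed panel
# power budget spread over fewer lit pixels lets each pixel run a
# higher duty cycle, up to a per-pixel ceiling. Both constants below
# are made-up numbers chosen only to show the shape of the curve.

PANEL_BUDGET_NITS_AT_FULL_WHITE = 450   # hypothetical full-screen cap
PER_PIXEL_CEILING_NITS = 800            # hypothetical emitter/driver limit

def peak_luminance(apl: float) -> float:
    """Peak luminance of the lit region at a given APL (0..1)."""
    if apl <= 0:
        return PER_PIXEL_CEILING_NITS
    # Power budget spent on fewer pixels -> proportionally brighter,
    # until the per-pixel limit kicks in.
    return min(PANEL_BUDGET_NITS_AT_FULL_WHITE / apl, PER_PIXEL_CEILING_NITS)

for apl in (1.0, 0.75, 0.5, 0.25):
    print(f"APL {apl:.0%}: ~{peak_luminance(apl):.0f} nits")
```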


[CalMAN charts: Galaxy S7 / Galaxy S7 edge]

Display - Grayscale Accuracy

Display - White Point

The next part of our testing is grayscale. As always, we target the industry standard of a 2.2 power gamma with a 6504K white point. Relative to the Galaxy S6 and Note5, we see a pretty significant improvement in white point accuracy, as it’s consistently quite close to a neutral white rather than a warmer color balance. Unfortunately, in both review units I received, the display has a noticeable green tint for many shades of grey, which seems to be somewhat of a perpetual problem with Samsung AMOLED displays. This really does affect quite a bit of the UI, as Material Design greys take on this noticeable green tint that really makes things look off.

The same issue does not seem to be present on the Galaxy S7 edge, which leads to a significant improvement in overall calibration quality for this portion of the testing. However, both devices have a noticeably lower gamma than expected, which does have some effect on accuracy, but for the most part can serve as a compensation mechanism for reflectance when dealing with ambient light. The green tint issue likely varies on a device-to-device basis, but to see that such issues haven’t been resolved for years is somewhat concerning given that phones costing hundreds of dollars less don’t seem to have the same problems.
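For context on the gamma figures, display gamma is estimated from the measured luminance at each gray step: a normalized gray level g should produce luminance L = L_white × g^2.2 if the display tracks the 2.2 target. Below is a minimal sketch of that calculation with made-up readings (which, in this example, come out below 2.2, the kind of result described above):

```python
import math

# Estimate effective gamma from grayscale measurements. A display
# tracking the 2.2 target should satisfy L = L_white * g**2.2 for a
# normalized gray level g. The readings below are made-up examples,
# not values measured from the S7.
measurements = {  # gray level (0-1) -> luminance in nits
    0.25: 30.0,
    0.50: 110.0,
    0.75: 245.0,
    1.00: 430.0,
}

l_white = measurements[1.00]
for g, lum in sorted(measurements.items()):
    if g < 1.0:
        gamma = math.log(lum / l_white) / math.log(g)
        print(f"gray {g:.2f}: effective gamma {gamma:.2f}")  # ~1.9-2.0 here
```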


[CalMAN charts: Galaxy S7 / Galaxy S7 edge]

Display - Saturation Accuracy

The next portion of our testing is the standard saturation sweep test. Here, the Galaxy S7 and S7 edge are basically perfect. It’s great to see that Samsung continues to provide their Basic color mode with a real focus on accurate color calibration for those who care about these things. Getting to the right color calibration is also about as painless as it can be, compared to some other devices where things like saturation curves, white balance, and other parts of a display calibration can only be adjusted using unitless sliders that basically require a spectrophotometer to actually use.
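For those curious how the sweep is scored, each primary and secondary color is stepped up in saturation and the measured color is compared against its sRGB target with a delta-E metric. The sketch below uses the simple CIE76 delta-E (CalMAN typically reports more elaborate metrics such as dE2000) with illustrative values; the exact definition of the intermediate saturation steps is also more involved than a plain RGB scale.

```python
import math

# Compare a measured color against its sRGB target using CIE76
# delta-E. All values here are illustrative, not S7 measurements.

def srgb_to_linear(c: float) -> float:
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz(r: float, g: float, b: float):
    r, g, b = (srgb_to_linear(v) for v in (r, g, b))
    return (0.4124 * r + 0.3576 * g + 0.1805 * b,
            0.2126 * r + 0.7152 * g + 0.0722 * b,
            0.0193 * r + 0.1192 * g + 0.9505 * b)

def xyz_to_lab(x: float, y: float, z: float):
    white = (0.9505, 1.0, 1.0890)  # D65 reference white
    def f(t: float) -> float:
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2) -> float:
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Fully saturated red target vs. a slightly off "measured" reading.
target = xyz_to_lab(*srgb_to_xyz(1.0, 0.0, 0.0))
measured = xyz_to_lab(*srgb_to_xyz(0.98, 0.04, 0.02))
print(f"dE76 = {delta_e76(target, measured):.2f}")
```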


[CalMAN charts: Galaxy S7 / Galaxy S7 edge]

Display - GMB Accuracy

In our GretagMacbeth ColorChecker test, we see that there are some issues with grayscale accuracy, but overall color accuracy remains quite good. In terms of overall display quality, I don’t really think there’s any meaningful improvement over the Galaxy S6, but that’s mostly because the Galaxy S6 set a ridiculously high bar for display quality.

However, I don’t believe that Samsung has run out of things to improve for future AMOLD displays. In addition to the grayscale problems mentioned earlier, Samsung clearly has not resolved issues with color shifting that occurs with viewing angle changes. LCDs definitely have more luminance degradation as you move away from the normal of the display plane, but at almost every angle change I can see whites get noticeably colder and interference patterns, in addition to a general color shift that is noticeably more than most LCDs used in high end smartphones and tablets. It’s obvious that this is a hard problem to solve due to uneven subpixel aging, but for things like tablets, laptops, and desktops color shifting is going to be a much more significant issue.

Comments

  • ah06 - Tuesday, March 8, 2016 - link

    Samsung's core is already done I think; it's the Mongoose core in the 8890, the one in the international variants. It's a slightly weaker core than Kryo.
  • Speedfriend - Tuesday, March 8, 2016 - link

    A7 to A8 tock was 15%, and that was partly higher clock speed in a bigger body. Will be interesting to see what this tock brings.
  • lilmoe - Tuesday, March 8, 2016 - link

    Unlike what most reviewers want to believe, when designing application processor cores, companies like ARM, Qualcomm and Samsung aim for a "sweet spot" of load-to-efficiency ratios, not MAX single threaded performance.

    Their benchmark is common Android workloads (which btw rarely saturate a Cortex A57 at 1.8GHz), since that's what makes up the vast majority of the mobile application processor market. They measure the average/mean workload needs and optimize efficiency for that.

    Android isn't as efficient as iOS and Windows Phone/10 Mobile in hardware acceleration and GPU compositing; it's much more CPU bound. It doesn't benefit as much from race to sleep in mobile devices. CPU cores remain significantly more active when rendering various aspects of the UI and scrolling.
  • tuxRoller - Tuesday, March 8, 2016 - link

    Can you explain how you measure the relative "efficiencies" of the "hardware acceleration and GPU compositing"?
  • lilmoe - Wednesday, March 9, 2016 - link

    By measuring CPU and RAM utilization when performing said tasks. More efficient implementations would offload more of the work to dedicated co-processors (in this case, the GPU) and would use less RAM.

    Generally, the more CPU utilization you need for these tasks, the less efficient the implementation. Android uses more CPU power and more RAM for basic UI rendering than iOS and WP/10M.
  • tuxRoller - Saturday, March 12, 2016 - link

    How do you measure this so that you can ignore differences in the system (like textures chosen)? Then you'd have to make sure they're running on the same hardware.
    The best you can do is probably test Android and Windows on the same phone (this will put Windows at a bit of a disadvantage, since Android allows very close coupling of drivers, as their HAL is pretty permissive). Then you run a native game on each.
    If you've found a way to do this I, and Google, would love to see the results.
    Other than for 2d (which NOBODY, including directdraw/2d or quartz, fully accelerates), Google really hammers the GPU through use of shared memory, overlays, and whatever else may be of use. There's obviously more optimization for them to do as they still overdraw WAY too much in certain apps, and they've obviously got a serious issue with their input latency, but it's a modern system. Probably the most modern, as it's been developed from scratch most recently.
  • Dobson123 - Tuesday, March 8, 2016 - link

    In the 2016 web browsing battery life test, the S6 Edge is 20% worse than the S6, and the LG G4's number is also way too low.
  • lilmoe - Tuesday, March 8, 2016 - link

    I also thought the difference in battery life between the S6 and S6 Edge was off. They either posted the wrong data, or something went wrong while testing.
  • MonkeyPaw - Tuesday, March 8, 2016 - link

    I'd agree. When one of the phones goes from being upper middle of the pack on the old benchmark to being dead last--and woefully so--then I would have to wonder if something is really wrong with the new test. I've used the G4 for 6 months and have rarely had battery concerns over a day of "regular" use. I've owned several phones, and the G4 is a trooper.
  • Ryan Smith - Tuesday, March 8, 2016 - link

    We're re-checking the S6 Edge. We've had issues before with that specific phone.
