Display

One of my few issues with the Note 2 after long-term use was resolution. The move to a subpixel matrix with a full three subpixels per pixel on the Note 2 honestly laid most of my concerns to rest at launch, but 720p started to feel limiting later in that product's cycle. Having more display area is great; in Android, however, what really matters is resolution. It was ironic to have a phone with a huge display stuck at 720p while devices with much smaller displays quickly eclipsed it. With the Note 3, Samsung moves to 1080p at 5.7 inches, up from 720p at 5.5 inches in the Note 2 and 1280x800 at 5.3 inches in the original Note.

Two questions immediately come up every time we get a Samsung phone with AMOLED: first, what kind of panel, and second, what subpixel unit cell sits behind it all, be it an RGB stripe or some RG,BG alternative. In the case of the Note 3 we unsurprisingly see Samsung use the same unit cell as on the SGS4, an offset pattern with green subpixels on one line and red and blue on another. The square blue subpixel has more area than the circular red and green subpixels to compensate for the difference in luminous efficiency of the material used in each subpixel type. As I've said in the past, this isn't PenTile (although people have started using that brand as a proxy for all RG,BG alternatives) but something else entirely; the ultimate end is still the same, though: two subpixels per unit pixel rather than an RGB stripe.
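To make the density tradeoff concrete, here's a minimal sketch (my own illustration, not any official specification) of per-channel subpixel counts for a conventional RGB stripe versus a two-subpixel-per-pixel RG,BG-style layout, where green is sampled at full resolution and red and blue at half:

```python
def subpixel_counts(w_px, h_px, layout="rgb_stripe"):
    """Per-channel subpixel counts for a w x h logical resolution."""
    pixels = w_px * h_px
    if layout == "rgb_stripe":
        # Three subpixels per logical pixel: full-resolution R, G, and B
        return {"R": pixels, "G": pixels, "B": pixels}
    if layout == "rg_bg":
        # Two subpixels per logical pixel: full-resolution G, half-res R and B
        return {"R": pixels // 2, "G": pixels, "B": pixels // 2}
    raise ValueError(f"unknown layout: {layout}")

# Note 3's 1080p panel: ~6.22M subpixels for a stripe vs. ~4.15M for RG,BG
for layout in ("rgb_stripe", "rg_bg"):
    counts = subpixel_counts(1920, 1080, layout)
    print(layout, counts, "total:", sum(counts.values()))
```

The two-thirds subpixel count is exactly why the RG,BG versus RGB stripe distinction matters: the panel advertises 1080p, but only the green channel is sampled at the full 1080p grid.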


The question for most normal users then becomes: is this a big deal, and can a human being actually see it? I'd argue that the subpixels on the Note 3, like the SGS4's, are now small enough that they can't be resolved. I used to be very picky about this, but I don't find the new offset RG,BG pattern distracting at all on the Note 3. Angular pixel size moves from just above 1 arcminute (1.006 and 1.073 for the Note and Note 2, respectively) down to 0.741 for the Note 3, putting even whole pixels (and thus their subpixels) in theory beyond the roughly 1 arcminute resolution of the human eye. I won't break out the huge table or chart or go over all of that again, but it's nice to see that finally be the case with the Note 3.
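The arcminute figures above fall out of simple trigonometry. Here's a minimal sketch that reproduces them, assuming a 12-inch viewing distance and treating each figure as the angular subtense of one logical pixel (both assumptions are mine, chosen because they reproduce the quoted numbers exactly):

```python
import math

def pixel_arcmin(diag_in, w_px, h_px, view_in=12.0):
    """Angular size of one pixel, in arcminutes, at a given viewing distance."""
    ppi = math.hypot(w_px, h_px) / diag_in       # pixels per inch
    pitch_in = 1.0 / ppi                         # pixel pitch in inches
    theta_rad = 2 * math.atan(pitch_in / (2 * view_in))
    return math.degrees(theta_rad) * 60          # radians -> arcminutes

specs = {"Note": (5.3, 1280, 800), "Note 2": (5.5, 1280, 720), "Note 3": (5.7, 1920, 1080)}
for name, spec in specs.items():
    print(f"{name}: {pixel_arcmin(*spec):.3f} arcmin")
# Note: 1.006 arcmin, Note 2: 1.073 arcmin, Note 3: 0.741 arcmin
```

Note the distance dependence: halve the viewing distance and the angular size doubles, which is why density arguments always hinge on how close you actually hold the phone.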

[Chart: Brightness (White)]

The Note 3 has the same display mode settings we've seen in prior generations; these mDNIe toggles allow some control over display color curves. They're ultimately not a substitute for Android's lack of a real color management system, and they don't completely solve the oversaturation that comes hand in hand with AMOLED's different spectral curves, but they do help somewhat. The modes are unchanged from the SGS4: Adapt Display is checked by default and automatically selects a mode for first party apps and a few others, but you can manually choose between Dynamic, Standard, Professional Photo, and Movie, which have different tunings for white point, gamut, and saturation. There's also still a toggle for automatically adjusting screen tone depending on what's being displayed.

Of the modes and configuration options available, I don't doubt for a second that the most used will be the defaults. If you're after the sanest option from a color accuracy perspective, however, it's still Movie mode with the automatic screen tone toggle unchecked. I gave Samsung the benefit of the doubt and ran all my measurements in Movie mode as a result, but I also took saturation measurements of the other modes so you can see the difference in gamut and saturation under each.


The Standard and Dynamic modes show a ton of oversaturation, extending far beyond sRGB. In Dynamic mode we can also see compression at the higher saturation levels, effectively blowing out those colors even more, with the second to last point almost on top of the last. Professional Photo mode clamps down on gamut and makes saturation a bit more linear, but exhibits some other odd artifacts. With Movie selected, the Note 3's display is considerably more controlled and linear, which makes a dramatic difference in how everything appears during normal use. If you really care about display accuracy, this is the only setting you should be using.
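For readers unfamiliar with how these sweeps are scored, here's a hedged sketch of the underlying idea: step a test color from white toward a fully saturated primary, convert each target to CIELAB, and compute the color difference against what the panel actually displays. The sweep construction and the simple CIE76 metric below are simplifications of mine; the CalMAN charts further down use the stricter Delta-E 2000 formula.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert sRGB components in [0, 1] to CIELAB (D65 white point)."""
    def linearize(c):  # undo the sRGB gamma curve
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = linearize(r), linearize(g), linearize(b)
    # linear sRGB -> CIE XYZ (D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):  # CIELAB transfer function
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    """Plain Euclidean (CIE76) color difference; dE2000 refines this."""
    return math.dist(lab1, lab2)

# Target Lab values for a red sweep: 20% steps from white to saturated red
targets = {s: srgb_to_lab(1.0, 1.0 - s, 1.0 - s) for s in (0.2, 0.4, 0.6, 0.8, 1.0)}
# An oversaturated mode reports measured points that overshoot these targets
# (higher chroma), producing a large delta-E at every step of the sweep.
```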


White point in Movie mode is still bluer than I'd like at an average of just over 7100K, but in the all-important Gretag Macbeth patch test, Delta-E is quite low and puts the Note 3 in iPhone 5, HTC One, and G2 territory. The Movie mode results from the Note 3 are actually nicely controlled. It still isn't perfect, but at least an attempt has been made to give users that option if they don't want garish colors that might look great on a store display but not so great if you care about matching photos you've taken to a display or print later, or web content between desktop and mobile.
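As an aside on what that 7100K figure means: correlated color temperature is typically estimated from the measured CIE 1931 chromaticity (x, y) of the white patch, for instance with McCamy's well-known cubic approximation. The second chromaticity below is an illustrative value of my choosing that lands near the measured average, not a number taken from the review:

```python
def mccamy_cct(x, y):
    """Approximate correlated color temperature (K) from CIE 1931 (x, y),
    using McCamy's cubic formula."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505 K: essentially D65, the sRGB target
print(round(mccamy_cct(0.3040, 0.3190)))  # ~7115 K: a bluer white, near the
                                          # Note 3's Movie mode average
```

Anything above the ~6504K D65 target reads as blue-shifted, which is the mild cast described above.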

[CalMAN Display Performance charts: White Point Average; Grayscale Average dE 2000; Gretag Macbeth Average dE 2000; Saturations Average dE 2000]
Comments

  • Spunjji - Tuesday, October 8, 2013 - link

    Says you.
  • kapg - Wednesday, October 2, 2013 - link

    I really respect AnandTech and consider it the top benchmark for all tech-related coverage.

    That said, I'm not sure about a couple of things and would really appreciate it if someone could shed some light on them to help me understand better:
    - Are browsing benchmarks like SunSpider JavaScript Benchmark 1.0, Google Octane Benchmark v1, Mozilla Kraken Benchmark 1.1, and BrowserMark 2.0 dependent on or affected by screen resolution? If so, shouldn't they be run at the same resolution on different devices to provide an ideal representation of the CPU?
    - Why aren't all the benchmarks AnandTech runs performed with the same set of devices? Some benchmarks use a certain set, and other benchmarks add devices. I can understand that not all devices support the same benchmark tools, but in that case shouldn't we test only on benchmarks common to all devices (or on which all devices can be made to run)? It's pretty confusing for a non-expert like me to compare devices (say, Apple iPhone 5s vs. Nokia 925 vs. Samsung S4).

    I'm sorry if these queries are noobish, as I don't understand the in-depth details of these benchmarks; I hope someone can clarify.

    peace,
    ~kg
  • thunng8 - Wednesday, October 2, 2013 - link

    JavaScript and browser benchmarks are not dependent on screen resolution.
  • Samunosuke - Wednesday, October 2, 2013 - link

    In the PC world, if it were discovered that Sager's GTX 780M consistently benched higher than the equivalent Alienware/MSI/Asus 780M yet performed the same in games, what would the reaction be? I was surprised when the Galaxy S4's benchmark boost was glossed over just because some of Samsung's apps were included. That doesn't make it acceptable. The CPU/GPU is a known factor and should behave the same for all apps regardless of origin or use. Boosting benchmarks is wrong, plain and simple, and all manufacturers who do it should be called out. There are several ways to curb this:
    1. Do what Ars Technica did and circumvent the benchmark boost by renaming the benchmark software (you can keep that build and use it on all devices from here on out, updating when necessary).
    2. Run battery life tests in the boosted state (by renaming the browser/media player/whatever you use for battery life tests to a boosted app).
    It's not fair if some devices get lower battery life due to honest performance, or higher battery life due to reduced performance, while others find a way to inflate their scores and get the best of both worlds.
  • kapg - Wednesday, October 2, 2013 - link

    "2. Run battery life tests in the boosted state (by renaming the browser/media player/whatever you use to run the battery life tests to a boosted app)."

    I do not agree with running battery life tests in the boosted state, as that is not the regular mode in which those apps function, so the results would not be realistic. In my view, battery tests should be run with devices in their standard state, with the same set of apps across all devices and the same activity performed (in a loop if needed).

    peace,
    ~kg
  • Origin64 - Wednesday, October 2, 2013 - link

    Sitting here looking at my SGS (1), all I can think is how little has changed in over three years. Screens got a little bigger, resolutions went up, and so did the prices, but functionality is the same as it ever was. Really disappointing, but I guess I can blame the extremely limited data plans for that. Bandwidth-intensive mobile applications cost a lot to use, so we're not even doing half of what our mobile computing could do.

    The good news is that there's still no incentive to upgrade whatsoever. I can wait a second or two for an app to open, and I can spend that time because I don't have to work long hours to spend $600 a year on a phone. See how that all comes back together?
  • zoob - Wednesday, October 2, 2013 - link

    Am I missing something? I see a paragraph describing the IR port and headphone jack, but I do not see an accompanying photo.
