Display Measurement

When it comes to displays, last year's iPhone XS didn't showcase any major changes compared to the original iPhone X, as the two phones seemingly shared the same display panel. In contrast, for the new iPhone 11 Pros Apple is advertising a newer-generation panel which brings notable improvements with it.

In terms of dimensions or resolution, there are no visible changes on the new panels, and you'd have to look under the hood to see what has actually changed. The most notable improvement this year is a switch in the OLED emitter material used by Samsung in producing the new screen. The new-generation emitter was first introduced in the display panel of the Galaxy S10, and to my knowledge it has subsequently only been used in the Note10 series as well as the new OnePlus 7T (regular version only). The iPhone 11 Pro phones now join this limited group of devices, and the biggest improvements to the user experience will be higher maximum brightness levels as well as improved power efficiency.

The regular iPhone 11, on the other hand, does not seem to have changed much from the iPhone XR. It remains a relatively low-resolution LCD screen, although its display characteristics remain excellent.

We move on to the display calibration and fundamental display measurements of the iPhone 11 screens. As always, we thank X-Rite and SpectraCal, as our measurements are performed with an X-Rite i1Pro 2 spectrophotometer, with the exception of black levels which are measured with an i1Display Pro colorimeter. Data is collected and examined using SpectraCal's CalMAN software.

Display Measurement - Maximum Brightness 

In terms of maximum brightness, Apple advertises that the new iPhone 11 Pros can reach up to 800 nits when displaying regular content. We're able to verify this, as our 11 Pro Max sample reached 807 nits while the 11 Pro reached 790 nits. Consequently, it's quite odd to see that the LCD-based iPhone 11 is now the lowest-brightness device in the line-up. As always, Apple doesn't make use of any brightness boost mechanism and thus allows its peak brightness to be achieved in any scenario.

Apple also advertises that the screen goes up to 1200 nits peak brightness in HDR content; however, I haven't been able to verify this in our current test suite.

 
iPhone 11, iPhone 11 Pro, iPhone 11 Pro Max / SpectraCal CalMAN

In the greyscale tests, all the iPhones perform extremely well, as expected. The Pro models do showcase a tendency towards slightly too strong red levels, so their color temperature is ever so slightly too warm. This characteristic diminishes the higher in brightness we go on the Pro models. The iPhone 11 has a weakness in the greens, so its color temperature is slightly above the 6500K white point target.
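For readers curious how a "too warm" or "too cold" verdict is quantified: the measured white point's chromaticity can be converted into a correlated color temperature figure. A minimal sketch using McCamy's well-known approximation follows; the second chromaticity point is purely illustrative and not one of our actual measurements.

```python
# McCamy's approximation: derives correlated color temperature (CCT)
# from CIE 1931 xy chromaticity coordinates. It's accurate to within a
# few kelvin for near-white points such as display white measurements.
def mccamy_cct(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) lands at ~6504K, which is
# the 6500K target the iPhones are calibrated against.
print(f"D65 CCT      ≈ {mccamy_cct(0.3127, 0.3290):.0f} K")

# A slightly red-shifted white, as on the Pro models, yields a lower
# (warmer) CCT -- illustrative coordinates only:
print(f"warmer white ≈ {mccamy_cct(0.3170, 0.3290):.0f} K")
```

A reading below 6500K reads as warm (reddish), above as cold (bluish), which is exactly the distinction drawn between the Pro models and the iPhone 11 here.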

Gamma levels are excellent, targeting 2.2. The Pro models veer off towards higher gamma at higher picture levels, something that isn't as prominently exhibited by the iPhone 11. I'm not sure if this is due to non-linear APL compensation by the phone screen during our measurement patterns, or if there's an actual issue with the calibration.
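The gamma figure at each grey step falls out directly from the luminance measurements: the ratio of a step's luminance to full white should equal the input signal raised to the display gamma. A minimal sketch of that relationship, with made-up luminance readings rather than our actual data:

```python
import math

# Hypothetical measured luminance values (cd/m^2) at a few grey input
# levels -- illustrative only, not measurements from this review.
measurements = {
    0.25: 12.0,   # 25% grey signal -> measured luminance
    0.50: 55.0,
    0.75: 140.0,
    1.00: 260.0,  # full white
}

white = measurements[1.00]

for signal, lum in sorted(measurements.items()):
    if signal == 1.00:
        continue  # full white is the reference, gamma is undefined there
    # Display transfer: Y / Y_white = signal ** gamma, solve for gamma.
    gamma = math.log(lum / white) / math.log(signal)
    print(f"signal {signal:.2f}: gamma ≈ {gamma:.2f}")
```

With these example values every step lands near the 2.2 target; a panel "veering off towards higher gamma" would show the computed exponent creeping upwards at the brighter steps.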


iPhone 11 / SpectraCal CalMAN
iPhone 11 Pro / SpectraCal CalMAN
iPhone 11 Pro Max / SpectraCal CalMAN

Display Measurement - Greyscale Accuracy

The dE2000 deviation scores for the Pro models this year are slightly worse than what we saw in last year's XS devices; however, they're still firmly among the best-in-class devices out there in the market, and you'd be hard-pressed to perceive the small deviations. The iPhone 11, oddly enough, fares a bit worse than the iPhone XR due to larger deviations in color balance.
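CalMAN reports the elaborate dE2000 metric; as a simplified stand-in to illustrate what a colour-difference score means, here's the much older CIE76 formula, which is just Euclidean distance in CIELAB. The measured white point below is a made-up example, not one of our readings.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space.
    (dE2000, as reported by CalMAN, adds perceptual weighting on top of
    this basic idea; CIE76 is shown here only for illustration.)"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical example: a measured white that's slightly too warm
# versus the reference white (L*a*b* = 100, 0, 0).
reference_white = (100.0, 0.0, 0.0)
measured_white  = (99.2, 1.1, -0.6)   # illustrative values only

print(f"dE76 ≈ {delta_e_76(reference_white, measured_white):.2f}")
```

As a rule of thumb, deviations below roughly 1 are imperceptible and values up to about 3 are hard to spot in normal content, which is why the small deviations noted above are so difficult to perceive in practice.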


iPhone 11 / SpectraCal CalMAN

In the sRGB color space (default device content), the iPhone 11 performs extremely well with only minor shifts in hue in the greens.


iPhone 11 Pro / SpectraCal CalMAN

iPhone 11 Pro Max / SpectraCal CalMAN

In the same test, both the Pro models are showcasing exemplary accuracy.

Display Measurement - Saturation Accuracy - sRGB dE2000

The Pro models are just a bit worse off than last year's XS models, but again these are among the most accurate displays you'll find out there, mobile devices or not. The iPhone 11 is still excellent, although it shows a slightly larger deviation compared to the XR.


iPhone 11 / SpectraCal CalMAN


iPhone 11 Pro / SpectraCal CalMAN


iPhone 11 Pro Max / SpectraCal CalMAN

Display Measurement - Saturation Accuracy - Display-P3

For Display P3 content, the iPhone 11 Pro models showcase the best saturation accuracies we’ve ever measured on any display. This time around, the iPhone 11 is in line with the XR.


iPhone 11 / SpectraCal CalMAN

In the Gretag-Macbeth test of common tones, the only real issue for the iPhone 11 is in the whites, which showcase the aforementioned weakness in the greens. Notice how the luminosity of the tones is essentially perfect.


iPhone 11 Pro / SpectraCal CalMAN


iPhone 11 Pro Max / SpectraCal CalMAN

Display Measurement - Gretag–Macbeth Colour Accuracy

Overall, in terms of color calibration and screen quality, the iPhones are the very best in the industry. There's really nothing negative I can say about them, as they're class-leading in every regard.

The iPhone 11's LCD screen isn't to my taste due to the lower resolution, which frankly does bother me, and it certainly doesn't have the same contrast characteristics as the Pro models. So while colors are still extremely good, it remains a compromise in 2019, when essentially every manufacturer has moved on to OLED screens.

Display Power Measurements - Generational Improvements

Naturally, we didn’t want to finish the display evaluation section without verifying Apple’s claims about the new improved power efficiency of the iPhone 11 Pro panels.

Comparing the three generations of identical-format iPhones, we again see that display power consumption between the original iPhone X and the XS didn't differ much at all. Plotting the new iPhone 11 Pro in the chart, however, we immediately see the difference in the new generation.

At equal brightness levels, Apple has indeed been able to improve the power efficiency of the panel by 15%, just as Apple's marketing described. We also see how the new panel expands past the brightness limits of the X and XS, reaching 800 nits. This does come at a cost, however, as the improved power efficiency isn't able to completely make up for the larger brightness increase, so the maximum power consumption of the screen displaying full white rises from 2.6W to 3.1W.
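A quick back-of-the-envelope check makes the trade-off concrete. The power figures come from the review; the XS peak brightness of roughly 650 nits is an assumption for illustration (it isn't stated in this section), and per-nit scaling is a simplification since real OLED power depends on content.

```python
# Full-white screen power and peak brightness, per the text above.
xs_full_white_w  = 2.6    # iPhone XS, from the review
pro_full_white_w = 3.1    # iPhone 11 Pro, from the review
xs_peak_nits     = 650    # assumed XS peak brightness (illustrative)
pro_peak_nits    = 800    # iPhone 11 Pro, from the review

power_increase      = pro_full_white_w / xs_full_white_w - 1   # ~19%
brightness_increase = pro_peak_nits / xs_peak_nits - 1         # ~23%

print(f"power up {power_increase:.0%}, brightness up {brightness_increase:.0%}")
```

Brightness rises faster than power draw, so the per-nit cost still improves at peak, but the absolute maximum power nonetheless climbs, which is exactly the cost described above.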

242 Comments

  • Irish910 - Friday, October 18, 2019 - link

    Why so salty? If you hate Apple so much why are you here reading this article? Sounds like you’re insecure with your android phone which basically gets mopped up by the new iPhones in every area where it counts. Shoo shoo now.
  • shompa - Thursday, October 17, 2019 - link

    Desktop performance. Do you understand the difference between CPU performance and App performance? X86 has never had the fastest CPUs. They had Windows and were good enough / cheaper than RISC stuff. The reason why for example Adobe is "faster" in X86 is that Intel adds more and more specific instructions AVX/AVX512 to halt competition. Adobe/MSFT are lazy companies and don't recompile stuff for other architectures.
    For example when DVD encoding was invented in 2001 by Pioneer/Apple DVD-R. I bought a 10K PC with the fastest CPU there was. Graphics, SCSI disks and so on. Doing a MPEG 2 encoding took 15 hours. My first mac was a 667mhz PowerBook. The same encoding took 90 minutes. No. G4 was not 10 times faster, it was ALTIVEC that intel introduced as AVX when Apple switched to Intel. X86 doesn't even have real 64bit and therefore the 32bit parts in the CPU can't be removed. X86 is the only computer system where 64bit code runs slower than 32bit (about 3%). All other real 64bit systems gained 30-50% in speed. And it's not about memory like PC clickers believe. Intel/ARM and others had 38bit memory addressing. That is 64gig / with a 4gig limit per app. Still, today: how many apps use more than 4gig memory? RISC went 64bit in 1990. Sun went 64bit / with 64bit OS in 1997. Apple went 64bit in 2002. Windows went 64bit after Playstation4/XboxOne started to release 64bit games.

    By controlling the OS and hardware companies can optimize OS and software. That is why Apple/Google and MSFT are starting to use own SoCs. And its better for customers. There are no reason a better X86 chip cost 400 dollars + motherboard tax 100 dollars. Intel 4 core CPUs 14nm cost less than 6 dollars to produce. The problem is customers: they are prepared to pay more for IntelInside and its based on the wrong notion "its faster". The faster MSFT moves to ARM / RISCV. The better. And if the rumors are right, Samsung is moving to RISCV. That would shake up the mobile market.
  • Quantumz0d - Thursday, October 17, 2019 - link

    Samsung just killed Texas team funding. And you don't want to pay for a socketed board and industry standard but rather have a Surface which runs on an off the shelf processor and has small workload target in a PC ?

    Also dude from where you are pulling this $6 of Intel CPUs and I presume you already know how the R&D works right in Lithography ? ROI pays off once the momentum has begun. So you are frustrated of 4C8T Intel monopoly and want some magical unicorn out of thin air which is as fast as that and is cheap and is portable a.k.a Soldered. Intel stagnated because of no competition. Now AMD came with better pricing and more bang for buck.

    Next from Bigroom Mainframes to pocket PC (unfortunate with iOS its not because of no Filesystem and Google following same path of Scoped Storage) microsoft put computers in homes and now they recently started moving away into SaaS and DaaS bs. And now with thin client dream of yours It'll be detrimental to the Computer HW owners or who want to own.

    We do not want proprietary walled gardens with Orwellian drama like iOS. We need more Linux and more powerful and robust OS like Windows which handles customization despite getting sandbagged by M$ on removing control panel slowly and migrating away from Win32. Nobody wants that.

    https://www.computerworld.com/article/3444606/with...
  • jv007 - Wednesday, October 16, 2019 - link

    The Lightning big cores are not very impressive this time.
    From 4 Watt to 5 Watt a 25% increase in power for 17% more performance.
    Good for benchmarks (and the phone was actively cooled here), but not good for throttling.
    7nm and no EUV, maybe next year with 5nm and EUV will improve seriously.
    I wonder if we will see a A13X.
  • name99 - Wednesday, October 16, 2019 - link

    "The Lightning big cores are not very impressive this time"

    A PHONE core that matches the best Intel has to offer is "not impressive"?
    OK then...
  • Total Meltdowner - Thursday, October 17, 2019 - link

    Comparing this CPU to intel is silly. They run completely different instructions.
  • Quantumz0d - Sunday, October 20, 2019 - link

    It has been overblown. The Spec score is all the A series chips have. They can't replace x86 chips even Apple uses x86 cores with Linux RHEL or Free OS Linux distribution to run their services. Whole world runs on the same ISA. These people just whiteknight it like a breakthrough while the whole iOS lacks basic Filesystem access and the latest Catalina cannot run non notarized apps.

    Also to note the Apple First party premium optimization, that Apple pays for companies like Adobe. If you run MacOS / Trashbook Pro BGA / iOS on any non optimized SW it will be held back both on power consumption and all. It's just a glorified Nix OS and with the first party support it keeps floating. They missed out on the mass scale deployment like Windows or Linux and that's going to be their Achilles heel along with the more transformation of MacOS into iOS rather than opposite.

    It's really funny when you look how 60% of the performance is max that one can get from MacOS based HW/Intel machines due to severe thinning on chassis for that sweet BGA appeal and non user serviceable HW while claiming all recycled parts and all. I'm glad that Apple can't escape Physics. VRM throttling, low quality BGA engineering with cTDP garbage etc. Also people just blatantly forget how the DRAM of those x86 processors scales with more than 4000MHz of DDR4 and the PCIe lanes it pushes out with massive I/O while the anemic trash on Apple Macs is a USB C with Dongle world, ARM replicating the same esp the Wide A series with all the Uncore and PCIe I/O support ? Nope. It's not going to happen. Apple needs to invest Billions again and they are very conservative when it comes to this massive scale.

    Finally to note, ARM cannot replace x86. Period. The HPC/ DC market of the Chipzilla Intel and AMD, they won't allow for this BS, Also the ISA of x86 is so mature plus how the LGA and other sockets happen along. While ARM is stuck with BGA BS and thus they can never replace these in the Consumer market.

    Let the fanboys live in their dream utopia.
  • tipoo - Thursday, October 17, 2019 - link

    Being that the little cores are more efficient, and the battery is significantly larger, maybe they allowed a one time regression in peak performance per watt to gain that extra performance, without a node shrink this year.
  • zeeBomb - Wednesday, October 16, 2019 - link

    the time has come.
  • joms_us - Wednesday, October 16, 2019 - link

    Show us that A13 can beat even the first gen Ryzen or Intel Skylake , run PCMark, Cinebench or any modern games otherwise this nonsense desktop level claim should go to the bin. You are using a primitive Spec app to demonstrate the IPC?

    I can't wait for Apple to ditch the Intel processor inside their MBP and replace it with this SoC. Oh wait no, it won't happen in a decade because this cannot run a full-fledged OS with real multi-tasking. =D
