Camera - Daylight Evaluation Continued

We continue with more HDR-heavy shots, before moving on to indoor scenes.

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

The first scene on this page showcases similar changes for the new iPhone 11: the new HDR is able to extract better detail and tone down the overexposed areas compared to the XS. Also very evident is the stronger saturation in the trees, which more accurately depicts their color.

The telephoto showcases the same Smart HDR changes, as it’s better able to handle the highlights.

A very good showcase by the wide-angle camera in this scene – it’s among the best renditions.

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

I’ve noticed that a lot of phones have issues with this shot in terms of their color balance, as they sometimes tend to veer off too much into the grey. The new iPhone 11 Pro improves on the XS here, as it’s able to better maintain the greens of the leaves.

The telephoto module benefits from better color accuracy, but here it’s extremely evident that it’s a downgrade in terms of detail compared to the XS.

The wide-angle is excellent in terms of exposure, however detail is drastically lacking throughout the whole scene: it’s quite a blurry mess, and very much lagging behind the S10, particularly the Exynos variant.

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

Moving indoors, with still quite good lighting, it’s again hard to tell the iPhone 11 apart from the XS. The 11 is a bit brighter, but other than that they’re pretty much equal. The telephoto shots are also too close to clearly determine which one is better.

The wide-angle is good, but lacks the sharpness showcased by the S10. On both the main and wide-angle cameras, Apple seems to have more limited dynamic range than Samsung here, as evidenced by the blown-out outdoor part of the shot.

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

The iPhone 11’s local HDR tone mapping improves a bit on the XS, however it still doesn’t handle some elements correctly, blowing out the stained glass as well as the orange commercial sign on the left, both of which certainly weren’t that bright.

The telephoto on the 11 is a lot better in this shot, with clearly improved legibility of the signage.

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

In indoor shots, it’s again a wash between the 11 and the XS for the main camera. In some areas the 11 fares better, while on other textures the XS seems sharper. Both phones, however, had issues with color temperature, which is too warm.

Portrait Mode

[Image gallery: iPhone 11 Pro - iPhone XS - iPhone X - Galaxy S10+ (E) - Galaxy S10+ (S)]

For portrait pictures, the big new addition for the iPhone 11 series is that you can now capture with the main camera sensor while the wide-angle serves as the depth sensor. It’s still possible and sometimes maybe preferable to use the telephoto lens for portrait shots.

The problem is that Apple doesn’t seem to have really improved the segmentation algorithm on the new iPhones, and the results can be relatively imperfect. This is particularly visible in the wider-angle shots with the “whiteout” effect, where the results just aren’t very good.

The fun thing about this scene with the swing is that we can see the gradual effect of the bokeh on the ropes – or rather, we can see that it’s not very gradual on the iPhones, as we can clearly delineate where the different levels of bokeh blur are applied. This is also partly visible on the Exynos S10, but the Snapdragon S10 has excellent segmentation as well as a smooth and gradual 3D depth blur.
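
To illustrate why those discrete blur levels are so visible, here is a small, purely hypothetical sketch of depth-driven synthetic bokeh (my own illustration, not any vendor’s actual pipeline): a rope-like texture receding smoothly from the camera is blurred once with a blur radius that follows depth continuously, and once with the depth quantized into a handful of levels.

```python
# Hypothetical sketch of depth-driven synthetic bokeh (not any phone's actual
# pipeline): blur a 1-D "rope" texture that recedes smoothly from the camera,
# once with a blur radius that follows depth continuously and once with the
# depth quantized into a few discrete levels.
import numpy as np

def variable_gaussian_blur(signal: np.ndarray, sigmas: np.ndarray) -> np.ndarray:
    """Blur each sample of a 1-D signal with its own Gaussian sigma."""
    out = np.empty_like(signal, dtype=float)
    positions = np.arange(len(signal))
    for i, sigma in enumerate(sigmas):
        if sigma < 1e-3:                        # effectively in-focus sample
            out[i] = signal[i]
            continue
        kernel = np.exp(-0.5 * ((positions - i) / sigma) ** 2)
        out[i] = (kernel * signal).sum() / kernel.sum()
    return out

depth = np.linspace(0.0, 1.0, 200)                  # rope receding from the camera
texture = np.sin(np.linspace(0, 40 * np.pi, 200))   # fine detail along the rope

smooth_sigma = 6.0 * depth                          # blur radius grows continuously
stepped_sigma = 6.0 * np.floor(depth * 4) / 4       # only 4 discrete blur levels

smooth = variable_gaussian_blur(texture, smooth_sigma)
stepped = variable_gaussian_blur(texture, stepped_sigma)

# Plotting `stepped` against `smooth` makes the banding obvious; numerically,
# the continuous mapping never changes the blur radius abruptly between
# neighbouring samples, while the quantized mapping jumps at every boundary.
print("largest blur-radius jump, continuous depth:", np.abs(np.diff(smooth_sigma)).max())
print("largest blur-radius jump, quantized depth: ", np.abs(np.diff(stepped_sigma)).max())
```

With the continuous mapping the sharpness along the rope degrades gradually; with the quantized mapping the blur strength jumps at each level boundary, which is exactly the kind of delineation visible in the iPhone shots.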

iOS 13.2 Deep Fusion

I had started off the review on iOS 13, including most of the daylight pictures, after which I switched over to iOS 13.1 for most testing. Finally, Apple released a beta of iOS 13.2, and I had to take a look at the new Deep Fusion feature and how it behaves.

[Image gallery: iPhone 11 Pro iOS 13.2 - iPhone 11 Pro iOS 13.1.2 - Galaxy S10+ (E)]

I was rather shocked to see the difference in detail that the new Deep Fusion feature can make, and you definitely don’t even have to view the pictures at full resolution to notice a difference in sharpness as well as increased detail.

Essentially, Deep Fusion should work similarly to Google’s super resolution zoom technology, except that Apple is using it to increase the amount of detail captured at the full frame resolution. With the feature enabled, the camera is able to bring out finer textures in textiles or rougher materials with fine-grained details that would otherwise be blurred out by the camera.
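
As a rough mental model of what such multi-frame fusion buys you, here is a minimal numpy sketch (my own illustration, not Apple’s actual Deep Fusion pipeline) that assumes a burst of already-aligned frames: merging them with an outlier-rejecting weighted average suppresses noise, which in turn is what lets the processing preserve fine texture instead of smoothing it away.

```python
# Minimal, illustrative sketch of multi-frame fusion (NOT Apple's actual
# Deep Fusion pipeline): given a burst of already-aligned noisy frames,
# merge them with an outlier-rejecting average so that noise drops and
# fine texture that a single denoised frame would smear out can survive.
import numpy as np

def fuse_burst(frames: np.ndarray, sigma_reject: float = 2.0) -> np.ndarray:
    """frames: (N, H, W) float array of aligned grayscale captures."""
    reference = np.median(frames, axis=0)        # robust base estimate
    deviation = np.abs(frames - reference)
    noise = deviation.std()                      # crude global noise estimate
    # Down-weight pixels that deviate too much (e.g. residual motion/ghosting)
    weights = np.where(deviation < sigma_reject * noise, 1.0, 0.1)
    return (weights * frames).sum(axis=0) / weights.sum(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.random((64, 64))                             # stand-in "scene"
    burst = clean + rng.normal(0, 0.1, size=(9, 64, 64))     # 9 noisy captures
    merged = fuse_burst(burst)
    print("single-frame RMSE:", np.sqrt(((burst[0] - clean) ** 2).mean()))
    print("fused-frame RMSE: ", np.sqrt(((merged - clean) ** 2).mean()))
```

The real implementations add sub-pixel alignment, motion rejection and, per Apple’s own description, neural-engine-driven per-pixel processing on top of this, but the basic payoff is the same: more usable detail at the sensor’s native resolution.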

I tried a few shots outdoors, however, as Apple mentioned, the feature doesn’t seem to work in conjunction with Smart HDR, and the last comparison shot doesn’t really show any major difference in detail between iOS 13.1 and 13.2.

Daylight Camera Capture Conclusion – Wait for Deep Fusion retake?

The main selling point of the new iPhones was the addition of the ultra-wide-angle camera module. Indeed, this opens up a totally new capture experience for users, and I do think it makes a lot of sense to retain this module on the regular iPhone 11 rather than having a telephoto module. The wide-angle camera had been pioneered by LG a few generations ago, but last year it was Huawei which brought it to the mainstream. Now in 2019 it’s a must-have for every vendor, and it would have been shocking if Apple hadn’t adopted it.

Quality-wise, Apple's wide-angle module performs adequately, and it’s definitely one of the better modules out there. Still, there have been many shots where the pictures ended up notably less sharp than on the Galaxy S10 or Huawei’s phones. HDR has also been a bit better for the competition in some scenarios.

On the main camera, improvements for this generation were relatively muted when it comes to the daylight results. There just isn’t very much difference to the XS. We do note that the color temperature is slightly improved, saturation is sometimes more accurately captured, and HDR is now able to handle highlights better. Still, I had expected a bit more – sometimes the competition is able to showcase better dynamic range and thus capture more of a scene. The level of detail between the iPhone 11 series and the XS is essentially identical.

The telephoto module changes on the 11 Pro are a bit odd. A lot of the scenes showcased the new phone as producing noisier shots or just having less detail. The optics of the module have changed, moving from an f/2.4 aperture to an f/2.0, so I do wonder if that’s the reason for the discrepancy. Sometimes the new module wins out, but other times there isn’t any improvement, or there are even slight regressions. It’s not a deal-breaker or a problem at all, but it’s still odd to see this development from Apple.

Portrait mode on the main sensor is a new addition to the camera experience, but the issue is that Apple really hasn’t improved its segmentation and depth sensing capabilities. Qualcomm’s ISP here looks to be superior as it’s able to produce better bokeh effects.

Finally, Deep Fusion could very well be a game-changer for the camera. I was extremely surprised by the increased quality in sharpness and detail that the new mode brings. I didn’t have sufficient time to properly evaluate it in a wider range of scenarios and against more phones, but it could very well be one of the features that puts the iPhone 11 series ahead of other phones. It’s something we definitely have to revisit in the upcoming Pixel 4 and Mate 30 Pro reviews as we redo the whole camera comparison with iOS 13.2.

Comments

  • Jon Tseng - Wednesday, October 16, 2019 - link

    Nice! Any additional thoughts on the U1 UWB chip? I guess not much you can do with it yet, but to me the possibilities are intriguing...
  • Andrei Frumusanu - Wednesday, October 16, 2019 - link

    I think Apple has more plans with it in the future, but yes right now it doesn't do very much.
  • tipoo - Wednesday, October 16, 2019 - link

    Definitely think it's getting the hardware ready for the AR glasses. Hyper precise location tracking just by putting your phone down on a desk and having the U1 chips communicate.
  • Diogene7 - Wednesday, October 16, 2019 - link

    I am dreaming that the Apple U1 UWB chip could be used in the not too distant future (2020 / 2021) for precise spatial localization for (short) distance wireless charging: by knowing where exactly in space an Apple device is, Apple might be able to dynamically and efficiently focus wireless energy transfer, maybe through wireless resonant charging (AirFuel) for an iPhone, or through RF charging like Energous / Ossia for recharging Apple AirPods from an iPhone...

    I think I am dreaming, but I just hope that Apple is working hard to make wireless power at a short distance a reality: I would love to be able to drop my iPhone anywhere on my bedside table and have it automatically recharge during the night from a base station up to a distance of 1.5 feet / 50 cm: it would bring sooooo much more convenience than Qi wireless charging...
  • patel21 - Wednesday, October 16, 2019 - link

    Man, you are lazy.
  • Diogene7 - Wednesday, October 16, 2019 - link

    @patel21: How often do you still plug an Ethernet cable into your laptop to surf the internet instead of using WiFi? WiFi is simply more convenient...

    Similarly, wireless charging at a distance (up to ~1.5 feet / 50 cm) would be so much more convenient than having to plug in a cable to recharge a device.

    It's also true for Internet of Things (IoT) devices: there seem to be some studies showing that consumers stop using many IoT devices that run on batteries because they have to change the batteries.

    I strongly believe that wireless charging at a short distance is a requirement for the sale of IoT sensors to really take off, because keeping 10s or 100s or more of battery-powered IoT devices going is not really manageable for consumers in the long run...
  • Molbork - Tuesday, June 16, 2020 - link

    And you just halved the power efficiency of your laptop and devices. EM transmission power falls off as 1/r^2, so charging your laptop could cost you 2-3x more than a direct connection at longer distances.
  • Henk Poley - Friday, October 18, 2019 - link

    I wonder if they'll do things like heart- and breathing-rate measurement, and counting of people around you (how many hearts). Such as was demonstrated for radar based baby monitoring. Fairly low power, a 'cigarette pack' size device attached to a baby cot could work for half a year by only periodically measuring.

    Could be interesting for meetings, that your phone knows everyone has arrived, people were agitated, etc.
  • Adonisds - Wednesday, October 16, 2019 - link

    Why does it require less than double the power to produce twice the display brightness?
  • michael2k - Wednesday, October 16, 2019 - link

    Displays aren't actually perfectly transparent, and the light generating devices might absorb some of the energy instead of transmitting it.

    Increasing transparency is one way to produce more brightness with less energy.
    Reducing the amount of energy absorbed by the LEDs (and thus transformed into heat) is another way to produce more brightness with less energy.
    Changing the LEDs' basic chemistry to more efficiently transform electricity into light is a third way.

    Fundamentally less waste heat, more light.
