Camera - Daylight Evaluation Continued

We continue with more HDR-heavy shots before moving on to indoor scenes.

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

The first scene on this page showcases similar changes for the new iPhone 11: the new HDR is able to extract better detail and tone down the overexposed areas compared to the XS. Also very evident is the increased saturation in the trees, which more accurately depicts their color.

The telephoto showcases the same Smart HDR changes, as it’s able to better handle the highlights.

A very good showcase by the wide-angle camera in this scene – it’s among the best renditions.

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

I’ve noticed a lot of phones have issues with this shot in terms of their color balance, as they sometimes tend to veer too much towards grey. The new iPhone 11 Pro improves on the XS here, as it’s able to more properly maintain the greens of the leaves.

The telephoto module benefits from the better color accuracy, but here it’s extremely evident that it’s a downgrade in terms of detail compared to the XS.

The wide-angle is excellent in terms of exposure; however, detail is drastically lacking throughout the whole scene. It’s quite a blurry mess, and very much lagging behind the S10, particularly the Exynos variant.

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

Moving indoors, with still quite good lighting, it’s again hard to tell the iPhone 11 apart from the XS. The 11 is a bit brighter, but other than that they’re pretty much equal. The telephoto shots are also too close to clearly determine which one is better.

The wide-angle is good, but lacks the sharpness showcased by the S10. Apple also seems to have more limited dynamic range on both the main and wide-angle modules compared to Samsung, as evidenced by the blown-out outdoor part of the shot.

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

The iPhone 11’s local HDR tone mapping has improved a bit over the XS; however, it still doesn’t handle some elements correctly, blowing out the stained glass as well as the orange commercial sign on the left, both of which certainly weren’t that bright.

The telephoto on the 11 is a lot better in this shot, and the signage is definitely more legible.

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - S10+ (S) - S10+ (E) - Pixel 3 - P30 Pro - Xperia 1 - G8]

In indoor shots it’s again a wash between the 11 and the XS for the main camera. In some areas the 11 fares better, while on other textures the XS seems sharper. Both phones, however, had issues with color temperature, rendering the scene too warm.

Portrait Mode

[Gallery (click for full image): iPhone 11 Pro - iPhone XS - iPhone X - Galaxy S10+ (E) - Galaxy S10+ (S)]

For portrait pictures, the big new addition for the iPhone 11 series is that you can now capture with the main camera sensor, with the wide-angle serving as the depth sensor. It’s still possible, and sometimes perhaps preferable, to use the telephoto lens for portrait shots.

The problem is that Apple doesn’t seem to have really improved the segmentation algorithm on the new iPhones, and segmentation can be relatively imperfect. This is particularly visible in the wider-angle shots with the “whiteout” effect, where the results just aren’t very good.

The fun thing about this scene with the swing is that we can see the gradual effects of the bokeh on the ropes – that is, we can see that it’s not very gradual on the iPhones, as we can clearly delineate where different levels of bokeh blur are applied. This is also partly visible on the Exynos S10, but the Snapdragon S10 has excellent segmentation as well as a smooth and gradual 3D depth blur.
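
To make concrete what “gradual” means here, below is a minimal Python sketch (my own illustration, not any vendor’s actual pipeline) of a synthetic depth-based blur: with a continuous defocus ramp the blur falls off smoothly along the ropes, while quantizing it to a handful of strengths produces exactly the kind of visible steps seen in the iPhone shots. The image, depth map, and focus distance inputs are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth, focus_depth, max_sigma=8.0, levels=None):
    # Normalized defocus in [0, 1]: 0 at the focal plane, 1 furthest from it.
    dist = np.abs(depth - focus_depth)
    defocus = np.clip(dist / (dist.max() + 1e-6), 0.0, 1.0)
    if levels is not None:
        # Stepped blur: only `levels` distinct strengths -> visible transitions
        # along continuous objects such as the swing's ropes.
        defocus = np.round(defocus * (levels - 1)) / (levels - 1)

    # Precompute a small stack of progressively blurred frames and blend the
    # two nearest ones per pixel, so blur strength follows the depth map.
    sigmas = np.linspace(0.0, max_sigma, 9)
    idx = defocus * (len(sigmas) - 1)
    out = np.zeros(image.shape, dtype=np.float64)
    for i, s in enumerate(sigmas):
        blurred = gaussian_filter(image.astype(np.float64), sigma=(s, s, 0))
        weight = np.clip(1.0 - np.abs(idx - i), 0.0, 1.0)[..., None]
        out += weight * blurred
    return out

# smooth  = synthetic_bokeh(img, depth, 1.5)            # gradual falloff
# stepped = synthetic_bokeh(img, depth, 1.5, levels=3)  # visibly banded falloff
```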

iOS 13.2 Deep Fusion

I started off the review on iOS 13, which covers most of the daylight pictures, after which I switched over to iOS 13.1 for most testing. Finally, Apple released a beta of iOS 13.2, and I had to take a look at the new Deep Fusion feature and how it behaves.

[Gallery (click for full image): iPhone 11 Pro iOS 13.2 - iPhone 11 Pro iOS 13.1.2 - Galaxy S10+ (E)]

I was rather shocked to see the difference in detail that the new Deep Fusion feature can make, and you don’t even have to view the pictures at full resolution to notice the improved sharpness and increased detail.

Essentially, Deep Fusion should work similarly to Google’s super resolution zoom technology, except that Apple is using it to increase the amount of detail captured at the full frame resolution. With the feature enabled, the camera is able to bring out finer textures in textiles or rougher materials with fine-grained details that would otherwise be blurred out by the camera.
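
As a rough mental model only (Apple hasn’t detailed how Deep Fusion actually works, and it’s reportedly driven by machine learning on the Neural Engine), multi-frame detail fusion generally boils down to merging a burst of aligned frames per pixel while down-weighting pixels that disagree with a sharp reference frame, so noise averages out without smearing fine texture. A minimal sketch of that merge step, assuming the frames are already aligned:

```python
import numpy as np

def merge_aligned_frames(frames, noise_sigma=2.0):
    """Robustly average a burst of pre-aligned grayscale frames (N x H x W).

    Pixels that disagree with the reference frame (e.g. due to motion or
    misalignment) get exponentially smaller weights, so the merge reduces
    noise without ghosting or smearing fine detail.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])  # N x H x W
    reference = stack[0]  # assume the first frame is the sharpest
    diff = stack - reference
    weights = np.exp(-(diff ** 2) / (2.0 * noise_sigma ** 2))
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# merged = merge_aligned_frames(burst)  # 'burst' is a hypothetical list of frames
```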

I tried a few shots outdoors; however, as Apple mentioned, it doesn’t seem to work in conjunction with Smart HDR, and the last comparison shot doesn’t really show any major difference in detail between iOS 13.1 and 13.2.

Daylight Camera Capture Conclusion – Wait for Deep Fusion retake?

The main selling point of the new iPhones was the addition of the ultra-wide-angle camera module. Indeed, this opens up a totally new capture experience for users, and I do think it makes a lot of sense to retain this module on the regular iPhone 11 rather than having a telephoto module. The wide-angle camera had been pioneered by LG a few generations ago, but last year it was Huawei that brought it to the mainstream. Now in 2019 it’s a must-have for every vendor, and it would have been shocking if Apple hadn’t adopted it.

Quality-wise, Apple's wide-angle module performs adequately well, as it’s definitely one of the better modules out there. Still, there have been many shots where the pictures ended up notably less sharp than on the Galaxy S10 or Huawei’s phones. HDR has also been a bit better on the competition in some scenarios.

On the main camera, improvements for this generation were relatively muted when it comes to the daylight results. There just isn’t very much difference from the XS. We do note that the color temperature is slightly improved, saturation is sometimes more accurately captured, and HDR is now able to handle highlights better. Still, I had expected a bit more – sometimes the competition is able to showcase better dynamic range and thus capture more of a scene. The level of detail between the iPhone 11 series and the XS is essentially identical.

The telephoto module changes on the 11 Pro are a bit odd. A lot of the scenes showcased the new phone producing noisier shots or simply capturing less detail. The optics of the module have changed, moving from an f/2.4 aperture to f/2.0, so I do wonder if that’s the reason for the discrepancy. Sometimes the new module wins out, but other times there isn’t any improvement, or there are even slight regressions. It’s not a deal-breaker or a problem at all, but it’s still odd to see this development from Apple.
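
For context on what the aperture change means by itself: light gathering scales with the square of the f-number ratio, so f/2.0 collects roughly 44% more light than f/2.4, or about half a stop. A quick back-of-the-envelope check:

```python
import math

old_f, new_f = 2.4, 2.0                 # XS telephoto vs. 11 Pro telephoto
area_ratio = (old_f / new_f) ** 2       # ~1.44x more light through the lens
stops = math.log2(area_ratio)           # ~0.53 of a stop brighter
print(f"{area_ratio:.2f}x the light, {stops:.2f} stops")
```

That gain alone shouldn’t cost detail, but a faster lens of the same physical size is generally harder to keep sharp across the frame, which could plausibly contribute to the softer results.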

Portrait mode on the main sensor is a new addition to the camera experience, but the issue is that Apple really hasn’t improved its segmentation and depth sensing capabilities. Qualcomm’s ISP here looks to be superior as it’s able to produce better bokeh effects.

Finally, Deep Fusion could very well be a game-changer for the camera. I was extremely surprised by the increase in sharpness and detail that the new mode brings. I didn’t have sufficient time to properly evaluate it in a wider range of scenarios and against more phones, but it could well be one of the features that puts the iPhone 11 series ahead of other phones. It’s something we definitely have to revisit in the upcoming Pixel 4 and Mate 30 Pro reviews as we redo the whole camera comparison on iOS 13.2.

Comments (242)

  • Henk Poley - Saturday, October 19, 2019

    Does the A13 have more security features, such as the pointer encryption that was added with the A12 (essentially binding pointers to their origin, e.g. processes)? It was kind of interesting that the recently uncovered mass exploitation of iPhones didn't touch any of the A12 iDevices (and neither do jailbreaks).
  • techsorz - Sunday, October 20, 2019

    I'm sorry AnandTech, but your GPU review is absolutely horrendous. You are using 3DMark on iOS, which hasn't received an update since iOS 10, and then comparing it to the Android version, which was updated in June 2019. There is a reason you are getting conflicting results when you switch over to GFXBench, which was updated on iOS in 2018. How this didn't make you wonder is amazing.
  • Andrei Frumusanu - Sunday, October 20, 2019

    The 3D workloads do not get updated between app versions, so your whole logic is moot.
  • techsorz - Sunday, October 20, 2019

    Are you kidding me? The load won't change, but the score sure will. It makes it look like the iPhone throttles much more than it does in reality. That the score is 50% lower due to unoptimized garbage does not mean that the chipset actually throttled by 50%.

    I can't believe that I have to explain this to you. 3DMark supports an operating system that is 3 years old; for all we know it is running in compatibility mode and is emulated.
  • Andrei Frumusanu - Sunday, October 20, 2019

    Explain to me how the score will change if the workload doesn't change? That makes absolutely zero sense.

    You're just spouting gibberish with stuff like compatibility mode or emulation, as those things don't even exist - the workload is running on Metal and the iOS version is irrelevant in that regard.
  • techsorz - Monday, October 21, 2019

    In computing you have what is called a low-level 3D API. This is what Metal and DirectX are. This is what controls how efficiently you use the hardware you have available. If you have a new version of this API in, say, iOS 13, and you run an iOS 10 application, you will run into compatibility issues. These issues can degrade performance without it being proportional to the actual throttling taking place. On Android, however, it is compatible with the latest low-level APIs as well as various performance modes.

    The hilarious thing is that AnandTech even contradicts itself, using an "only" 1-year-outdated benchmark, where the iPhone suddenly throttles less at full load. This entire article is just a box full of fail. If you want to educate yourself, I suggest you watch Speedtest G on YouTube, or Gary Explains. He has a video on both 'REAL' iOS and Android throttling, done using the latest version of their respective APIs.
  • Andrei Frumusanu - Monday, October 21, 2019

    > If you have a new version of this API in, say, iOS 13, and you run an iOS 10 application, you will run into compatibility issues. These issues can degrade performance without it being proportional to the actual throttling taking place. On Android, however, it is compatible with the latest low-level APIs as well as various performance modes.

    Complete and utter nonsense. You literally have no idea what you're talking about.
  • techsorz - Monday, October 21, 2019

    How about you provide a proper response instead of saying it's nonsense? How can the throttling be different at full load on 2 different benchmarks otherwise? There is clearly no connection between actual throttling and the score itself. You are literally contradicting yourself in your own review.
  • Andrei Frumusanu - Monday, October 21, 2019

    A proper response to what exactly? Until now all you've managed to do is complain that the test is somehow broken and wrong and that I need to educate myself.

    The whole thing has absolutely nothing to do with software versions or OS versions or anything else. The peak and sustained scores are produced with the same workloads, and nothing other than the phone's temperature has changed - the % throttling is a physical attribute of the phone; it doesn't suddenly throttle more on one benchmark than on another simply because the app was released a few years ago.

    The throttling is different on the different tests *because they are different workloads*. 3DMark and Aztec High put very high stress on the GPU's ALUs, more than the other tests, creating more heat and higher hotspot temperatures on the GPU, resulting in more throttling and reduced frequencies in those tests. T-Rex, for example, is less taxing on the GPU's computation blocks and spreads more of its load out to the CPU and DRAM, also spreading out the temperature, and that's why it throttles the least.
  • techsorz - Monday, October 21, 2019

    Thank you for your informative reply. Then, is it crazy to assume that the 3-year-old 3DMark benchmark is not providing the same workload as the 2019 version on Android? Maybe you could run an outdated, buggy benchmark on a ROG 2 as well and it would stress the ALUs even more? Possibly the ROG 2 is getting a much more sensible workload while the iPhone is getting unrealistic loads that don't utilize the architecture at all. In which case, it is pretty unfair and misleading. It's like taking a car and only testing 1 wheel while the other cars get to use all 4.
