Camera - Low Light Evaluation

It’s hard to dispute that over the last two years, Apple has largely fallen behind in low-light photography. The advent of computational photography, with dedicated night modes and competitor devices sporting far more capable camera hardware, meant that the iPhone XS ended up being one of the least competitive devices in low-light conditions. Some developers have even tried to close this gap with third-party camera applications that deliver their own take on computational night-mode capture.

Apple seems to have taken note of this, and the new iPhone 11 series now addresses the missing feature. Let’s see how it holds up against the fierce competition:

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

Starting off with a low-light, yet still artificially illuminated scenario, we encounter the first aspect of Apple’s new camera: the night mode isn’t a dedicated mode that one can manually select. Rather, it’s a mode that gets triggered automatically depending on the ambient light detected by the phone. In this scene there was too much light available, and as such the phone didn’t trigger Night Mode.
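The automatic trigger described above can be sketched as a simple threshold on the metered ambient light. To be clear, this is purely an illustrative sketch of the behavior: the function name and the lux cutoff are assumptions, not Apple’s actual (undocumented) implementation.

```python
def select_capture_mode(ambient_lux: float) -> str:
    """Pick a capture mode from the metered ambient light level.

    The 10-lux threshold is an illustrative guess; Apple does not
    document the actual cutoff it uses.
    """
    if ambient_lux >= 10.0:
        # Enough light: use the standard capture pipeline.
        return "standard"
    # Dim scene: engage the multi-frame long-exposure night capture.
    return "night"

# The scene above still had too much artificial light, so the
# standard pipeline would be chosen:
print(select_capture_mode(25.0))  # standard
print(select_capture_mode(2.0))   # night
```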

That isn’t to say that the iPhone falls behind; the main camera is still very much able to produce some excellent results. In such medium-light scenarios, the telephoto lens’ wider aperture now allows the camera to actually use that module rather than falling back to the main sensor and cropping the scene, which is what the XS did.

Unfortunately, the wide-angle lens’ results here are simply bad: everything is blurry. Huawei and Samsung clearly dominate here in terms of low-light quality, either through better sensors, as on the Mate 30 Pro, or by making use of Night Mode on the wide-angle unit, something that isn’t available on the iPhone 11.

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

Going to a darker scene, we now finally see Apple’s Night Mode in action. Apple’s non-Night-Mode shot is actually more representative of the actual brightness of the scene at the time, but Night Mode really improves the amount of detail throughout the scene. Apple’s implementation here is superior to Samsung’s and Google’s, as it’s able to retain more detail and handles the noise better. Samsung is in the odd situation that the new Night Mode on the Snapdragon variants is inferior in quality to that of the Exynos-based models, making things quite blurry. Huawei’s specialized low-light RYYB sensor still makes for the best low-light camera.

It’s odd to see that Apple’s algorithm doesn’t attempt to bring down the highlights; the signs on the left remain very much blown out and overexposed.

I tried to capture a picture with the Night Mode exposure set to the maximum 10 seconds available in Apple’s camera UI, however the end result was repeatedly capped at the exposure the camera had automatically selected, even though it appeared to be capturing a 10-second shot. Retesting on the newest iOS 13.2 did change things, as the phone was then able to capture a very different shot, so it seems the Night Mode behavior on iOS 13.1 is bugged.
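As a rough illustration of why multi-frame night modes handle noise so well: averaging N frames of the same scene reduces random sensor noise by roughly √N while preserving the signal. This is a generic sketch of frame stacking, not Apple’s actual pipeline, and the noise figures are made up for the simulation.

```python
import random
import statistics

def stack_frames(frames):
    """Average per-pixel values across a burst of frames."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

random.seed(0)
signal = [40.0] * 1000  # a flat, dark patch of the scene

def noisy_frame():
    # Simulate read/shot noise with a standard deviation of 8 levels.
    return [p + random.gauss(0, 8) for p in signal]

frames = [noisy_frame() for _ in range(16)]
stacked = stack_frames(frames)

single_noise = statistics.pstdev(frames[0])
stacked_noise = statistics.pstdev(stacked)
# Stacking 16 frames cuts the random noise by roughly 4x (sqrt(16)),
# while the mean brightness of the patch stays at ~40.
print(round(single_noise / stacked_noise, 1))
```

This is also why a longer exposure window (more frames to stack) matters so much in the darkest scenes.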

The iPhone 11 series’ wide-angle module continues to be pretty terrible in low light.

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

Apple’s night mode here continues to impress, as it’s able to reproduce an excellent representation of the scene with a lot of detail. Google’s Night Sight is comparable or even better in terms of detail; however, the colors are too vivid.

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

We continue to observe that the night mode doesn’t engage in better-lit scenarios. Even without it, the new iPhone 11 is an improvement over the XS.

The wide-angle module remains terrible, and frankly I’m a bit puzzled why it performs so badly even in a better-lit scene like this one. The phone is pretty much embarrassed by all the other devices.

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

One thing we notice about the new Night Mode is that it’s not really able to bring out details in areas where the sensor doesn’t see anything. For example, in this scene the roof of the abbey remains clipped to black, while other devices such as the S10 or the Pixel are able to bring out the roof’s structure and details. The darker it gets, as long as there are some brighter elements in the scene, the more we seem to hit the dynamic range limitations of Apple’s night mode.
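The clipped-to-black behavior follows directly from how multi-frame accumulation works: if a region falls below the sensor’s noise floor and quantizes to zero in every frame, averaging the burst still yields zero, as there is no signal to accumulate. A minimal, generic illustration (not Apple’s actual pipeline):

```python
# 64 frames of a region the sensor simply cannot resolve: every pixel
# quantizes to black (0) in every single frame of the burst.
frames = [[0, 0, 0, 0] for _ in range(64)]

# Averaging the burst per pixel, as a stacking night mode would:
stacked = [sum(f[i] for f in frames) / len(frames) for i in range(4)]
print(stacked)  # [0.0, 0.0, 0.0, 0.0] - still pure black
```

Devices that do recover such areas either expose long enough (or have sensitive enough sensors) that the region registers above zero in at least some frames.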

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

In more uniformly dark areas, the iPhone is able to extract a ton of light. But in areas where the sensor just isn’t sensitive enough to provide any data for the algorithm to accumulate over time, this leads to some odd-looking results, such as the extremely pronounced shadow of the play castle here.

Click for full image
[ iPhone 11 Pro ] - [ iPhone XS ] - [ iPhone X ]
[ S10+(S) ] - [ S10+(E) ]
[ Pixel 3 ] - [ Mate 30 Pro ]
[ P30 Pro ] - [ Xperia 1 ] - [ G8 ]

Here again, in a very uniformly lit but still extremely dim scene, the iPhone 11 is able to bring out a ton of light, and the result is significantly brighter than the scene was in reality. The iPhone beats Google and Samsung in terms of detail, and only Huawei’s devices remain as stronger rivals.

Low Light Conclusion – A Much Needed Feature Added, Wide Angle Lacking

Overall, Apple’s addition of the new night mode very much elevates the camera capture abilities of the iPhone 11 series. It was a sorely needed feature, given that almost every other vendor in the industry has boarded the computational photography train.

Apple’s implementation shines in a few regards. First of all, the fact that it engages automatically, rather than being a dedicated mode you have to switch to, is a significant plus as well as a very large practical advantage over other vendors’ camera experiences.

Quality-wise, the pictures that the iPhone 11 series is able to produce in low light are top of the line, challenged only by Huawei’s massive specialized camera sensors in terms of the detail they’re able to capture. There are a few limitations: for example, the phone isn’t able to bring out details in areas where the sensor just doesn’t see anything. Scenes with brighter objects that require a wider dynamic range in the capture are where things hit a snag, although detail remains excellent even in these scenarios.

The biggest disappointment was the wide-angle camera. Not only is Apple’s night mode photography unavailable on this unit, but the module can’t even compete against other phones with their respective night modes disabled, so the iPhone 11 series is utterly put to shame when they do enable them. I do hope Apple is able to iterate on its processing for this unit, as currently it’s just outright terrible and not competitive.

I do have a feeling that we’ll be seeing further updates and improvements from Apple across the various aspects of the camera. I already noticed that iOS 13.2 fixes exposures longer than 3 seconds for the night mode, and there’s of course the question of how Deep Fusion behaves in low-light scenarios where the night mode doesn’t kick in.

Camera - More HDR, Indoors, Portrait, Deep Fusion Video Recording & Speaker Evaluation

242 Comments


  • Henk Poley - Saturday, October 19, 2019 - link

    Does the A13 have more security features, such as the pointer encryption that was added with the A12 (essentially binding pointers to their origin, e.g. processes)? It was kind of interesting that the recently uncovered mass exploitation of iPhones didn't touch any of the A12 iDevices (and neither do jailbreaks).
  • techsorz - Sunday, October 20, 2019 - link

    I'm sorry Anandtech, but your GPU review is absolutely horrendous. You are using 3DMark on iOS, which hasn't received an update since iOS 10, and then compare it to the Android version which was updated June 2019. There is a reason you are getting conflicting results when you switch over to GFXBench, which was updated on iOS in 2018. How this didn't make you wonder is amazing.
  • Andrei Frumusanu - Sunday, October 20, 2019 - link

    The 3D workloads do not get updated between the update versions, so your whole logic is moot.
  • techsorz - Sunday, October 20, 2019 - link

    Are you kidding me? The load won't change, but the score sure will. It makes it look like the iPhone throttles much more than it does in reality. That the score is 50% lower due to unoptimized garbage does not mean that the chipset actually throttled by 50%.

    I can't believe that I have to explain this to you. 3DMark supports an operating system that is 3 years old; for all we know it is running in compatibility mode and is emulated.
  • Andrei Frumusanu - Sunday, October 20, 2019 - link

    Explain to me how the score will change if the workload doesn't change? That makes absolutely zero sense.

    You're just spouting gibberish with stuff as compatibility mode or emulation as those things don't even exist - the workload is running on Metal and the iOS version is irrelevant in that regard.
  • techsorz - Monday, October 21, 2019 - link

    In computing you have what is called a low-level 3D API. This is what Metal and DirectX are. This is what controls how efficiently you use the hardware you have available. If you have a new version of this API in, say, iOS 13, and you run an iOS 10 application, you will run into compatibility issues. These issues can degrade performance without it being proportional to the actual throttling taking place. Android, however, is compatible with the latest low-level APIs as well as various performance modes.

    The hilarious thing is that Anandtech even contradicts itself, using an "only" 1-year-outdated benchmark, where the iPhone suddenly throttles less at full load. This entire article is just a box full of fail. If you want to educate yourself, I suggest you watch Speedtest G on YouTube, or Gary Explains. He has a video on both 'REAL' iOS and Android throttling, done using the latest versions of their respective APIs.
  • Andrei Frumusanu - Monday, October 21, 2019 - link

    > If you have a new version of this API in say, IOS 13, and you run an iOS 10 application, you will run into compatibility issues. These issues can degrade performance without it being proportional to the actual throttling taking place. On android however, it is compatible with the latest low-level API's as well as various performance modes.

    Complete and utter nonsense. You literally have no idea what you're talking about.
  • techsorz - Monday, October 21, 2019 - link

    How about you provide a proper response instead of just saying it's nonsense. How can the throttling be different at full load on 2 different benchmarks otherwise? There is clearly no connection between actual throttling and the score itself. You are literally contradicting yourself in your own review.
  • Andrei Frumusanu - Monday, October 21, 2019 - link

    A proper response to what exactly? Until now, all you've managed to do is complain that the test is somehow broken and wrong, and that I need to educate myself.

    The whole thing has absolutely nothing to do with software versions, OS versions, or anything else. The peak and sustained scores are produced with the same workloads, and nothing other than the phone's temperature has changed; the % of throttling is a physical attribute of the phone. A benchmark doesn't suddenly decide to throttle more than another simply because it was released a few years ago.

    The throttling is different on the different tests *because they are different workloads*. 3DMark and Aztec High put very high stress on the GPU's ALUs, more than the other tests, creating more heat and higher hotspot temperatures on the GPU, resulting in more throttling and reduced frequencies in those tests. T-Rex, for example, is less taxing on the GPU's computation blocks and spreads more of its load out to the CPU and DRAM, also spreading out the temperature, and that's why it throttles the least.
  • techsorz - Monday, October 21, 2019 - link

    Thank you for your informative reply. Then, is it crazy to assume that the 3-year-old 3DMark benchmark is not providing the same workload as the 2019 version on Android? Maybe you could run an outdated, buggy benchmark on a ROG 2 as well and it would stress the ALUs even more? Possibly the ROG 2 is getting a much more sensible workload while the iPhone is getting unrealistic loads that don't utilize the architecture at all. In which case, it is pretty unfair and misleading. It's like taking a car and only testing 1 wheel while the other cars get to use all 4.
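For reference, the throttling figure debated in the thread above is derived exactly as the reply describes: it is the drop from a cold "peak" run to a heat-soaked "sustained" run of the same workload, so the benchmark's release date cannot inflate it. A minimal sketch (an illustrative function with made-up scores, not AnandTech's actual tooling):

```python
def throttling_pct(peak_score: float, sustained_score: float) -> float:
    """Performance drop from the cold peak run to the heat-soaked
    sustained run of the *same* workload, in percent."""
    return 100.0 * (1.0 - sustained_score / peak_score)

# Different workloads stress different blocks and heat the chip
# differently, which is why they throttle by different amounts:
print(round(throttling_pct(100.0, 60.0), 1))  # 40.0 (ALU-heavy test)
print(round(throttling_pct(100.0, 85.0), 1))  # 15.0 (lighter, spread-out test)
```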
