Camera Architecture

As usual, it’s important to go over the basics of the camera hardware before we move on to the actual image and video quality tests, in order to better understand the factors that affect overall camera quality. Of course, there’s much more to this than meets the eye, but for the most part details like the actual lenses used are hard to determine without a device teardown.

Apple iPhone Cameras

                              | Apple iPhone 6 / 6 Plus    | Apple iPhone 6s / 6s Plus
Front Camera                  | 1.2MP                      | 5.0MP
Front Camera - Sensor         | ? (1.9 µm, 1/5")           | ? (1.12 µm, 1/5")
Front Camera - Focal Length   | 2.65mm (31mm eff)          | 2.65mm (31mm eff)
Front Camera - Max Aperture   | F/2.2                      | F/2.2
Rear Camera                   | 8MP                        | 12MP
Rear Camera - Sensor          | Sony ??? (1.5 µm, 1/3")    | Sony ??? (1.22 µm, 1/3")
Rear Camera - Focal Length    | 4.15mm (29mm eff)          | 4.15mm (29mm eff)
Rear Camera - Max Aperture    | F/2.2                      | F/2.2

At a high level, not a whole lot changes between the iPhone 6s and iPhone 6. The aperture remains constant, as does the focal length, and the sensor size is essentially unchanged from the iPhone 6 line. Unfortunately, the iPhone 6s continues the trend of omitting OIS, which has significant effects on low light photography and on video recording of any kind. Of course, OIS alone isn’t going to make or break a camera, but it can make the difference between a competitive camera and a class-leading one.

I’m sure some are wondering why the aperture hasn’t gotten wider or the sensor larger, and the likely answer is that either change would have significant knock-on effects. A wider aperture inherently makes optical distortions worse; even in simple cases like chromatic aberration, incoming light now reaches the lens elements at more extreme angles. A larger sensor with all else equal would significantly increase thickness, which is already near acceptable limits for the iPhone 6s camera module. And even if the lens design were reworked to keep z-height down, the end result would be a significantly shorter focal length; even if the wider field of view itself isn’t a problem, distortion throughout the photo increases, which is likely to be unacceptable as well.
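
To put the spec table’s figures in context, the 35mm-equivalent focal length is simply the physical focal length multiplied by the sensor’s crop factor, which is why shrinking the focal length or growing the sensor directly changes the effective field of view. Below is a minimal sketch of that relationship; the 4.8 x 3.6 mm active-area dimensions are the nominal figures for a 1/3" type sensor and are an assumption, not a measured value for Apple’s module.

```python
import math

# Nominal active-area dimensions for a 1/3" type sensor (assumed, not measured).
SENSOR_W_MM = 4.8
SENSOR_H_MM = 3.6
FULL_FRAME_DIAG_MM = math.hypot(36.0, 24.0)  # ~43.27 mm

def equivalent_focal_length(focal_mm: float) -> float:
    """Convert a physical focal length to its 35mm-equivalent value."""
    crop_factor = FULL_FRAME_DIAG_MM / math.hypot(SENSOR_W_MM, SENSOR_H_MM)
    return focal_mm * crop_factor

# The rear camera's 4.15 mm lens works out to roughly 30 mm equivalent,
# in line with the ~29 mm figure quoted in the table above.
print(f"{equivalent_focal_length(4.15):.0f} mm equivalent")
```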

In effect, the major changes here are pixel size/resolution and the ISP, which is a black box but is new for the A9 SoCs as far as I can tell. Instead of the 1.5 micron pixel size we’ve seen before, Apple has moved to a 1.22 micron pixel size for the iPhone 6s and 6s Plus. There’s been a perennial debate about what the “right” pixel size is, and the research I’ve done indicates that the answer changes with technology. In photos taken under strong, even lighting, noise comes almost entirely from the fact that light is composed of discrete photons. This shot noise is an unavoidable fact of life, but in low light the sensor’s own inherent noise, which is affected by factors like sensor die temperature, becomes noticeable as well. The problem is that in CMOS sensors each pixel has circuitry that independently converts the number of electrons collected into a corresponding voltage, so for a given sensor size, increasing the number of pixels also increases the total amount of read noise.
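
To make the trade-off concrete, a simple per-pixel signal-to-noise model combines shot noise (the square root of the number of photoelectrons collected) with a fixed read noise. The numbers below are illustrative assumptions, not measured values for Apple’s sensors; the point is only that read noise is negligible when the photon count is large and starts to bite when it is small.

```python
import math

def pixel_snr(photoelectrons: float, read_noise_e: float) -> float:
    """SNR for one pixel: signal over the combined shot and read noise."""
    total_noise = math.sqrt(photoelectrons + read_noise_e**2)  # shot noise variance equals the signal
    return photoelectrons / total_noise

READ_NOISE_E = 3.0  # hypothetical read noise per pixel, in electrons

for label, electrons in [("bright daylight", 4000), ("dim indoor light", 40)]:
    ideal = math.sqrt(electrons)                 # shot-noise-limited SNR
    actual = pixel_snr(electrons, READ_NOISE_E)
    print(f"{label}: ideal SNR {ideal:.1f}, with read noise {actual:.1f}")
```

Splitting the same sensor area into more, smaller pixels lowers the photoelectron count per pixel while each pixel still contributes its own read noise, which is exactly the daytime-versus-low-light tension described above.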

As a result, while in theory a smaller pixel size (up to a certain limit) has no downsides, in practice the way CMOS image sensors are made forces a trade-off between daytime and low light image quality. Apple claims to mitigate this trade-off with new sensor technology. One of the key changes is deep trench isolation, which we’ve seen in sensors like Samsung’s ISOCELL. This helps with effects like electron tunneling, in which a photon that hits one pixel ends up being detected at a neighboring pixel. The iPhone 6s’ image sensor also has modifications to the color filter array that are designed to reduce sensor thickness requirements by increasing the chief ray angle.
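
Deep trench isolation is, at a high level, about reducing crosstalk between neighboring pixels. The snippet below is a deliberately simplified one-dimensional toy model, not a description of how DTI works at the silicon level; it just shows how signal leaking into adjacent pixels softens edges and lifts dark pixels sitting next to bright ones, and how better isolation (a smaller leak fraction) preserves the original contrast.

```python
def apply_crosstalk(pixels, leak_fraction):
    """Toy 1D model: each pixel leaks a fraction of its signal to each neighbor."""
    out = []
    for i, value in enumerate(pixels):
        left = pixels[i - 1] if i > 0 else 0
        right = pixels[i + 1] if i < len(pixels) - 1 else 0
        kept = value * (1 - 2 * leak_fraction)
        out.append(kept + leak_fraction * (left + right))
    return out

# A sharp edge between a bright and a dark region of the sensor.
edge = [1000, 1000, 1000, 50, 50, 50]

print("10% leak:", apply_crosstalk(edge, 0.10))  # edge softens, dark pixels brighten
print(" 2% leak:", apply_crosstalk(edge, 0.02))  # better isolation keeps the edge crisp
```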

Camera UX

Moving on to the camera UI, iOS has basically kept the same interface we’ve seen since iOS 7. There’s nothing I really have to complain about given its relative simplicity and the lack of any notable usability issues. The one major change worth mentioning is the Live Photos button, which illuminates to indicate when the camera is capturing a Live Photo. The one usability problem worth noting is that capture doesn’t stop when the phone is lowered, so Live Photos often just show the ground or some fingers towards the end. Otherwise, the experience is exactly like taking a normal photo.

Other than the Live Photos button, there are some subtle additions to the camera UI enabled by 3D Touch. Peeking on the preview thumbnail shows the last 20 images captured on the phone, and popping opens the gallery in an interesting dark theme that is slightly odd and inconsistent, but otherwise a nice addition. A force touch on the camera app icon allows quick access to some common modes without any extra actions after the camera application opens.

Of course, the other question that still lingers is how fast the iPhone camera is. To test this, we continue to use our ISO chart under strong studio lighting to get an idea of the best-case focus and capture latency. As the ISO chart is an extremely high-contrast target, this test avoids unnecessarily favoring phase-detect autofocus and laser AF mechanisms relative to traditional contrast-detect focusing.

Camera Focus Latency (Shooting ISO 12233 Target)

When it comes to focus latency, the iPhone 6s is basically identical to the iPhone 6. At this point we’re mostly looking at test variance, as a 64ms difference is only about 4 frames on the display. Something as simple as a small difference in initial focus position will affect the result here, because in testing the iPhone 6s traverses straight to the correct focus position. Pretty much every other smartphone falls behind because they all seem to traverse past the correct focus point and then come back, verifying that PDAF or laser AF has given an accurate result.
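
As a quick sanity check on that framing, converting a latency delta into displayed frames is just a division by the display’s refresh interval; the sketch below assumes a 60 Hz panel.

```python
REFRESH_RATE_HZ = 60                     # assumed display refresh rate
FRAME_TIME_MS = 1000 / REFRESH_RATE_HZ   # ~16.7 ms per frame

def latency_in_frames(delta_ms: float) -> float:
    """Express a latency difference as a number of displayed frames."""
    return delta_ms / FRAME_TIME_MS

# A 64 ms gap works out to roughly 4 frames at 60 Hz.
print(f"{latency_in_frames(64):.1f} frames")
```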

Camera Shot Latency (Shooting ISO 12233 Target)

In the shot latency test we’re really seeing the value of Apple’s NVMe-based mobile NAND solution, as the iPhone 6s captures a single image roughly 200 ms faster than the iPhone 6. Of course, this assumes a situation in which shutter speed isn’t the dominant factor in shot latency, so in low light these differences are going to be hard to spot.

Live Photos

Live Photos is a new feature on the iPhone 6s that effectively tries to capture the moment around a photo. At a technical level, Live Photos captures a photo and a video simultaneously, with the video lasting up to three seconds. The first half of the video covers the moment immediately before the shutter is tapped, and the second half the moment right after. The video has a resolution of 1440x1080 to match the 4:3 aspect ratio, and appears to vary in frame rate from about 12 to 15 FPS, with a bitrate of roughly 8Mbps and H.264 High Profile encoding.

These are all technical details, but what really matters is that the frame rate is relatively low, so Live Photos aren’t necessarily great at capturing something that passes through the frame within a second. It’s likely that this is at least partially necessary to keep Live Photos from taking up a huge amount of storage, and I suspect the same logic is behind why the resolution is closer to video than to a still photo. The frame rate is low enough, though, that low light photos aren’t limited by the need to keep the video at an acceptable frame rate. This matters because the whole point of a Live Photo is kind of ruined if you have to remember to turn the feature on right before the moment you wanted to capture.
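
A quick back-of-the-envelope estimate shows why the modest bitrate and duration keep storage in check. The video figures are the approximate ones observed above, while the size of the accompanying still is an assumed round number, so treat the output as a rough estimate rather than an exact file size.

```python
# Rough Live Photo storage estimate from the observed stream parameters.
VIDEO_BITRATE_MBPS = 8     # ~8 Mbps H.264 stream
VIDEO_DURATION_S = 3       # up to three seconds of video
STILL_SIZE_MB = 2.5        # assumed size of the accompanying 12MP JPEG

video_mb = VIDEO_BITRATE_MBPS * VIDEO_DURATION_S / 8  # megabits -> megabytes
total_mb = video_mb + STILL_SIZE_MB

print(f"video: ~{video_mb:.0f} MB, Live Photo total: ~{total_mb:.1f} MB")
```

Roughly doubling the footprint of each photo is noticeable but manageable, which is presumably part of why Apple stopped well short of full-resolution 30 FPS video here.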

Ultimately, with features like this it isn’t enough to focus on the technical details of the implementation, even if they matter. What really counts is the user experience, and to that end Live Photos removes a lot of the friction that was present with HTC’s Zoes. I loved the idea of Zoes when I first got the HTC One M7, but after a few months I found I just wasn’t using the feature because it was too much effort to pre-emptively plan a shot that would work well as a Zoe. It was also difficult to deal with the fixed recording time, the higher minimum shutter speed in low light, and the need to keep the phone raised for the entire time a Zoe was recording.

In some ways, Apple has solved these problems with Live Photos. It’s entirely possible to keep the mode enabled all the time, and with the recent release of iOS 9.1 Apple appears to have implemented an algorithm that dynamically shortens the recording if the camera is suddenly lowered in the middle of it. Due to the relatively low frame rate there’s also no need to worry about worse low light performance or anything similar, which helps with keeping the feature enabled all the time, even if it means that motion isn’t as fluid as it would be in a 30 or 60 FPS video. The end result is that you can basically just take photos as usual and serendipitously discover that one of them made a great Live Photo.

I really like the idea of Live Photos, and in practice I had a lot of fun playing with the feature and seeing the results. Even though I’ve spent plenty of time with the iPhone 6s, I still don’t know whether I’ll actually continue using the feature in any real capacity; I definitely used HTC’s Zoe feature for the first few weeks with the One M7, but as time went on I promptly forgot it ever existed.
