Apple iPhone 4S: Thoroughly Reviewed
by Anand Lal Shimpi & Brian Klug on October 31, 2011 7:45 PM EST
Arguably the second largest hardware change in the 4S (after the A5 SoC, the first and largest) is the inclusion of a much improved 8 MP camera. In case you’ve forgotten, the iPhone 4 shipped with a 5 MP camera. Back when the 4 was introduced, Apple talked for the first time about backside illumination and pixel size. In a later update, the camera got even better with the ability to buffer three full-size images and merge them to HDR in real time. This time, Apple brought up F/# and backside illumination again, and added one more thing.
Though Apple never discussed the optical design of the iPhone 4 camera, to the best of my knowledge it was likely close to the reference designs reported on a few lens lists, consisting of four plastic elements. For the 4S, Apple has mixed things up by putting its own optical design front and center, making special note of a five-plastic-element design. I’ve put together a table comparing the 4 and 4S based on the information available.
Many have speculated that Apple is dual-sourcing the CMOS sensor, which seems likely; given the sensors out there, the two most likely choices are OmniVision’s OV8830 and Sony’s IMX105. Both have almost identical specifications, including 1.4 µm pixels, a 1/3.2" format, and an improved backside illumination process over the previous generation’s wafer-scale process. OmniVision’s BSI-2 process cites specifications that line up with what Apple talked about in its presentation, including better quantum efficiency (the ability to convert photons into electrons), better low-light sensitivity, and larger well capacity (which translates to increased dynamic range). You’ll note that the 4S uses the same 1/3.2" sensor format as the previous generation but packs in more pixels, which drives the pixel size down from 1.75 µm to 1.4 µm.
**iPhone 4 vs. 4S Cameras**

| Property | iPhone 4 | iPhone 4S |
|---|---|---|
| Sensor Format | 1/3.2" (4.54 x 3.42 mm) | 1/3.2" (4.54 x 3.42 mm) |
| Optical Elements | 4 Plastic | 5 Plastic |
| Pixel Size | 1.75 µm | 1.4 µm |
| Focal Length | 3.85 mm | 4.28 mm |
| Image Capture Size | 2592 x 1936 (5 MP) | 3264 x 2448 (8 MP) |
| Average File Size | ~2.03 MB | ~2.77 MB |
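The pixel sizes and focal lengths in the table cross-check with some quick arithmetic: pixel pitch is just active sensor width divided by horizontal pixel count, and diagonal field of view follows from the sensor diagonal and focal length. A quick sketch (the field-of-view figures are derived here, not quoted by Apple):

```python
import math

# Active sensor area for the 1/3.2" format (from the table above)
sensor_w_mm, sensor_h_mm = 4.54, 3.42

# Pixel pitch = active width / horizontal pixel count
pitch_4 = sensor_w_mm * 1000 / 2592    # iPhone 4,  5 MP
pitch_4s = sensor_w_mm * 1000 / 3264   # iPhone 4S, 8 MP
print(f"iPhone 4 pixel pitch:  {pitch_4:.2f} um")   # ~1.75 um
print(f"iPhone 4S pixel pitch: {pitch_4s:.2f} um")  # ~1.39 um

# Diagonal field of view = 2 * atan(diagonal / (2 * focal length))
diag = math.hypot(sensor_w_mm, sensor_h_mm)          # ~5.68 mm
fov_4 = 2 * math.degrees(math.atan(diag / (2 * 3.85)))
fov_4s = 2 * math.degrees(math.atan(diag / (2 * 4.28)))
print(f"iPhone 4 diagonal FOV:  {fov_4:.1f} deg")    # ~72.9 deg
print(f"iPhone 4S diagonal FOV: {fov_4s:.1f} deg")   # ~67.2 deg
```

Note that the longer 4.28 mm focal length on the same sensor format mathematically implies a slightly narrower diagonal field of view.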
Everybody likes talking about sensors (and I see lots of attention given to them), but any good photographer knows it’s the combination of optical system and sensor that determines performance. Optical design is important, and having trained as an optical engineer I find it interesting that Apple would draw attention to having a custom design of its very own with an additional plastic element. For a while I’ve held off on really talking about smartphone camera optics, but while we’re here, let’s touch briefly on them.
Thus far this generation and the one before it have primarily used four plastic elements, and virtually everyone but Nokia uses nothing but plastic (Nokia famously uses Zeiss-branded designs, often with glass elements). Optical design is generally driven by material availability, and there are only a few optical-grade (read: transmissive in the visible) thermoplastics out there - polystyrene, ZEONEX, PMMA (acrylic), and so forth - the list is actually relatively short. Thankfully polystyrene and PMMA can be paired into something of an achromat, with polystyrene as the flint and PMMA as something of a crown. Plastic brings unique constraints too: coatings don’t stick well, few plastics have great optical properties, they have a high coefficient of thermal expansion, a large index variation with temperature (the index oddly decreases as temperature increases), and less heat resistance and durability, among other issues. With all those downsides you might wonder why smartphone vendors use plastic, and the reason is simple - it’s cheap, but more importantly it can be molded into complicated shapes. Those complicated shapes are aspheres, which are difficult to fabricate in glass and afford much finer control over aberrations with fewer elements - an absolute necessity when working with very little package depth.
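The polystyrene/PMMA pairing can be made concrete with the classic thin-lens achromatic doublet condition, phi1/V1 + phi2/V2 = 0, which cancels chromatic focal shift to first order. This is only an illustration of the crown/flint idea, not Apple's actual design (a real 5P mobile lens is far more complex); the Abbe numbers are typical catalog values, and the 4.28 mm focal length is borrowed from the 4S purely as a convenient total power:

```python
# Thin-lens achromatic doublet: split a total power phi_total between a
# crown and a flint element so that phi1/V1 + phi2/V2 = 0, cancelling
# first-order chromatic focal shift. Abbe numbers are textbook values.
V_pmma = 57.4   # PMMA (acrylic) acting as the crown
V_ps = 30.9     # polystyrene acting as the flint

phi_total = 1 / 4.28e-3   # total power (1/m) for a 4.28 mm focal length

# Standard doublet power split
phi_pmma = phi_total * V_pmma / (V_pmma - V_ps)   # positive element
phi_ps = -phi_total * V_ps / (V_pmma - V_ps)      # negative element

print(f"PMMA element: f = {1000 / phi_pmma:.2f} mm")  # positive lens
print(f"PS element:   f = {1000 / phi_ps:.2f} mm")    # negative lens
```

The small Abbe-number gap between the two plastics forces strong individual element powers, one reason plastic designs lean so heavily on aspheric surfaces to keep the other aberrations in check.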
Apple's 4S versus 4 infographic
So what does adding another element get you? Well, when you’re faced with limited material choices, adding more surfaces gives you another opportunity to balance the aberrations that start blowing up rapidly as you push F/# lower (toward faster apertures). That said, there are tradeoffs to adding surfaces as well - more back reflections, increased cost, and a thicker system. In the keynote, Apple notes that sharpness is improved by 30% in the new five-element design, and MTF is undoubtedly what they’re alluding to.
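To see why MTF is the natural metric here, it helps to compare the diffraction-limited blur at f/2.4 against the 4S's 1.4 µm pixel pitch. A rough sketch, assuming mid-visible (green) light:

```python
import math

wavelength_um = 0.55   # mid-visible (green) wavelength, assumed
f_number = 2.4         # iPhone 4S aperture
pixel_um = 1.4         # 4S pixel pitch from the table

# Airy disk diameter: the best-case blur spot even with perfect optics
airy_um = 2.44 * wavelength_um * f_number
print(f"Airy disk diameter: {airy_um:.2f} um")   # ~3.22 um

# Diffraction MTF cutoff frequency (line pairs/mm) = 1 / (lambda * N)
cutoff = 1 / (wavelength_um * 1e-3 * f_number)
print(f"Diffraction MTF cutoff: {cutoff:.0f} lp/mm")  # ~758 lp/mm

# Sensor Nyquist frequency (line pairs/mm) = 1 / (2 * pixel pitch)
nyquist = 1 / (2 * pixel_um * 1e-3)
print(f"Sensor Nyquist: {nyquist:.0f} lp/mm")         # ~357 lp/mm
```

Even with perfect optics, the Airy disk at f/2.4 spans more than two 1.4 µm pixels, so real-world sharpness hinges on how close the five elements can hold residual aberrations to that diffraction floor - exactly the kind of improvement a 30% sharpness claim is describing.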
Genius Electronic Optical's 5P lens. Compare to the infographic above.
Genius Electronic Optical has a page on its website with a lens system that seems likely to be what’s in the 4S: the specifications include 8 MP resolution, the same sensor format, the same F/2.4 aperture, and five plastic elements (5P). Beyond that, however, there’s not much more I can say about this Apple-specific design without destructively taking things apart. One thing is certain: Apple is getting serious about camera performance, something other handset vendors like HTC (with its F/2.2 systems) are also doing.
Apple also mentioned that it included an IR filter in the 4S optical design. If you recall our Kinect story, I used the 4’s camera to photograph the IR structured-light projector that Kinect uses to build a 3D picture. The 4 no doubt has an IR filter (though not a great one), but it’s probably just a thin film rather than a discrete filter right before the sensor. The 4S includes what Apple has deemed a ‘hybrid IR filter’ right on top of the sensor, which is possibly just a combination of a UV/IR cut filter (UV is a problem too) and an anti-aliasing filter.
If you try to take the same Kinect (IR source) picture with the 4S, thankfully all those non-visible IR photons get rejected by the filter. This doesn’t sound like much until you realize that silicon is transparent in the IR, so IR light will bounce off the metal structures inside a CMOS or CCD and create lovely diffraction effects on fancy sensors. I digress, though, since that’s probably not what Apple was trying to combat here. More broadly, IR will generally just cause incorrect color rendition, and thus people stick an IR filter either somewhere in the lens or before the sensor, which is what has been done in the 4S.

The thin-film IR filters that smartphones have used in the past are also largely to blame for some of the color nonuniformity and color spot (magenta/green circle) issues people have started taking note of. With a thin-film filter, rays incident at an angle (as we move across the field) shift the filter’s spectral response, and the result is that infamous circular color nonuniformity. I wager the other contributing effect is some combination of vignetting and the microlens array on the CMOS, but when I saw Apple make note of an improved IR filter my thoughts immediately raced to this ‘hybrid IR filter’ as the logical cure for the infamous green circle the iPhone 4 exhibits.
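The angular shift of a thin-film filter is well approximated by lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)^2), where n_eff is the effective index of the coating stack. A small sketch of why off-axis rays see a different cut wavelength; both the 650 nm normal-incidence cut and n_eff = 1.45 are assumed, illustrative values, not measurements of any iPhone filter:

```python
import math

lambda_0 = 650.0   # normal-incidence IR cut wavelength in nm (assumed)
n_eff = 1.45       # effective index of the coating stack (assumed)

def cut_wavelength(theta_deg):
    """Cut wavelength of a thin-film filter at incidence angle theta."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda_0 * math.sqrt(1 - s * s)

for theta in (0, 10, 20, 30):
    print(f"{theta:2d} deg -> cut at {cut_wavelength(theta):.0f} nm")
# The cut edge blue-shifts by roughly 40 nm between 0 and 30 degrees
```

Rays near the edge of the field arrive at larger angles, so the cut edge slides tens of nanometers toward the visible there and clips red light unevenly across the frame, which shows up as a radially varying color cast. Placing the filter directly on the sensor helps keep the incidence angles smaller and more uniform.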
Another minor difference on the 4S is an improved LED flash. The previous LED flash had a distinctly yellow-green hue; the 4S flash seems slightly brighter and has a color temperature that’s subjectively much closer to daylight, though I didn’t measure it directly. I habitually avoided using LED illumination on the 4 and will probably continue to do so on the 4S (using HDR instead), but it bears noting that the LED characteristics are improved. Unfortunately, the diffuser and illumination pattern still aren’t very uniform or wide. It also seems that all the talk of moving the LED flash to the other side of the device to combat red-eye turned out to be wrong.
tipoo - Monday, October 31, 2011
Anyone know if there's a reason this hasn't made it into any Android phone yet? Does Google specify compatible GPUs, or is it cost, or development time, etc.? Looks like it slaughters even the Mali 400, which is probably the next fastest.
zorxd - Monday, October 31, 2011
The only reason is that no one has used it yet. The TI OMAP 4470 will use the 544, which is probably a little faster.
The SGS2 uses the slower Mali 400, but it was released six months ago. Yet it's not that bad, even beating the 4S in GLBenchmark Pro.
zorxd - Monday, October 31, 2011
I meant no SoC vendor is using it yet.
djboxbaba - Monday, October 31, 2011
The numbers were incorrect and have been updated; the 4S is ~2x faster than the GS2 in GLBenchmark Pro.
freezer - Thursday, November 3, 2011
But not when running at the phone's native resolution. That's what people will use while running games on their phone.
The iPhone 4S has many more pixels for the GPU to draw while having a much smaller screen. Not very optimal for gaming, right?
djboxbaba - Thursday, November 3, 2011
Correct, but we're comparing the GPUs by standardizing the resolution. Of course, at native resolution this will change.
thunng8 - Monday, October 31, 2011
I don't see any GLBenchmark that the Mali 400 beats the 4S in???
freezer - Thursday, November 3, 2011
That's because the AnandTech review shows only the 720p offscreen results.
This gives very different numbers compared to running GLBenchmark Pro at the phone's native resolution.
The iPhone 4S has about 60% more pixels than the Galaxy S2, so its GPU has to draw many more pixels in every frame.
Go to glbenchmark.com and dig through the database yourself.
Ryan Smith - Monday, October 31, 2011
The 544 should be identical to the 543 at the same clock and core configuration. It's effectively a 543 variant with full D3D feature level 9_3 support. The primary purpose of the 544 will be to build Windows devices, whereas for non-Windows devices the 543 would suffice. We don't have access to PowerVR's pricing, but it likely costs more due to the need to license additional technologies (e.g. DXTC) to achieve full 9_3 support.
Penti - Tuesday, November 1, 2011
Who will use it to support Windows Phone, though? Qualcomm uses its own AMD/ATI-based Adreno GPU. I guess it's TI's attempt at getting Microsoft to support Windows Phone on their SoC in order to supply partners of theirs like Nokia. Or it might just be a later purchase/contract date for the other SoC vendors - getting the IP blocks later. But many did opt for the Mali-400, so why wouldn't they opt for its successor too? It seems to have worked out well. Samsung is just one of the vendors that usually used PowerVR. I guess ST-E will use it to support Windows Phone on the Nova A9540 SoC too, while Android vendors might still opt for the older A9500.
Interesting to see how Nvidia lags in this field, though.