Probably one of the biggest changes with the HTC One series is its emphasis on camera quality. Camera quality is quickly becoming one of the most important axes of both improvement and comparison for handset vendors, and HTC has taken a rather unique approach with the One. So what is that unique approach exactly? It’s the inclusion of a discrete ISP, which HTC calls ImageChip.

Moving to a discrete ISP to tackle some of the image pipeline makes sense for a number of reasons. First, it makes optimizing for image quality less of a moving target if you’re a handset vendor. Each SoC has a different ISP, and each combination of ISP and camera sensor/module needs to be tweaked for optimal image quality. The result is a lot of optimization work that takes time if you’re deploying multiple platforms with silicon from different vendors. Moving to one single controlled platform in conjunction with one single module and sensor makes it possible to really squeeze everything out of that combination, and in addition spend more time characterizing the system. The other axis ties into the One branding - each One series smartphone should actually perform the same, thanks to this unified combination of ISP and rear facing camera. So far, this applies to all of the One S and One X(L) variants.

HTC’s ImageChip is responsible for most of the things that would traditionally be done on the SoC’s ISP: 3A (autofocus, auto white balance, and autoexposure), lens correction (geometric and chromatic), noise reduction, best shot selection, continuous autofocus, controlling gains on the CMOS sensor, LED flash level decision, region of interest identification (augmented with face detection), and so on. This is all stuff you can verify yourself by taking apart some of the ISP-related files - curiously enough, internally ImageChip is actually referred to as “rawchip.” This is also the hardware responsible for enabling HTC’s extremely fast continuous image capture and frame grabbing during video capture (HTC Video Pic). It’s somewhat analogous to what Google and TI did with the OMAP4460 on the Galaxy Nexus, except discrete and with a much more ambitious focus.

The rest of HTC’s efforts involve both optics and sensor. The optical system is an F/2.0, 3.6mm system (28mm effective). HTC at present has the fastest aperture on a smartphone camera that I’m aware of, surpassing some of its own F/2.2 systems and the F/2.4 systems in other vendors’ camera modules. Getting a larger aperture is no small feat, as aberrations quickly blow up as F/# decreases, and I’m impressed with how well HTC has kept these at bay in the One system. The HTC One X on AT&T (and the One X international) use the same front and rear facing cameras, and we’ve actually seen these modules before. The rear facing CMOS is a Samsung S5K3H2YX, which Vivek spotted in the MyTouch 4G Slide. This is an 8 MP 1/3.2" format BSI sensor with 1.4µm square pixels. The front facing camera is a 1.3 MP S5K6A1GX 1/6" format FSI sensor with 1.75µm pixels. You can check for yourself by catting “/sys/android_camera/sensor” on the device.
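As a sketch of that check, here is how reading the node might look scripted in Python. The sysfs path comes from the article, but the node only exists on the handset itself (e.g. in a terminal emulator or over `adb shell`), so this falls back gracefully anywhere else:

```python
# Read the camera sensor ID from the sysfs node mentioned in the article.
# The node is only present on the device itself, so guard for its absence.
from pathlib import Path

SENSOR_NODE = Path("/sys/android_camera/sensor")

def sensor_id() -> str:
    if SENSOR_NODE.is_file():
        # On the One X this should report the Samsung sensor, e.g. "s5k3h2yx"
        return SENSOR_NODE.read_text().strip()
    return "sensor node not found; run this on the handset (or via adb shell)"

print(sensor_id())
```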

The next component of HTC’s camera emphasis is its new camera UI, which is excellent. There’s properly implemented tap to focus/expose, and in addition the camera preview appears to be close to native resolution. There are settings and configuration options if you want them, including ISO, white balance, and manual exposure. My only gripe is that HTC continues to ship with the camera set to a 16:9 aspect ratio instead of 4:3, which would take full advantage of the sensor.

There are two other tabs - camera scene settings, and filters. Tapping on the scenes option lets you switch into HDR mode, panorama shooting, and a few other useful presets. HTC’s HDR mode is the first I’ve seen that works just like the HDR mode on the iPhone, quickly combining 3 (or more?) exposures from a bracketed capture. There are a bunch of applications which do this through the camera API on Android, but the result is too much time between frames and a greater chance of shift. I captured a few HDRs using the in-camera option which look great. The other tab is the blue filter button, which pops you into some vaguely Instagram-like options, including vignette, a depth of field filter, and others.
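For illustration only - this is not HTC’s actual pipeline - a minimal exposure-fusion toy shows the basic bracket-merging idea: weight each frame per pixel by how well exposed it is (how close it sits to mid-gray), then blend:

```python
# Toy exposure fusion over a simulated 3-shot bracket. Purely illustrative
# of the bracket-merge concept; real HDR pipelines also align frames and
# tone-map the result.
import numpy as np

def fuse_brackets(exposures, sigma=0.2):
    """exposures: list of float arrays in [0, 1], all the same shape."""
    stack = np.stack(exposures)                    # (n, H, W)
    # Well-exposedness weight: Gaussian around mid-gray (0.5)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Simulated under-, normally-, and over-exposed frames of one scene
scene = np.linspace(0.0, 1.0, 16).reshape(4, 4)
brackets = [np.clip(scene * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
fused = fuse_brackets(brackets)
print(fused.min(), fused.max())
```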

There’s no dedicated video or still mode; instead, to start taking video or images you just tap the appropriate button. Still image capture is basically instantaneous, and just like the Galaxy Nexus there’s continuous autofocus running; alternatively, you can tap to focus on a specific region before capture. The other feature is what I mentioned briefly before: continuous capture (by holding down the button) and the ability to select what the ISP determines is the best of those captures, or save the whole series.


An AT&T eNodeB - Captured with the HTC One X (AT&T) - How meta...

To get to the bottom of still image quality, we turned to our regular set of evaluation tools, consisting of both photos taken in a fixed smartphone lightbox test scene with the lights on and off, with test charts (GMB color checker card, ISO12233, and distortion), and at our smartphone bench locations.

I’m very impressed with HTC’s One X/S series camera; to say that HTC has made massive improvements is an understatement. The One X has some of the best (if not the best) low light performance I have seen from a smartphone, no doubt thanks to that fast F/2.0 aperture and the noise reduction being done on the ISP. HTC’s optics are well controlled, with minimal distortion, and remain nice and sharp across the field, which is no small feat. The lights-on lightbox sample has great color saturation, sharpness, and dynamic range. In the dark with the flash on, HTC’s ability to control LED brightness also helps avoid the washed out flash look that I see from other smartphones.

The other part is that HTC’s optical system now has a low enough F/# to overcome the small 1.4µm pixels and deliver shallow enough depth of field to actually show some nice, visible bokeh. It isn’t perfect, but it’s definitely there.
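A back-of-envelope depth of field calculation shows why. Using the 3.6mm F/2.0 figures from above, and assuming a circle of confusion of two pixel pitches (0.0028mm - my assumption, not a published spec), the in-focus zone at close range is well under a centimeter:

```python
# Rough depth of field for the One X optics (f = 3.6 mm, F/2.0, from the
# article). The circle of confusion is assumed: two 1.4 um pixel pitches.
f = 3.6       # focal length, mm
N = 2.0       # f-number
c = 0.0028    # circle of confusion, mm (assumption)

H = f ** 2 / (N * c) + f                  # hyperfocal distance, mm

def dof_limits(s):
    """Near and far in-focus limits (mm) for subject distance s (mm)."""
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

near, far = dof_limits(100)               # subject 10 cm away
print(f"hyperfocal ~{H / 1000:.2f} m, DOF at 10 cm: {far - near:.1f} mm")
```

With everything beyond the hyperfocal distance (~2.3m under these assumptions) in focus, the shallow zone only appears up close, which matches where the bokeh is visible in practice.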

I think with the HTC One series it’s time to add HTC to the short list (Nokia, Apple) of smartphone vendors doing more than just integrating a module into their smartphone platforms.

Video

Video is the other part of the camera puzzle, and here ultimately HTC has to touch the SoC for video encode, which isn’t done on ImageChip. At that point there’s going to be some differences between platforms, but at the high end (everything but the One V) 1080p30 video encode is in order.

In the case of the One X on AT&T, that means 1080p30 video encoded at 10 Mbps H.264 baseline with one reference frame. That’s unfortunate since I know that MSM8960’s encode blocks can do more, but I’ve seen similar from other vendors for a while now. Audio is 128kbps stereo AAC. The One X can also do 60 fps video capture at the rather odd 768x432 resolution, which plays back at around 24fps. I’m still waiting for a platform that can do 720p60 properly, unfortunately. Front facing video is 5 Mbps 720p30, also H.264 baseline.
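Ignoring container overhead, those bitrates translate into a rough storage cost per minute of footage:

```python
# Approximate storage cost of the One X's 1080p30 recordings: 10 Mbps
# H.264 video plus 128 kbps AAC audio, ignoring MP4 container overhead.
video_bps = 10_000_000
audio_bps = 128_000

bytes_per_minute = (video_bps + audio_bps) * 60 / 8
print(f"~{bytes_per_minute / 1e6:.1f} MB per minute of 1080p30 video")
```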

I have no complaints with video capture quality, other than feeling it would get a bump from either a higher bitrate or a better encoder. That said, there isn’t much distracting macroblocking or other annoying artifacts. HTC also gives you control in the menus over whether or not you want to stabilize video.

Rear 1080p Sample
 
Front 720p Sample
 
Rear Slow Motion Sample

As usual I’ve uploaded the videos to YouTube, and to the AT servers for you to download in a zip without YouTube’s transcode.

Comments

  • MrMilli - Tuesday, May 1, 2012 - link

    "On the GPU side, there's likely an NVIDIA advantage there as well."

    How do you get to this conclusion?
    Qualcomm scores a little bit higher in both the Egypt and Pro tests of GLBenchmark. I don't know why you would put any importance on the off-screen tests for these two devices, since they both run the same resolution (which is even 720p), which brings me to my next point. Actual games will be v-synced, so how does the Tegra suddenly become faster than the Adreno when both are still rendering at the same resolution as on-screen, just with v-sync off? I've always had a hard time accepting the off-screen results of GLBenchmark because there's no way to verify whether a device is actually rendering correctly (or maybe even cheating). Can you imagine testing a new videocard in the same fashion?
  • metafor - Tuesday, May 1, 2012 - link

    Results can vary with v-sync because Tegra could be bursting to higher fps values. The offscreen test isn't perfect either, but it gives you an idea of what a heavier game that didn't approach the 60fps limit would be like.

    Of course, those games likely won't have the same workloads as GLBenchmark, so it really wouldn't matter all that much.
  • ChronoReverse - Tuesday, May 1, 2012 - link

    The offscreen test is worthless really.

    If at 720p, the same benchmark, except it puts an image on the screen, shows that the S4 GPU is faster than the Tegra3 GPU, then how useless is the offscreen test showing the opposite?

    Furthermore, neither the S4 nor Tegra 3 comes close to 59-60FPS; both top out at around the 50FPS range.

    It's pretty clear that by skipping the rendering, the offscreen test is extremely unrealistic.
  • metafor - Wednesday, May 2, 2012 - link

    It doesn't need to come close. It just needs to burst higher than 60fps. Let's say that it would normally reach 80fps 10% of the time and remain at 40fps the other 90%. Let's say the S4 were to only peak at 70fps 10% of the time but remained at 45fps the other 90%. The S4's average would be higher with v-sync while Tegra's would be higher without v-sync.

    The point of the benchmark isn't how well the phone renders the benchmark -- after all, nobody's going to play GLBenchmark :)

    The point is to show relative rendering speed such that when heavier games that don't get anywhere close to 60fps are being played, you won't notice stutters.

    Of course, as I mentioned, heavier games may have a different mix of shaders. As Basemark shows, Adreno is very very good at complex shaders due to its disproportional ALU strength.

    Its compiler unfortunately is unable to translate this into simple shader performance.
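The capped-versus-uncapped averaging argument in the thread above can be sketched numerically. The fps distributions below are hypothetical numbers of my own (not measurements of either chip), chosen so the ordering actually flips once every frame is clamped to 60fps:

```python
# Toy model of v-sync's effect on average fps: a GPU that bursts far above
# 60 fps can win the uncapped average yet lose once v-sync clamps it.
def average_fps(profile, cap=None):
    """profile: list of (fps, fraction_of_time) pairs; cap clamps each fps."""
    return sum((min(fps, cap) if cap else fps) * frac for fps, frac in profile)

bursty = [(100, 0.5), (40, 0.5)]   # spiky renderer (hypothetical)
steady = [(60, 1.0)]               # steady renderer (hypothetical)

print(average_fps(bursty), average_fps(steady))                  # uncapped
print(average_fps(bursty, cap=60), average_fps(steady, cap=60))  # v-synced
```

Uncapped, the bursty renderer averages 70fps against 60fps; with the 60fps clamp it averages only 50fps, so the steady renderer comes out ahead.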
  • ChronoReverse - Wednesday, May 2, 2012 - link

    That's still wrong. If you spike a lot, then your experience is worse for 3D games. It's not like we don't know that minimum framerate is just as important.

    As you mentioned stutters, a device that dips to 40FPS would be more stuttery than one that dips only to 45FPS.
  • metafor - Thursday, May 3, 2012 - link

    I'm not disagreeing. I'm just saying that v-sync'ed results will vary even if it's not close to 60fps. Because some scenes will require very little rendering (say, a panning shot of the sky) and some scenes will require a lot of heavy rendering (say, multiple characters sword fighting, like in Egypt).

    The average fps may be well below 60fps. But peak fps may be a lot higher. In such cases, the GPU that peaks higher (or more often) will seem worse than it is.

    Now, an argument can be made that a GPU that also has very low minimum framerates is worse. But we don't know the distribution here.
  • Chloiber - Monday, May 7, 2012 - link

    Well the benchmark doesn't measure your experience in 3D games but the fps.
  • snoozemode - Tuesday, May 1, 2012 - link

    The Atrix has an LCD PenTile RGBW display, as does the HTC One S, so LCD is definitely not a guarantee of RGB. Maybe you should correct the article with that.
  • snoozemode - Tuesday, May 1, 2012 - link

    Sorry one s is obviously amoled.
  • ImSpartacus - Tuesday, May 1, 2012 - link

    It's also the passable RGBG PenTile, not the reviled RGBW PenTile.
