The Google Pixel 3 Review: The Ultimate Camera Test
by Andrei Frumusanu on November 2, 2018 11:00 AM EST - Posted in
- Smartphones
- camera
- Mobile
- Pixel
- Snapdragon 845
- Pixel 3
Camera - Low Light Evaluation - Night Sight
Of course, one of the most exciting new features of the Pixel 3 is its promised Night Sight mode. As mentioned a few pages back, we're using a modified camera application to enable the mode for this review, as otherwise it would have made for a pretty boring low-light comparison.
I'm also showcasing the camera differences on the original Pixel as well as the Pixel 2, so that users can see what kind of improvements they can expect on their existing devices. Both of these devices are running the same Night Sight-enabled software.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ]
[ P20 ] - [ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ]
[ Note9 ] - [ S9+ ] - [ S8 ] - [ LG G7 ] - [ LG V30 ]
[ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
In the first construction scene, the difference between the auto shot and the Night Sight shot is, pardon the pun, night and day. Here the differences in processing are quite astounding and make for a major improvement in the Pixel's low-light capture ability.
The resulting image is significantly brighter than how the scene looked in reality. I'd even go as far as to say that the Pixel is so aggressive with the exposure here that it goes a bit too far, as the Mate 20 Pro's auto mode and the Mate 20's night mode seem a lot more realistic. It's to be noted that the Mate 20 Pro's result is achieved with no software tricks – it relies solely on the ISO25600 mode of its sensor.
The Pixel 2, with the Night Sight-enabled software, manages a near-identical result to the Pixel 3, and even the original Pixel doesn't seem too far off.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ]
[ P20 ] - [ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ]
[ Note9 ] - [ S9+ ] - [ S8 ] - [ LG G7 ] - [ LG V30 ]
[ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
Night Sight doesn't need very dark scenes to show a benefit, as even with artificially lit objects such as the tree here we can see improvements to the scene. The result puts the Pixel phones far ahead of conventional shooters from Samsung and Apple, with only Huawei's being able to keep up and battle Google's new algorithm.
One characteristic of Night Sight is that it doesn’t seem to be able to actually bring down highlights – Huawei’s implementation on the other hand will do this, and that’s why the tree in Huawei’s mode is far less blown-out compared to Google’s camera.
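Neither vendor documents its tone mapping, but the behaviour described above can be illustrated with a toy tone curve (purely hypothetical, not either vendor's actual processing): a highlight roll-off that compresses values above a knee point preserves gradation in bright areas, while a plain exposure boost lets them clip to white.

```python
def boost_and_clip(x: float, gain: float = 2.0) -> float:
    """Plain exposure boost: bright areas blow out (clip at 1.0)."""
    return min(1.0, x * gain)

def boost_with_rolloff(x: float, gain: float = 2.0, knee: float = 0.8) -> float:
    """Same boost, but values above the knee are compressed into the
    remaining headroom instead of clipped, so highlight detail like
    the lit tree keeps some gradation (toy curve, illustrative only)."""
    y = x * gain
    if y <= knee:
        return y
    # Squeeze everything above the knee asymptotically toward 1.0.
    return knee + (1.0 - knee) * (1.0 - 1.0 / (1.0 + (y - knee)))

# Two bright input values: clipping maps both to pure white,
# while the roll-off keeps them distinguishable below 1.0.
for x in (0.2, 0.6, 0.9):
    print(f"{x:.2f} -> clip {boost_and_clip(x):.2f}, "
          f"rolloff {boost_with_rolloff(x):.2f}")
```

With a gain of 2.0, inputs 0.6 and 0.9 both clip to 1.0 in the naive version, but stay separated under the roll-off – which is the difference visible in the blown-out tree.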
Where Google does shine is in terms of detail retention – the Pixels are able to retain significantly more details than Huawei, and for that matter, the Pixels retain more details than all of the other phones.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20 ]
[ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ] - [ Note9 ]
[ S9+ ] - [ S8 ] - [ LG V30 ] - [ OnePlus 6 ] - [ OPPO FindX ]
The main benefit of Night Sight in scenarios where there is sufficient light is that it allows for better detail retention and less noise. Google's competition here is again Huawei – however the Pixels are able to edge out the P20s and Mate 20s on both counts.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ]
[ P20 ] - [ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ]
[ Note9 ] - [ S9+ ] - [ S8 ] - [ LG G7 ] - [ LG V30 ]
[ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
When going into lower light scenes, again, the Pixels are able to produce images that are much brighter than how the scene was originally.
Again, the only phones able to compete in terms of light capture are Huawei’s – but again, the Pixels are able to produce a better image thanks to better detail retention. Huawei’s phones here most likely are suffering from the lack of OIS on their main cameras.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ]
[ P20 ] - [ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ]
[ Note9 ] - [ S9+ ] - [ S8 ] - [ LG G7 ] - [ LG V30 ]
[ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
Although this is meant to be a comparison between 18 phones, the real fight here is just between the Pixel 3 and Huawei's devices. Again the Pixels win significantly thanks to their vast advantage in detail retention and sharpness – far ahead of any other phone.
Extreme low-light
Extreme low-light scenarios are something that, as recently as last year, we wouldn't have expected phones to be viable in. I started shooting such scenes earlier in the year when Huawei made its Night mode usable without a tripod – along with vendors like LG introducing pixel-binning modes that quadruple the light capture of their sensors.
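Pixel binning, mentioned above, combines the readings of neighbouring photosites into one larger effective pixel. A minimal sketch of 2×2 binning on a simplified grayscale sensor array (real hardware bins same-coloured sites within a Bayer pattern) could look like:

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block of photosites into one output pixel.

    Summing four sites quadruples the collected signal per output
    pixel – the "quadrupled light capture" of binning modes – at
    the cost of a quarter of the resolution.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Toy 4x4 sensor readout -> 2x2 binned image
raw = np.arange(16, dtype=np.uint32).reshape(4, 4)
binned = bin_2x2(raw)
print(binned)  # [[10 18] [42 50]]
```

Note the output retains the full collected signal (the sums of all blocks equal the sum of the raw array) while halving each dimension.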
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ]
[ P20 ] - [ Mate 10Pro ] - [ iPhone XS ] - [ iPhone X ]
[ Note9 ] - [ S9+ ] - [ S8 ] - [ LG G7 ] - [ LG V30 ]
[ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
This shot is very similar to the first one in that the Pixels are able to generate such bright pictures that I'd say they're overexposed. Again, only Huawei's phones as well as the LG G7's LLS mode are able to achieve similar light capture. The latter suffers from a stark lack of detail, leaving only Huawei's phones in the competition.
It's very interesting to see just how much colour accuracy Google is able to achieve even at such low brightness levels.
[ Pixel 3 ] - [ Pixel 2 ] - [ Pixel XL ]
[ Mate 20Pro ] - [ Mate 20 ] - [ P20Pro ] - [ Mate 10Pro ]
[ iPhone XS ] - [ iPhone X ] - [ Note9 ] - [ S9+ ] - [ S8 ]
[ LG G7 ] - [ LG V30 ] - [ OnePlus 6 ] - [ OPPO FindX ] - [ MIX2S ]
For the last shot I wanted to take the phones to their limits – the vast majority of phones here won't be able to discern nearly anything, and many will just produce a black picture. The scene was solely illuminated by the light of a full moon as well as some far-away industrial spotlights.
Even here, the Pixel's Night Sight is able to deliver, producing a semi-visible result of the subject. Only the Mate 20 Pro's ISO102400 shot was able to come near the exposure levels, but with significantly more noise.
Low-light conclusion
This conclusion about the Pixel 3 in low light would have sounded extremely different if I had just used Google's official camera application and not tested Night Sight. I've never really understood why people claimed the Pixel 2 camera to be good in low-light, because in my experience, as also visible in these sample shots, the Pixels were never really competitive and were outclassed by the better sensors from Samsung and Apple when capturing in traditional modes.
Night Sight is very much a game-changer in this situation, and Google is able to showcase an outstanding example of computational photography that vastly beats even the wildest expectations of what a smartphone camera is able to achieve in low-light scenarios.
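This article doesn't detail Google's pipeline, but the core idea behind this kind of computational photography is merging a burst of short exposures: averaging N aligned frames preserves the signal while shrinking random sensor noise by roughly √N. A minimal sketch of the merge step (assuming already-aligned frames – in practice the tile-based alignment is the hard part, and is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated dim scene: a gentle horizontal gradient.
scene = np.tile(np.linspace(10.0, 40.0, 64), (64, 1))

def noisy_frame(scene: np.ndarray, noise_sigma: float = 8.0) -> np.ndarray:
    """One short exposure: the true signal plus Gaussian sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

# Burst merge (alignment omitted): averaging 15 frames should cut
# random noise by about sqrt(15) ~ 3.9x versus a single frame.
burst = [noisy_frame(scene) for _ in range(15)]
merged = np.mean(burst, axis=0)

single_err = np.std(burst[0] - scene)
merged_err = np.std(merged - scene)
print(f"single-frame noise: {single_err:.1f}, merged: {merged_err:.1f}")
```

The lower residual noise in the merged frame is what lets Night Sight then brighten the exposure so aggressively without the result turning into a grainy mess.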
In one fell swoop, Google's Pixels significantly climb up the low-light photography ranking, even putting themselves at a comfortable distance ahead of the previous low-light champions, Huawei's 40MP-sensor phones with their night mode.
135 Comments
Impulses - Saturday, November 3, 2018 - link
They're not using bad sensors, I mean, they often manufacture everyone else's, the post processing is often horrid for Sony tho. You'd think they could get someone from their dedicated camera division to better tune that (they don't have the greatest JPEG engine either but it's gotten better every year).
Edwardmcardle - Friday, November 2, 2018 - link
Awesome review as always. Will there be a Mate 20 Pro review? Have one and am considering returning because of odd screen issue. Also the new performance mode seems to suck battery, but animations seem laggy when not engaged... would be great to have a professional insight!
luikiedook - Friday, November 2, 2018 - link
Excellent review, and comparison photos galore! I think a lot of the day light photos is a matter of opinion. Personally I find the iPhone XS and Samsung photos overexposed and less pleasing than the Pixel photos. The outdoor seating area for example, the black table tops look reflective and almost white in the iPhone XS and Samsung photos. Most of the time the Pixel photos are darker, but I'm not convinced there is less detail, most of the time.
The p20 pro seems to crush everything in 5x zoom.
melgross - Sunday, November 4, 2018 - link
The shadows on the Pixel are all blocked up. It's pretty obvious. Some people mistakenly equate black shadows with better contrast, as Google apparently does, but that's wrong. You can always darken the shadows later in a quick edit. But if the detail is killed on the photo, you can never retrieve it.
Dr. Swag - Friday, November 2, 2018 - link
Hey Andrei, you got the displays mixed up. The 3 XL uses a Samsung AMOLED panel whereas the 3 uses a P-OLED from LG. The table on the first page says the opposite.
warrenk81 - Friday, November 2, 2018 - link
haven't even read the article yet, just want to say i'm so happy to see smartphone reviews return to AnandTech!!
spooh - Friday, November 2, 2018 - link
The Pixel XL used in the review has an optics issue affecting corner sharpness and light falloff. I think it's also slightly less sharp than a good unit. I've had one with the same issue, but returned it.
id4andrei - Friday, November 2, 2018 - link
The Verge reviewer Vlad Savov is a big fan of Google's computational photography. He makes it seem like the Pixel is clearly above the latest flagships. Your expansive review paints a different picture, that of a phone that tries to keep up with a single camera module. On a personal level I have a dilemma. Isn't computational photography basically post-processing? Even if it produces a subjectively better outcome out of stitching several shots, isn't it "fake" somehow as it is not an accurate representation of a frame?
Andrei Frumusanu - Friday, November 2, 2018 - link
> Even if it produces a subjectively better outcome out of stitching several shots, isn't it "fake" somehow as it is not an accurate representation of a frame?
Not really. If a sensor fails to have sufficient dynamic range by itself, then even with no processing that's also going to lead to an "inaccurate representation".
Impulses - Friday, November 2, 2018 - link
It's a little fancier than the post processing you could (easily) manage yourself, mostly cause of the way they chop up frames into tiles to then stack them intelligently... You could say it's "fake" in instances where their algorithm just decided to drop a tile to avoid artefacts or movement etc., but wouldn't you just clone those out yourself if you were anal about the overall end result? It's an interesting question without a straightforward answer IMO. It's just gonna vary by shot and use case, if you're getting consistently better DR then you're consistently closer to "what you see", but all photography is ultimately an interpretation.