Google Pixel 4 and 4 XL review: The biggest upgrade Google has delivered so far
Camera & sample images
The Pixel 4 gets a second camera
I'll remember 2019 as the year Google decided to add a second camera, a telephoto lens, to its Pixel phone, after years of insisting that it didn't need one. After all, the company has long said that it can accomplish with just one camera what others need two or three to do, thanks to its expertise in computational photography.
"Some subjects are farther away than you'd like, so it does help telephoto shots to have a telephoto lens," said Google Research's Marc Levoy at the launch of the Pixel 4. That's a rather obvious statement to make, but it's significant because Google is effectively admitting that you can't do everything with just software.
For the record, the Pixel 4 and 4 XL share the same camera setup. Here's what it looks like:
- 12.2 MP, f/1.7, 28mm (wide), 77° field of view, 1/2.55", 1.4µm, dual pixel PDAF, OIS + EIS
- 16 MP, f/2.4, 45mm (telephoto), 52° field of view, 1.0µm, PDAF, OIS + EIS, 2x optical zoom
- 8 MP, f/2.0, 22mm (front-facing, wide), 90° field of view, 1.22µm, fixed focus
Google's choice of a telephoto lens over a wide-angle one is also odd. “While wide-angle can be fun, we think telephoto is more important,” said Levoy on stage. Maybe, but Google already had a pretty decent solution for capturing far-away subjects on the Pixel 3 in the form of Super Res Zoom. That's getting beefed up with the telephoto lens on the Pixel 4, but wide-angle is the one thing you can't even attempt to approximate using software. Google has already proven that it's great at software, so a wide-angle lens seems like it would be an obvious complement to its already formidable capabilities.
Sure, the Pixel 4's Portrait Mode can also take advantage of the telephoto lens, but a wide-angle lens would be great for taking dramatic landscape shots. Why didn't Google just add a wide-angle lens as a third camera while it was at it? I really can't say, especially since most other flagship devices already have at least three cameras, including this year's iPhone 11 Pro.
That aside, the camera app is also getting something called Live HDR+, which lets you see in real time what your picture will look like after all that HDR processing. Normally, you'd have to open up the picture in Google Photos to see what it looks like after Google is done working its software magic. The difference was obvious when I was out shooting with both the Pixel 4 and Pixel 3 XL, and the colours were far more vibrant through the viewfinder on the Pixel 4.
Live HDR+ also pairs well with the Pixel 4's dual-exposure controls, which give you separate sliders for manipulating highlights and shadows. You just tap on the viewfinder to bring up the sliders, and you can also tap the lock icon for AF/AE lock. The extra controls are useful in challenging lighting situations, such as when you want to photograph someone silhouetted against a bright sky.
In addition, the Pixel 4 is better at adjusting for colour casts, such as the yellow from the glow of a candle. Its white balance adjustments are now based on machine learning, so you should get more natural skin tones even in less than ideal lighting conditions. This was the approach Google took with Night Sight on the Pixel 3, but it's now applying it to regular photos as well.
Another feature I really like is Social Share, which lets you quickly share to third-party sites directly from the camera app. After taking a photo, you'll see an arrow icon pop up from the image thumbnail at the bottom right. Slide it out, and you can quickly share to Instagram Stories or your messaging app of choice. It's really convenient, and most of the major social platforms are supported, including Discord, Messages, Instagram, Telegram, Snapchat, and Facebook.
It's also easier to access Google Lens on the Pixel 4 – just tap and hold on the object in question in the viewfinder. Lens doesn't get stuff right all the time, but it's pretty impressive when it works. For example, it was able to correctly identify my PC case, which I found to be quite cool.
Unless otherwise stated, the pictures below were taken in auto mode with no adjustments made to the dual-exposure sliders. You can click on them to view the full-resolution images. These shots were taken on the Pixel 4 XL, but since the Pixel 4 has the same camera, you'll get the same results on the smaller phone too.
Google has tweaked its HDR aesthetic slightly with the Pixel 4, and it now favours more natural colours and lighter shadows. That's evident in the daytime shots below, where the Pixel 4 did a pretty good job of presenting true-to-life colours. It didn't dial them up by much, and because there weren't many hours of daylight left when I started shooting, many of the photos also appear slightly muted.
There's also not much of that oversharpening effect you tend to get on Samsung and Huawei devices, so you'll be quite happy with the Pixel 4 if you prefer your photos to look more natural. Overall, the Pixel 4 produces bright and clean photos, and it's remarkably good at handling edges and preserving detail, even for the complex facades on shophouses.
But while Google is pretty conservative in daylight, the reverse happens at night. The Pixel 4 isn't shy about pumping up the colour saturation, and a light blue sky immediately takes on a deep blue hue. The same goes for things like neon lights, and the Pixel 4 is capable of some stunning nighttime shots.
It does a reasonably good job of handling the lights in the restaurants below, and you can still make out details like people at the tables. Some areas are still blown out, however, and darker portions of the picture, like the area surrounding the boat, are a bit noisy. Night Sight would probably have come in handy here.
The Pixel 4 is pretty adept at close-up shots too, capturing the intricate detail on the keycap with no issues.
Super Res Zoom
The pictures below were taken with the maximum 8x zoom. I've also included the original shot below each magnified one so you get an idea of the distance from which it was taken.
One problem with zooming in is that even the slightest wobble can result in a blurry photo. Super Res Zoom gets around that by taking advantage of these inevitable movements and using them to collect even more detail about the scene. This means higher quality magnification, and up to 8x digital zoom.
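The principle can be sketched as a naive multi-frame merge: each slightly shifted low-resolution frame samples the scene at a different sub-pixel offset, so its pixels can be dropped onto a finer grid. This is my own toy illustration with known shifts and no alignment, denoising, or robustness tricks, unlike Google's actual burst pipeline:

```python
def merge_frames(frames, shifts, scale=2):
    """Naive multi-frame super-resolution: place each low-res sample
    onto a grid `scale` times finer according to its known sub-pixel
    shift, then average any overlapping contributions."""
    h, w = len(frames[0]), len(frames[0][0])
    H, W = h * scale, w * scale
    acc = [[0.0] * W for _ in range(H)]
    cnt = [[0] * W for _ in range(H)]
    for frame, (dy, dx) in zip(frames, shifts):
        # dy, dx are sub-pixel offsets in low-res pixel units
        oy, ox = round(dy * scale), round(dx * scale)
        for y in range(h):
            for x in range(w):
                Y, X = (y * scale + oy) % H, (x * scale + ox) % W
                acc[Y][X] += frame[y][x]
                cnt[Y][X] += 1
    # Average where we have samples; leave unsampled cells at zero
    return [[acc[Y][X] / cnt[Y][X] if cnt[Y][X] else 0.0
             for X in range(W)] for Y in range(H)]
```

With four frames offset by half a pixel in each direction, every cell of the 2x grid gets exactly one real sample, which is the intuition behind using hand shake instead of plain interpolation.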
From what I can see, the results are pretty impressive. It's able to capture good detail on even the farthest signs, and the lettering is even sharper than if you cropped a similar section of the original photo. And considering how crowded the section I zoomed in on was, Google did a really good job of filling in all that detail. The same goes for the second set of pictures, where the camera was able to pick out individual air-conditioning units and lines on the building. Whatever you may think of Google's decision to go with a telephoto lens over a wide-angle one, it sure has done wonders for its magnification capabilities.
Google was the one that got the ball rolling with Night Sight, and it's on most other flagship phones now, including the iPhone 11. Night Sight on the Pixel 4 continues to work wonders in near pitch-black conditions, and it was able to pick out surprising detail and colour in an otherwise dark room. You can even make out the Omega lettering on the Secretlab chair, and the colour of the trash in the dustbin. Both are nearly entirely obscured when Night Sight is off.
Of course, Night Sight isn't magic, and the picture is still really noisy. But all the items in the room are clearly visible, and it's still pretty amazing what fusing together multiple images with longer exposure times can do.
There's also a dedicated astrophotography mode for capturing the stars on a clear night, but I didn't bother with that given the amount of light pollution we get in Singapore.
This is a less drastic example, but Night Sight also visibly improves detail in the darker areas around the swimming pool. For instance, you can now make out something approaching individual leaf blades in the foliage on the left.
Thanks to the telephoto lens, the Pixel 4 is also able to capture 3D scene data in the form of a depth map, which opens up more editing possibilities in Adobe Lightroom. This is already possible on the iPhone, so Google is sort of playing catch-up here.
The Pixel 4 derives spatial data from two sources – the tiny baseline from one side of the lens to the other (its dual pixels), and the much larger baseline between the two cameras. This helps it better judge distance for both close and distant objects, and because these two baselines are oriented at right angles to each other, they can work together to judge distance along both the x- and y-axes.
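The geometry at work here is ordinary stereo parallax: disparity scales with baseline, so the wider camera-to-camera baseline can still resolve distant subjects where the millimetre-scale dual-pixel baseline yields only sub-pixel shifts. A rough sketch with purely illustrative numbers (not actual Pixel 4 calibration data):

```python
def disparity(focal_px, baseline_mm, depth_mm):
    """Pinhole stereo relation: disparity d = f * B / Z."""
    return focal_px * baseline_mm / depth_mm

# Illustrative figures only: ~1 mm dual-pixel baseline vs ~13 mm
# dual-camera baseline, 3000 px focal length (assumed values).
f = 3000.0
for Z in (500.0, 5000.0):  # subject at 0.5 m and 5 m
    d_pix = disparity(f, 1.0, Z)
    d_cam = disparity(f, 13.0, Z)
    print(f"Z={Z/1000:.1f} m: dual-pixel {d_pix:.2f} px, dual-camera {d_cam:.2f} px")
```

At 5 m the dual-pixel disparity drops to a fraction of a pixel while the dual-camera disparity remains several pixels, which is why the second camera helps so much with far-away subjects.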
The picture of the chicken looks pretty good if you don't scrutinise it too closely, and the camera does a decent job of keeping the animal and the foreground area in focus while blurring out the background. However, the transition between the foreground and background isn't that smooth, and you can still make out the abrupt edges around the grass where the camera decided to begin blurring out the background.
That aside, Portrait Mode on the Pixel 4 produces some delicious bokeh effects, and points of light in the background blur smoothly into creamy, even discs. Google actually put special emphasis on improving bokeh on the Pixel 4 to better simulate real lens bokeh. Normally, these simulations take an average of nearby pixels, but this tends to turn white things grey. To better preserve the white, Google moved Portrait rendering from an 8-bit pipeline to a 14-bit one, which can better capture the brightest whites, such as sunlight glinting off water.
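The grey-disc problem comes from averaging after the highlight has already been clipped. A toy model in linear light (my own illustration, not Google's pipeline) makes the difference visible:

```python
def blurred_disc(values, clip=None):
    """Average a small patch of linear-light pixel values, optionally
    clipping them first to mimic a low-bit-depth pipeline, then clamp
    the result to the displayable range."""
    if clip is not None:
        values = [min(v, clip) for v in values]
    return min(sum(values) / len(values), 1.0)

# A specular glint 4x brighter than display white, among dark water:
patch = [4.0, 0.05, 0.05, 0.05]
low  = blurred_disc(patch, clip=1.0)  # clipped pipeline: disc turns grey
high = blurred_disc(patch)            # HDR pipeline keeps the 4.0 before blurring
```

In the clipped version the glint contributes only 1.0 to the average and the disc comes out around 0.29, a murky grey; with the full-range value preserved, the average still exceeds 1.0 and the disc renders as pure white.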