The iPhone 7 Plus’ cameras are slightly better than the iPhone 6s Plus’ in good light. Each shot holds more detail, but still not at the Samsung Galaxy S7’s level of clarity. The iPhone 7 Plus’ raw files also have less detail than the S7’s, and more color noise. It looks like the iPhone 7 Plus applies a liberal dose of noise reduction to its JPEGs, which removes the color noise but also fine detail. This holds true for both the 28mm and the 56mm cameras.
Besides shooting outdoors, we also shot our resolution chart in raw (both the 7 Plus and the S7 save to DNG). The S7’s images have more detail than the 7 Plus’, and the 7 Plus’ 28mm raw images have more color noise. When shooting in JPEG, the 7 Plus loses that color noise, but at the cost of fine detail. Outlines also gain a slight jitteriness that isn’t in the raw files. The S7’s JPEGs retain more image detail, but in extremely fine areas you can see slight artifacts that look like a side effect of over-sharpening.
The iPhone 7 Plus’ 56mm camera has three disadvantages compared to the 28mm: a smaller sensor, a slower aperture at f/2.8 versus f/1.8, and no optical image stabilization (OIS).
But in good light, the 56mm camera usually returns the same results as the 28mm. The amount of detail looks to be the same, and that could be a result of what Apple calls “Fusion.” According to TechCrunch’s Matt Panzarino, the iPhone 7 Plus captures with both cameras on every shot, then merges the two photos into one for the best possible image.
The exception is in middling light, such as when shooting our resolution chart indoors. In the sample below, the 56mm lens was forced to shoot at ISO 400, versus ISO 100 on the 28mm, likely to compensate for the slower lens, and you can see significantly more image noise as a result.
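Some rough back-of-envelope math (an illustration, not Apple’s actual metering logic) shows why the ISO has to jump so far:

```python
import math

# Light lost going from f/1.8 to f/2.8, in stops.
# Each stop halves the light; stops = 2 * log2(f_new / f_old).
aperture_stops = 2 * math.log2(2.8 / 1.8)   # ~1.27 stops less light

# Sensitivity gained going from ISO 100 to ISO 400, in stops.
iso_stops = math.log2(400 / 100)            # exactly 2 stops

print(f"56mm lens gives up {aperture_stops:.2f} stops to its slower aperture")
print(f"ISO 400 adds {iso_stops:.0f} stops of gain (and noise) to compensate")
```

The two-stop ISO bump more than covers the roughly 1.3-stop aperture gap; the remainder plausibly offsets the smaller sensor or buys a faster shutter speed, though that part is my guess, not something the camera reports.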
However, and this is something I go into more on the next page, the iPhone 7 Plus does a bit of trickery behind the scenes. It won’t always listen when you tell it to shoot with the 56mm lens; instead, it decides which camera has captured the better image and saves that one.
The iPhone 7 Plus will usually do that in low light, and when shooting macro in good or low light. That might be because the 28mm has a shorter minimum focusing distance, and so can actually get closer to the subject while staying in focus.
This also means that when the iPhone 7 Plus swaps out a 56mm image for one shot with the 28mm, you’re essentially getting a digital 2x crop that’s blown up to 12MP. It’s a softer image with even less fine detail than if you’d originally shot in 56mm. You might not notice the loss in quality if the only time you look at your images is online or on the phone, but it might come as a rude shock if you ever print your photos.
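The pixel math behind that swap is simple (illustrative numbers below; the 4032x3024 resolution is the 7 Plus’ native 12MP output):

```python
# Pixel math for a 2x digital crop from a 12MP (4032x3024) sensor.
sensor_w, sensor_h = 4032, 3024   # ~12.2MP native resolution
crop_factor = 2                   # 28mm field of view cropped to a 56mm view

crop_w, crop_h = sensor_w // crop_factor, sensor_h // crop_factor
native_mp = sensor_w * sensor_h / 1e6
cropped_mp = crop_w * crop_h / 1e6

# A 2x crop keeps only 1/4 of the pixels (~3MP), which then gets
# upscaled back to 12MP -- hence the softer result.
print(f"Native: {native_mp:.1f}MP, after 2x crop: {cropped_mp:.1f}MP")
```

Roughly three real megapixels stretched back out to twelve is exactly the kind of loss that hides on a phone screen and shows up in a print.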
A key new feature on the 7 Plus’ cameras is their ability to save JPEGs in wide color, which is Apple’s name for the DCI-P3 color space. DCI-P3 contains more colors than sRGB, the color space used previously, but you’ll need a display that supports wide color to tell the difference. The 7 Plus’ display qualifies, as do the iPad Pros and Apple’s newest Macs (here’s a test to see if your screen can display wide color).
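To put a number on “more colors,” you can compare the two gamuts as triangles on the CIE 1931 xy chromaticity diagram, using the standard published primaries for each (a rough sketch; the exact percentage depends on which chromaticity diagram you measure in):

```python
# Compare sRGB and DCI-P3 gamut areas on the CIE 1931 xy diagram.
# Each gamut is the triangle spanned by its R, G, B primaries;
# the area comes from the shoelace formula.
def gamut_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B

ratio = gamut_area(dci_p3) / gamut_area(srgb)
print(f"DCI-P3 covers ~{(ratio - 1) * 100:.0f}% more of the xy diagram than sRGB")
```

The extra coverage is mostly in the reds and greens, which is why wide-color gains tend to show up in saturated sunsets and foliage rather than everyday scenes.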
As for myself, I couldn’t tell the difference. Even viewing 6s Plus, 7 Plus and S7 images side by side on an iPad Pro, I still sometimes picked the S7’s images for better color. Thinking I might be going color blind, I asked my art director to pick the images whose colors he preferred, and the result was the same: he sometimes picked the 7 Plus’ photo, and sometimes the S7’s.