iPhone 7 Plus Camera
You might think the dual cameras in Apple’s iPhone 7 Plus aren’t that complicated. It’s like having two regular cameras slung over your shoulder, one for wide-angle shots, the other for zooming in on subjects farther away, right?
Apple’s approach uses both iPhone cameras at the same time. The iPhone 7 Plus blends its two cameras into one, drawing on each camera’s virtues and sidestepping their weaknesses to try to get the best image possible.
That lets the telephoto lens sharpen some wider-angle photos. But it also can mean limits you might not expect — like the effective loss of that second camera when you’re shooting in dim conditions.
With screen sizes settling down and processor speeds leveling out, it’s harder these days to convince customers that the latest phone is a big step up. But one area that continues to draw interest is a phone’s camera because it captures your most personal moments and lets you share them with friends and family. By throwing away some traditional aspects of digital camera design, Apple’s dual-camera approach shows there’s still room for significant improvements when it comes to photography.
At its two-hour event to launch the new iPhones and the Apple Watch Series 2, Apple devoted 15 minutes, a full eighth of the running time, to the iPhone 7 and 7 Plus cameras. Phil Schiller, Apple’s senior vice president of worldwide marketing, didn’t hold back: “This is the best camera ever made in any smartphone.”
When your iPhone 7 Plus acts like an iPhone 7
The iPhone 7 and 7 Plus each come with a wide-angle camera, the equivalent of a 28mm focal length on a traditional SLR camera. The iPhone 7 Plus adds the longer 56mm equivalent camera to magnify more distant subjects and to zoom into a face for a portrait. (The iPhone image sensors are identically sized but much smaller than those of a full-frame SLR; their lenses’ actual focal lengths are 3.3mm and 6.6mm.)
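The “equivalent” focal lengths follow from the sensor’s crop factor, which can be derived from the figures above. A quick arithmetic check (the crop factor itself is derived here, not an official Apple spec):

```python
# Rough arithmetic behind "35mm equivalent" focal lengths.
# Lens figures come from the article; the crop factor is derived.
wide_actual = 3.3   # mm, wide-angle lens actual focal length
tele_actual = 6.6   # mm, telephoto lens actual focal length
wide_equiv = 28     # mm, stated 35mm-equivalent focal length

crop_factor = wide_equiv / wide_actual   # ~8.5x smaller than full frame
tele_equiv = tele_actual * crop_factor   # works out to 56mm

print(f"crop factor ~{crop_factor:.1f}, telephoto equivalent ~{tele_equiv:.0f}mm")
```

Because the telephoto lens is exactly twice the actual focal length of the wide lens, its equivalent comes out to exactly twice 28mm.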
The telephoto option is nice for anyone frustrated by years of phones that offered no alternative to wide-angle views. There’s digital zoom, but it merely uses software to enlarge the central portion of the image, which is why zoomed shots often look grainy or blurry.
Here’s the rub: The iPhone 7 Plus’ 56mm-equivalent telephoto lens doesn’t always kick in.
In many circumstances, the telephoto lens takes over when you’ve set the phone to shoot at 2x magnification. But CNET testing shows that for zoomed-in shots in dim conditions, the iPhone 7 Plus skips the telephoto camera and uses the wide-angle camera instead. In other words, at 2x zoom and beyond, you might get the same shot you would with the iPhone 7 and its single wide-angle camera. The phone makes up the extra magnification with digital zoom, not optical zoom.
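The behavior CNET observed suggests decision logic along these lines. This is a speculative sketch only: the 2x boundary comes from the article, but the light threshold and the rule itself are illustrative assumptions, not Apple’s actual algorithm.

```python
def pick_camera(zoom: float, scene_lux: float, lux_threshold: float = 100.0) -> str:
    """Illustrative guess at how the iPhone 7 Plus might choose a camera.

    The 2x boundary comes from the article; the lux threshold is an
    invented placeholder, not a documented Apple value.
    """
    if zoom >= 2.0 and scene_lux >= lux_threshold:
        return "telephoto"   # true optical 2x in good light
    # In dim light (or below 2x), the brighter f1.8 wide camera is used
    # and any extra magnification comes from digital zoom.
    return "wide-angle"

print(pick_camera(zoom=2.0, scene_lux=500))   # telephoto
print(pick_camera(zoom=2.0, scene_lux=20))    # wide-angle (dim scene)
```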
The wide-angle camera also has a maximum aperture of f1.8, which gathers more than twice as much light as the f2.8 telephoto lens. In dim conditions, the telephoto lens aperture would mean less light and therefore more image noise — the off-color speckles that can degrade photos.
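The “more than twice as much light” figure follows from how apertures work: light-gathering ability scales with the area of the aperture opening, which goes as the square of the f-number ratio.

```python
# Why f1.8 gathers "more than twice" the light of f2.8:
# aperture area scales as (focal length / f-number)^2, so the
# light ratio between the two lenses is (2.8 / 1.8)^2.
wide_fnum = 1.8   # wide-angle camera's maximum aperture
tele_fnum = 2.8   # telephoto camera's maximum aperture

light_ratio = (tele_fnum / wide_fnum) ** 2
print(f"f{wide_fnum} gathers ~{light_ratio:.1f}x the light of f{tele_fnum}")
# → about 2.4x
```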
A telephoto boost for wide-angle shots
But in other circumstances, the telephoto camera lends a hand to improve wider-angle shots.
When you’re shooting between 1.5x zoom and 1.99x zoom, the iPhone 7 Plus doesn’t just digitally magnify the pixels from the wide-angle lens. It also can blend higher-resolution imagery from the telephoto lens into the central portion of the frame — an approach Apple internally calls “fusion.”
You still get a 12-megapixel image. But the second camera can provide sharper detail to avoid digital zoom muddiness.
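One way to picture the fusion blend: digitally upscale the wide-angle frame, then substitute the natively sharper telephoto pixels in the central region they cover. The toy sketch below works on 2D grayscale arrays; the real pipeline is far more sophisticated (alignment, seam blending, per-pixel confidence), and none of these function names are Apple’s.

```python
def fuse(wide_upscaled, tele):
    """Toy 'fusion': paste the telephoto image into the center of the
    digitally zoomed wide-angle frame. Images are 2D lists of pixels on
    the same grid; real fusion would align and blend, not hard-paste.
    """
    h, w = len(wide_upscaled), len(wide_upscaled[0])
    th, tw = len(tele), len(tele[0])
    top, left = (h - th) // 2, (w - tw) // 2
    out = [row[:] for row in wide_upscaled]   # copy, keep edges as-is
    for y in range(th):
        for x in range(tw):
            out[top + y][left + x] = tele[y][x]
    return out

wide = [[0] * 6 for _ in range(6)]   # blurry digital zoom (all 0s)
tele = [[9] * 2 for _ in range(2)]   # sharp telephoto detail (9s)
fused = fuse(wide, tele)
print(fused[2][2], fused[0][0])      # 9 0: sharp center, wide edges
```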
It’s the kind of clever trick that’s unthinkable with traditional digital cameras, whose single-sensor design still hews closely to the film era. But computational photography, in which computer processing and not just optics is instrumental in capturing a photo, is steadily maturing.
Apple isn’t alone here. The Huawei P9 combines imagery from two cameras for better photos, for example. But the iPhone is among the most popular phones on the market year after year, so Apple’s choices carry corresponding influence.
Computational photography already blends multiple shots into a single high-dynamic range (HDR) photo that can capture a scene like we remember it, with details in both the shadows and bright areas. And it can compensate for lens problems, like the distortion that can make parallel lines bow out like the sides of a barrel. Expect more developments in the field.
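The core idea of HDR merging can be sketched in a few lines: for each pixel, favor whichever exposure recorded it closest to a well-exposed midtone. This is a crude stand-in for real HDR pipelines, which also align frames and tone-map the result; the weighting scheme here is an illustrative assumption.

```python
def merge_hdr(under, over, midtone=128):
    """Toy exposure fusion over two rows of 0-255 pixel values.

    Each pixel is a weighted average of the two exposures, weighting
    whichever frame captured it closer to a well-exposed midtone.
    """
    merged = []
    for u, o in zip(under, over):
        wu = 1 / (1 + abs(u - midtone))   # weight for the dark frame
        wo = 1 / (1 + abs(o - midtone))   # weight for the bright frame
        merged.append(round((u * wu + o * wo) / (wu + wo)))
    return merged

shadows = [5, 10, 200]        # underexposed frame: shadows crushed
highlights = [90, 140, 255]   # overexposed frame: highlights clipped
print(merge_hdr(shadows, highlights))
```

The merged row pulls each pixel toward whichever frame held usable detail, which is the essence of recovering both shadows and bright areas.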
The two-lens approach opens up new options. In October, Apple plans to issue an update to let the iPhone 7 Plus blur backgrounds in portrait photos. It’s a simulation of the effect more expensive cameras achieve naturally to concentrate attention on the subject. Apple does it by using the two lenses to calculate a 3D “depth map” of the scene and blurring only the parts that are distant from the camera.
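The depth-map idea above can be sketched simply: keep pixels near the subject’s depth sharp and average out the rest. This toy version runs on a 1D row of pixels; Apple’s real effect builds a dense depth map from the two lenses and applies a far more refined, lens-like blur, and every name here is hypothetical.

```python
def portrait_blur(image, depth, focus_depth=1.0, tolerance=0.5):
    """Toy depth-based 'portrait' blur over a 1D row of pixels.

    Pixels whose depth is close to the subject stay sharp; far pixels
    are replaced by a small neighborhood average (a crude blur).
    """
    out = []
    for i, (pix, d) in enumerate(zip(image, depth)):
        if abs(d - focus_depth) <= tolerance:
            out.append(pix)   # subject: keep sharp
        else:
            lo, hi = max(0, i - 1), min(len(image), i + 2)
            out.append(sum(image[lo:hi]) // (hi - lo))   # background: blur
    return out

row = [10, 200, 10, 200, 10]        # alternating fine detail
depth = [1.0, 1.0, 5.0, 5.0, 5.0]   # meters: subject near, wall far
print(portrait_blur(row, depth))    # → [10, 200, 136, 73, 105]
```

The first two pixels (the near subject) survive untouched while the distant pixels smear together, which is the depth-map trick in miniature.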
Apple wasn’t the first to offer dual cameras on a phone, and it’s not the only player in computational photography. But it is showing a sophisticated understanding of how to package all the new technology so ordinary folks get to use it.