Mikael Almehag Photography

iPhone 7 Plus: Spoofing DOF

Screenshot from the September 2016 keynote

Less than three days have passed since the next generation of iPhones was presented, and a lot has already been said and written about what the new dual camera module of the 7 Plus will do for photography. Some have gone as far as predicting that the interchangeable-lens camera segment will be the next to take a serious blow as a new generation of innovative camera phones emerges.

Control over depth of field has long been a feature associated with larger sensors and apertures. f/1.8 (at a 28mm equivalent focal length) and f/2.8 (at a 56mm equivalent) may sound like a relatively generous offer, but the 1/3-inch and 1/3.6-inch sensors in the 7 Plus will severely limit the ability to achieve shallow DOF. In other words, if you are shooting with the f/1.8 lens and your subject is farther than about 7 feet away, everything behind your subject (all the way to ∞) will be in focus.
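
That 7-foot figure is roughly the hyperfocal distance of the wide camera. As a quick sanity check, here is a small Python sketch using the standard hyperfocal formula; the ~3.99mm actual focal length and the circle of confusion (the full-frame 0.029mm value scaled down by the crop factor) are rough assumptions on my part, not official specifications.

```python
# Rough sanity check of the "7 feet" claim. The ~3.99 mm focal length and the
# circle of confusion are assumed/estimated values, not Apple specifications.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f, in millimetres."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

focal_mm = 3.99                # assumed actual focal length of the 28mm-equivalent lens
crop_factor = 28.0 / focal_mm  # ~7x relative to a full-frame sensor
coc_mm = 0.029 / crop_factor   # full-frame CoC (0.029 mm) scaled to the small sensor

h_mm = hyperfocal_mm(focal_mm, 1.8, coc_mm)
print(f"Hyperfocal distance: {h_mm / 1000:.2f} m ({h_mm / 304.8:.1f} ft)")
# Roughly 2.1 m (about 7 ft): focus there or beyond and everything
# out to infinity falls within acceptable sharpness.
```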

To address this, Apple opted for shallow DOF simulation using a dual camera module in the 5.5” “Plus” version of the iPhone 7. Apple was not first to the race with dual-camera-based shallow DOF spoofing: the HTC One M8, Honor 8 and Huawei P9 all provided a way to achieve the much-sought-after background separation by using two cameras.

We can only hope that Apple studied previous iterations of this technology closely. The P9 used a dedicated image signal processor to do the DOF trickery, but that didn’t help much; those who have become acquainted with the out-of-focus rendering (and glitches) of the Leica-branded P9 can attest to its awfulness. Considering the subpar shallow DOF simulation of that camera module, we can probably conclude that this is a difficult thing to simulate algorithmically, even with a second sensor capturing a depth map of the scene.
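
To see why, here is a deliberately naive Python sketch of what depth-map-driven blur could look like. This is not the P9’s or Apple’s actual pipeline, just an illustration under the assumption that you already have an RGB image and an aligned per-pixel depth map (for example, estimated from the stereo pair).

```python
# Illustrative only: a naive depth-driven blur of the kind a dual-camera phone
# *could* apply. Not the P9's or Apple's pipeline; `image` is assumed to be an
# 8-bit RGB array and `depth` an aligned per-pixel depth map.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_shallow_dof(image, depth, focus_depth, max_sigma=8.0, levels=6):
    """Blur each pixel more the further its depth lies from the focus plane."""
    image = image.astype(np.float32)

    # Normalised "defocus": 0 at the focus plane, 1 at the most distant pixel.
    defocus = np.abs(depth - focus_depth)
    defocus = defocus / (defocus.max() + 1e-9)

    # Precompute a small stack of progressively blurrier copies of the frame...
    sigmas = np.linspace(0.0, max_sigma, levels)
    stack = [gaussian_filter(image, sigma=(s, s, 0)) for s in sigmas]

    # ...and, per pixel, pick the copy whose blur best matches its defocus.
    idx = np.rint(defocus * (levels - 1)).astype(int)
    out = np.empty_like(image)
    for i in range(levels):
        mask = idx == i
        out[mask] = stack[i][mask]

    return np.clip(out, 0, 255).astype(np.uint8)
```

Even ignoring performance, a sketch like this shows where the hard problems live: estimating a clean depth map in the first place, handling hair and other fine occlusion boundaries, and rendering out-of-focus highlights so they resemble real bokeh rather than uniform Gaussian mush.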

The 7 Plus will rely on machine learning to do its magic, which is at least something. The scope of use will also be severely limited, as the feature is tucked away as an effect destined for portrait use only. The portrait shooting mode switches to the 56mm lens and unlocks what Apple calls a “depth effect”, which is rendered on screen in real time. So, if you’re looking to spice up your flower macros with some delicious blur – think again. This feature will only work if you are capturing humans (at least for the time being).

Innovation hasn’t lately been Apple’s strongest suit. I realize some will object to that statement, but (like many others) I usually argue that their greatest merit is a long history of refining existing technologies and making them enjoyable and usable. In this sense, I have some hope that they will succeed at spoofing DOF where others have failed.

Somewhat alarming, though, is the fact that this feature will not be shipping with the phone. As it stands now, it is slated for a November release. Only Apple knows whether this has to do with challenges that were greater than expected or whether they’re just giving the machines some time to do their learning.