Apart from the screen, the biggest difference between the iPhone XS and XR is the camera. The XS has two; the XR has just one. This means that — like a one-eyed person — the XR camera can’t calculate the depth of objects in a scene, and therefore can’t use the Depth Blur feature to blur the background. It works around this by using clever facial recognition tricks to allow Portrait Mode with people, but that’s it.
Until now, that is. In its latest update, camera app Halide brings this functionality to the new iPhone. That’s right. With Halide, you can take depth-effect pictures of anything with the iPhone XR.
In a blog post, Halide developer Ben Sandofsky explains how depth mode works on both the iPhone XR and XS. The gist of it is that the XR uses its focus pixels to detect how far away some parts of the scene are, and combines this with face recognition and some very clever processing to make a Portrait Effect Map.
The reason that this is limited to people, says Sandofsky, is that the iPhone “uses machine learning to create a highly detailed matte that’s perfect for adding background effects. For now, this machine learning model is trained to find only people.”
The future, now
In future, then, the iPhone could be trained to recognise more kinds of objects, not just people. But Halide has a clever workaround. It just uses the focus-pixel method to grab depth data in every photo it takes. If the data is good enough, it’ll blur the background. If not, it won’t. That kind of hit-or-miss implementation would never make it into Apple’s Camera app, but for a pro-level third-party app, one that is purchased by people who understand the limitations, it effectively unlocks a powerful feature on the iPhone XR, for just a few bucks.
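That hit-or-miss gate can be pictured as a simple quality check on the captured depth map: apply the blur only if enough of the frame has usable depth values. The sketch below is a hypothetical illustration of that idea, not Halide’s actual code — the `DepthMap` type, the coverage metric, and the 80% threshold are all assumptions.

```swift
// Hypothetical sketch of a depth-quality gate, assuming depth arrives as a
// flat array where nil marks pixels the focus pixels couldn't estimate.
struct DepthMap {
    let values: [Float?]
}

// Blur the background only if at least `minimumCoverage` of the pixels
// have a valid depth estimate; otherwise fall back to a normal photo.
func shouldApplyPortraitBlur(_ map: DepthMap, minimumCoverage: Float = 0.8) -> Bool {
    guard !map.values.isEmpty else { return false }
    let valid = map.values.compactMap { $0 }.count
    return Float(valid) / Float(map.values.count) >= minimumCoverage
}
```

The point of the gate is exactly the trade-off described above: a third-party app can simply decline to blur when the data is poor, something Apple’s own Camera app would never ship.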
I use the built-in camera most of the time, but when I want more control, or when the iPhone XS Portrait Mode is acting screwy, I switch to Halide. It’s a fantastic app, and if you have an iPhone XR, it’s essential.
Download: Halide from the App Store (iOS)