This post is presented by Indice, maker of the Apollo app.
The photos you take are only as good as the lighting. That’s true whether you’re using a top-of-the-line DSLR or an iPhone. The difference is, with an iPhone, you can change the lighting after you’ve taken the picture. That’s thanks to Apollo, an iOS app that uses the iPhone’s depth data to completely reimagine the lighting conditions in your photos.
The iPhone XS camera is pretty incredible. The device uses its two rear cameras, plus the A12 chip’s Neural Engine, to record such an accurate 3D map of the scene that you can adjust the background blur with a slider. But that depth map is useful for more than just blurring backgrounds. It can be used by other apps to:
Add realistic lights to a scene.
Choose any subject to be in focus, not just the one you picked when shooting.
Add custom background blurs.
Remove and replace backgrounds, like movie green-screen effects.
The iPhone XS is the gold standard for iOS cameras, but the XR manages some excellent tricks of its own. Despite having only one rear camera, the XR can still recognise people, and then use AI and the super-powerful A12 Neural Engine to separate the person from the background. While this portrait matte isn’t as detailed as an iPhone XS depth map, it can in theory still be used to do many of the same tricks.
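For developers curious how third-party apps get at this data: the depth map (and, on the XR, the portrait matte) is embedded in the photo file as auxiliary data, which apps can read with Apple’s ImageIO and AVFoundation frameworks. The sketch below shows one way to pull both out of a saved portrait photo; the file URL is a stand-in, and a real app would more likely receive the data from `PHAsset` or `AVCapturePhoto`.

```swift
import AVFoundation
import ImageIO

// Minimal sketch: reading the depth data and portrait effects matte
// that the iPhone embeds in a portrait-mode photo as auxiliary data.
// Returns nil if the photo has no depth information.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: info)
}

// On single-camera devices like the XR, the person segmentation is
// stored separately as a portrait effects matte (iOS 12 and later).
func loadPortraitMatte(from url: URL) -> AVPortraitEffectsMatte? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any]
    else { return nil }
    return try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
}
```

Once an app has the `AVDepthData` pixel buffer, effects like relighting, refocusing, or background replacement are a matter of using per-pixel depth or matte values to mask and blend the image.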
Today we’ll look at the best depth apps for the new iPhone XS, XR, and XS Max.