This is the future of iPhone photography

The iPhone’s camera already does things impossible for a regular camera. What’s next?
Image: Killian Bell/Cult of Mac

The iPhone camera is hands-down amazing, thanks almost entirely to the fact that it is hooked up to a pocket-size supercomputer. Initially, the iPhone used its computer smarts to overcome the limitations of phone cameras — the tiny sensor, for example. But over time, Apple added amazing features like Smart HDR and the incredible Portrait Mode, which simulates the out-of-focus background that occurs naturally with traditional high-end cameras.

This path is likely to continue. Computational photography, as it is called, is pushing the capabilities of cellphone cameras far ahead of regular “dumb” cameras. So what can we expect to see in the future?

Future of iPhone photography

Better low-light images

Google’s Pixel 3 camera has Night Sight, which manages to turn night into day. It works a bit like Apple’s Smart HDR, in that it takes several photos and merges them together. The difference here is that the pictures are combined in order to extract as much light from the sensor as possible.

The iPhone can already fake long exposures using Live Photos.
Photo: Charlie Sorrel/Cult of Mac

Low-light scenes are tricky, because if you leave the camera’s shutter open long enough to gather sufficient light, then either the camera or the subject will move, creating blur. Night Sight uses the Pixel’s motion sensors to work out how much the camera is moving, then grabs the longest blur-free exposures it figures it can get away with. The resulting images look like they were shot with much more light.

This is interesting because it ties together not only the camera and the computer, but also motion detection.

iPhone users can try the Cortex Camera app, which does much the same.
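
Under the hood, the merging step is conceptually simple. Here’s a minimal sketch in Swift, assuming each frame is already aligned and stored as a grayscale [[Double]] of pixel values from 0 to 1 (real pipelines like Night Sight align, weight and tone-map frames far more cleverly than this):

```swift
// A minimal sketch of multi-frame merging for low light.
// Assumption: frames are pre-aligned grayscale images of equal size,
// stored as [[Double]] with pixel values in 0...1.
func mergeFrames(_ frames: [[[Double]]]) -> [[Double]] {
    guard let first = frames.first, let firstRow = first.first else { return [] }
    let height = first.count
    let width = firstRow.count
    var merged = Array(repeating: Array(repeating: 0.0, count: width), count: height)

    // Sum the exposures pixel by pixel.
    for frame in frames {
        for y in 0..<height {
            for x in 0..<width {
                merged[y][x] += frame[y][x]
            }
        }
    }

    // Dividing by the frame count averages them: the real scene detail
    // survives, while random sensor noise shrinks by roughly the
    // square root of the number of frames.
    let n = Double(frames.count)
    for y in 0..<height {
        for x in 0..<width {
            merged[y][x] /= n
        }
    }
    return merged
}
```

Average enough frames and the random sensor noise cancels out, which is why the merged shot looks like it was taken with far more light than any single exposure gathered.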

Better bokeh

“Bokeh” refers to the quality of the out-of-focus highlights in a photograph. On a regular camera, the shape and texture of out-of-focus light blobs are characteristic of the lens used, and in particular the shape of the aperture, or hole, in the lens.

Recent iPhones use Portrait Mode to blur backgrounds.
Photo: Charlie Sorrel/Cult of Mac

On the iPhone, this blur is faked, and it’s always the same. But why couldn’t we have bokeh that models what real lenses produce? You could resurrect vintage lenses, or mimic legendary Leica glass. You could even experiment with bokeh generated by impossibly shaped apertures, like squares, or cheesy hearts and stars. In fact, apps like the excellent Halide already apply a custom blur to their Portrait Mode backgrounds.
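
To get a feel for how shaped bokeh might work, here’s a hedged sketch: take a grayscale image as a [[Double]], plus an aperture mask as a [[Bool]] in any shape you like (a square, a heart, a star), and stamp each bright highlight into that shape. Halide’s actual pipeline isn’t public, so treat this as the general idea rather than how any shipping app does it:

```swift
// A toy version of shaped bokeh. Bright pixels (above `threshold`)
// are spread into the shape of the `aperture` mask, mimicking what a
// real lens aperture does to out-of-focus highlights optically.
func applyBokeh(image: [[Double]], aperture: [[Bool]], threshold: Double = 0.8) -> [[Double]] {
    guard !image.isEmpty, !aperture.isEmpty else { return image }
    let height = image.count, width = image[0].count
    let kh = aperture.count, kw = aperture[0].count
    var result = image

    for y in 0..<height {
        for x in 0..<width where image[y][x] > threshold {
            // Stamp the aperture shape, centered on the bright pixel.
            for ky in 0..<kh {
                for kx in 0..<kw where aperture[ky][kx] {
                    let ty = y + ky - kh / 2
                    let tx = x + kx - kw / 2
                    if ty >= 0, ty < height, tx >= 0, tx < width {
                        // Keep the brightest contribution at each spot,
                        // slightly dimmed so stamps blend rather than clip.
                        result[ty][tx] = max(result[ty][tx], image[y][x] * 0.9)
                    }
                }
            }
        }
    }
    return result
}
```

Swap in a different aperture mask and the same photo gets a completely different character, which is exactly the appeal.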

Extreme telephoto

We already have a faux 2X telephoto mode on single-lens iPhones, which just crops the photo to make you seem closer to the subject. But that results in a lower-resolution image. What if the iPhone took several shots, just like with Smart HDR, and combined them to make a super-zoomed image? The natural movement of the camera would mean that no point of the real image would be captured by the same pixel twice, so in the end you’d have enough information to make a high-resolution image, zoomed in to the max.

This technique is already well established in astrophotography, where it’s used to capture super-high-resolution images of distant galaxies.
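
Here’s a rough sketch of the multi-frame idea, assuming each frame arrives with a sub-pixel offset (dx, dy) estimated from hand shake. A real pipeline, like the “drizzle” method astronomers use or Google’s Super Res Zoom, would add robust alignment and fill the remaining gaps by interpolation:

```swift
// A toy multi-frame super-resolution accumulator. Because each frame
// is shifted by a slightly different sub-pixel amount, the frames
// collectively sample positions between the sensor's pixels, so
// accumulating them onto a finer grid recovers extra detail.
// Assumption: (dx, dy) offsets are already measured, in source pixels.
func superResolve(frames: [(image: [[Double]], dx: Double, dy: Double)], scale: Int = 2) -> [[Double]] {
    guard let first = frames.first, !first.image.isEmpty else { return [] }
    let height = first.image.count * scale
    let width = first.image[0].count * scale
    var sum = Array(repeating: Array(repeating: 0.0, count: width), count: height)
    var count = Array(repeating: Array(repeating: 0.0, count: width), count: height)

    for frame in frames {
        for y in 0..<frame.image.count {
            for x in 0..<frame.image[y].count {
                // Map this pixel onto the fine grid, shifted by the
                // frame's measured sub-pixel offset.
                let fy = Int((Double(y) + frame.dy) * Double(scale))
                let fx = Int((Double(x) + frame.dx) * Double(scale))
                if fy >= 0, fy < height, fx >= 0, fx < width {
                    sum[fy][fx] += frame.image[y][x]
                    count[fy][fx] += 1
                }
            }
        }
    }

    // Average wherever we collected samples; a real pipeline would
    // interpolate the cells no frame happened to land on.
    for y in 0..<height {
        for x in 0..<width where count[y][x] > 0 {
            sum[y][x] /= count[y][x]
        }
    }
    return sum
}
```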

3D photos

3D photos are one of those gimmicks that seem lame now, but in 20 years’ time will seem amazing. Another is the iPhone’s Live Photos feature. These might seem little more than space-wasting novelties now. But when your kids are all grown up, or your parents are dead and buried, being able to see them moving for a tiny moment will be precious. Likewise for 3D.

You might not want to animate the result, but being able to glance around behind the subject of a picture will seem very cool when looking back. To see just how cool it could be, just imagine that all those old photo prints from your childhood were Live Photos, or in 3D.

Computational photography is already here

The iPhone is already exploiting its computer/camera combo. Aside from the aforementioned features like Smart HDR and Portrait Mode, it uses machine-learning tricks to get great exposure and color balance. It can pick the best photo from a multi-image burst, it almost never captures someone mid-blink, and it knows when you’re smiling.

What’s more, you can search your photos for pictures of melons, books, clouds, or pretty much anything, and see auto-generated albums based on the people and places in your library.

Which is all to say, we’re already enjoying computational photography, and it’s only going to get better.
