It’s taken a while to fully unpack and understand the technical improvements to the cameras in the iPhone 14 Pro. Camera nerd Sebastiaan de With, co-founder and designer of the highly-respected Halide camera app, has written a detailed review of the improvements to the camera system.
His professional opinion? These are not just great iPhone cameras, they’re great cameras, period.
How did Apple make the iPhone 14 Pro camera so good?
These changes are all made possible because the camera occupies more space inside (and outside) the phone every year. For one, phones are getting bigger, giving Apple more real estate to work with. And thanks to the power efficiency of Apple silicon, the logic board and the battery only need to grow modestly year over year to offer consistent improvements, while the camera can take giant leaps in physical size.
As much as we feel the cameras now dominate the back of the phone, in five years, they may be even bigger.
As for the selfie camera, pushing one of the adjacent sensors underneath the display has allowed for a much larger camera component. With more space, the selfie camera takes significantly sharper and brighter pictures.
A huge leap for the selfie camera
Before, the selfie camera was fixed-focus: whether you were holding the phone close to your face or reaching out for a big group photo with a selfie stick, it lacked the lens hardware to change its focus point. The iPhone 14 Pro adds autofocus.
The iPhone 14 Pro takes a huge leap forward in selfies. The difference in detail, sharpness and clarity is night and day.
A larger sensor captures more light, which will certainly help for late-night pictures outdoors, in restaurants, movie theaters and any dark environment. According to de With, “Low light shots are far more usable, with less smudging apparent.”
The Ultra-Wide continues to surprise
Turning our focus to the rear, de With wasn’t expecting much from the Ultra-Wide lens. After all, it had its complete overhaul last year, adding “a more complex lens design [that] allowed for autofocus and extremely-close focus for macro shots, and a larger sensor.”
Regardless, the Ultra-Wide camera got another big refresh this year, too: the sensor is nearly 50% bigger in area, and the lens design is new. The result is sharper images with less post-processing and guesswork.
A crystal-clear ultra-wide image is truly immersive because of how closely it matches the human field of vision. While the iPhone’s Ultra-Wide lens still has some room for improvement (low-light shots are still grainy; the far corners of the image are still soft), ultra-wide lenses push against the very physical limits of capturing light. This year marked the second big step forward in a row.
De With closes with a great observation that puts it all into perspective (emphasis mine): “A few iPhone generations ago, this would have been a fantastic main camera, even when cropped down to the field of view of the regular camera.”
Speaking of the main camera…
Stop me if you’ve heard this before — the sensor behind this lens is significantly bigger, allowing for better low-light performance and cleaner shots.
But it doesn’t stop there. The main camera packs four times as many pixels as before: 48 megapixels instead of 12. It’s like Retina resolution for your camera. By default, the images it takes are still the same 12 MP size; it uses the extra pixels to take clearer shots with less noise.
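The arithmetic of pixel binning is easy to sketch. Here’s a toy illustration with made-up numbers (real binning happens on-sensor and interacts with the color filter array, so this only shows the resolution math): four subpixels average into one output pixel.

```python
# Toy 4x4 sensor readout standing in for the 48 MP quad-pixel array.
raw = [[r * 4 + c for c in range(4)] for r in range(4)]

def bin_2x2(grid):
    """Average each 2x2 block of subpixels into one output pixel."""
    h, w = len(grid), len(grid[0])
    return [
        [
            (grid[r][c] + grid[r][c + 1] + grid[r + 1][c] + grid[r + 1][c + 1]) / 4
            for c in range(0, w, 2)
        ]
        for r in range(0, h, 2)
    ]

# A 4x4 readout collapses into a 2x2 image: quarter the pixels, less noise.
binned = bin_2x2(raw)
```

Averaging four noisy readings cuts random noise roughly in half, which is why the default 12 MP shots come out cleaner, not just smaller.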
You can use all 48 MP if you shoot in ProRAW. This has a few caveats — huge file sizes, no automatic correcting for white balance or noise, slower pictures — but you can get some incredible detail you have to see to believe. “This camera can make beautiful photos, period, full stop. Photos that aren’t good for an iPhone. Photos that are great,” de With wrote.
It also uses the quad-pixel sensor for better dynamic range — when half of your image is in bright sunlight and the other half is dark and shadowed. Before, your phone had to (really quickly) take several different pictures at different exposures and merge them together; the iPhone 14 Pro can capture “every pixel at different levels of brightness to allow for an instant HDR capture.”
Not only will it take better pictures in mixed lighting conditions, but doing it in one shot instead of four or five will reduce blurring.
Another change: the main lens takes a slightly wider image than before, moving from a 26 mm to a 24 mm equivalent focal length. It’s a minor difference, and a matter of personal preference, but I agree with de With: I prefer the old, narrower field of view to the wider one.
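For a feel of what the 26 mm to 24 mm change means, the standard field-of-view formula (using the 36 mm width of a full-frame sensor, since these are 35 mm-equivalent focal lengths) puts the new lens roughly four degrees wider horizontally:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a 35 mm-equivalent focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

old = horizontal_fov_deg(26)  # about 69.4 degrees
new = horizontal_fov_deg(24)  # about 73.7 degrees
```

The shorter the equivalent focal length, the wider the view — which is exactly why a 2 mm change is noticeable in everyday framing.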
The return of the 2× lens… kind of
The 2× zoom makes its return! No, your phone doesn’t have a hidden fifth camera. This is another trick enabled by the quad-pixel sensor. It crops in on the sensor, using every subpixel to capture a full resolution image without physically zooming or digitally upscaling.
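The resolution arithmetic works out neatly. Assuming the commonly reported 8064 × 6048 full-resolution output (my numbers, not from de With’s review), cropping the central half-width and half-height gives a 2× field of view at a full 12 MP:

```python
# Assumed full-resolution dimensions of the 48 MP sensor (4:3 layout).
FULL_W, FULL_H = 8064, 6048

# A 2x "zoom" crops the central half-width and half-height of the array.
crop_w, crop_h = FULL_W // 2, FULL_H // 2

full_mp = FULL_W * FULL_H / 1e6   # about 48.8 MP
crop_mp = crop_w * crop_h / 1e6   # about 12.2 MP, with no upscaling
```

Halving each linear dimension halves the field of view, which doubles the effective focal length — exactly the 2× framing, without touching the optics.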
This is a wonderful feature for people like me who find the 2× lens absolutely indispensable. The 1× lens is too wide, and the 3× lens is too much zoom for everyday shots. I use the 2× lens on my iPhone 12 Pro more often than the 1× lens.
If you really know your photography, you’ll know that there are slight optical differences between cropping in on a 1× sensor and taking a full 2× shot. You won’t get quite the same shallow depth-of-field effect. But this may be only the first step: “This feels like laying the groundwork for a far longer, more extreme telephoto zoom in the iPhone future.”
The Telephoto camera
Last year, when the iPhone 13 Pro switched from a 2× to 3× lens, it came at the cost of a smaller sensor with worse low-light performance. Did the Telephoto lens this year follow the pattern of the other three?
Despite the virtually identical hardware, the pictures are still noticeably better, thanks to the A16 chip. According to de With, “While these two cameras ostensibly pack the same size sensor and exact same lens, the processing and image quality on the iPhone 14 Pro is simply leagues ahead,” resulting in more contrast and cleaner shots every time.
Shooting with the 3× lens is a game of its own. When you’re shooting with such a tight crop, you’re making a deliberate decision to capture a narrow detail of your perspective. What you leave out is just as important as what you leave in.
This is the camera lens you should pay attention to next year. I’m really curious to see where Apple takes it.
What about the processing?

A bigger part of the camera story, arguably just as important as the hardware itself, is the processing that happens afterwards. There isn’t a transparent metric to measure the amount of noise reduction or sharpening that an iPhone does — it changes with every device in opaque ways. We can only take pictures and see how they compare.
The iPhone XS was notorious for some egregious over-processing. Last year’s iPhone 13 Pro aggressively processed selfies in low light. In de With’s observations, it seems the iPhone 14 Pro is “even more hands-on” with post-processing.
That’s not always a bad thing. When combined with the monumental improvements to the hardware, as you’ve seen, the sharpening and noise reduction add up to create incredible pictures.
In some cases, however, this can erroneously remove details. In this extreme example, comparing the iPhone 13 Pro (left) to the iPhone 14 Pro (right), you can see that the newer phone entirely removes windows from a building.
Put it all together
Apple treats all of its separate cameras as one continuous system. One you can seamlessly zoom into, step away from, or get up close with, switching lenses totally invisibly.
The iPhone 14 Pro has limits. It’s fighting the laws of physics: its sensors are still comparatively small next to the full-size dedicated cameras of the world. But, as they say, the best camera is the one you have with you.
The hundreds of millions of people out there with an iPhone who’ve never touched a DSLR camera have less reason every year to fear missing out.