Infrared light. We can’t see it, but it surrounds us, permeating everything… especially our digital camera sensors, leading to images filled with skewed, unnatural colors.
With the iPhone 4S, Apple introduced an infrared filter to improve color quality in its images. What are the practical effects of this filter? Much more accurate color and the elimination of the reddish tint that plagues so many iPhone photos.
Over at Camera Technica, there is a great overview of how infrared affects image quality, and why Apple’s IR filter leads to much higher-quality photographs. Here’s an excerpt:
When IR light is allowed to pass through to the sensor, the IR light contaminates the channels (mostly the red channel) with information that was not visible in the original scene. The result is an image with a color cast. The images below show the impact of IR contamination on a scene with a dark background. The iPhone 4 image shows a reddish cast due to the extra IR light recorded on the red channel. The 4S image shows a background which much more closely resembles the original scene.
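The effect described in the excerpt is easy to sketch numerically. The snippet below is a hypothetical toy model, not Apple’s or Camera Technica’s actual math: the IR leakage amount and the per-channel sensitivities are made-up illustrative values, chosen only to show how extra energy recorded mostly on the red channel shifts a neutral scene toward red.

```python
import numpy as np

# Model a neutral dark background as equal RGB values in [0, 1].
scene = np.full((4, 4, 3), 0.10)

# Assumed values: how much IR energy reaches the sensor, and how
# strongly each channel (R, G, B) responds to it. Red dominates.
ir_leak = 0.06
sensitivity = np.array([1.0, 0.2, 0.1])

# Without an IR-cut filter, the leaked energy adds to every pixel.
captured = np.clip(scene + ir_leak * sensitivity, 0.0, 1.0)

# The red channel now exceeds green and blue: a reddish color cast.
print(captured[0, 0])
```

Even a small leak shifts the channel balance everywhere in the frame, which is why the cast is most obvious on dark, neutral backgrounds like the one in Camera Technica’s comparison.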
This sort of filter is especially important for the iPhone 4S, because there’s no other way to accurately eliminate the IR contamination. On SLR cameras, you can muck around with RAW files to fix things up, but on an iPhone, everything’s processed into JPEGs, meaning all adjustments need to happen in-camera.
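To see why post-hoc correction is a weaker option, here is a sketch of the kind of fix you could attempt on an already-baked JPEG: a simple gray-world white balance, which rescales each channel so its mean matches the overall mean. This is a generic technique chosen for illustration, not anything Apple does; it operates on clipped, processed values, so it can only approximate what a RAW workflow (or an in-camera fix) can achieve.

```python
import numpy as np

def gray_world_balance(image: np.ndarray) -> np.ndarray:
    """Rescale the channels of a float RGB image (values in [0, 1])
    so each channel's mean matches the image-wide mean."""
    means = image.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(image * gains, 0.0, 1.0)

# A uniform patch with a red cast (red inflated relative to green/blue).
cast = np.ones((4, 4, 3)) * np.array([0.16, 0.112, 0.106])
balanced = gray_world_balance(cast)

# The channels are pulled back toward a neutral gray.
print(balanced[0, 0])
```

On a uniform patch this fully neutralizes the cast, but on a real photo it shifts every pixel by the same global gains, so detail lost to clipping or compression can’t be recovered. That’s the gap an IR-cut filter closes by keeping the contamination out of the sensor in the first place.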
Just another example of Apple putting a lot of thought into the small details no one else ever notices.