In another piece of possible evidence that this year’s iPhone 12 handsets will feature ToF (time-of-flight) camera sensors, a report published Monday claims that component makers in the Apple supply chain are gearing up for mass production.
Time-of-flight camera sensors typically work by emitting a laser pulse that bounces off objects. By measuring how long the light takes to reach an object and return to the sensor, the device can work out how far away that object is. This makes it possible to create detailed 3D maps of spaces. That could help Apple both improve its iPhone camera tech and advance its augmented reality ambitions.
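The distance calculation described above is simple in principle: the object is half the round-trip distance away. As a minimal sketch of that idea (an illustration of the general technique, not Apple's implementation):

```python
# Time-of-flight principle: distance equals half the round-trip time
# multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 10 nanoseconds implies an object
# roughly 1.5 meters away.
print(round(distance_from_round_trip(10e-9), 3))  # → 1.499
```

Doing this per pixel across the sensor is what yields a depth map of the scene, since the timing alone, not the object's appearance, determines each distance reading.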
“[Wafer] foundry Win Semiconductors reportedly has landed orders for processing VCSEL chips to support 3D sensors and ToF (time of flight) camera solutions for the new iPhone devices, with testing solution providers Elite Advanced Laser and Chroma Ate developing inspection tools for examining those VCSEL components, the source said.”
VCSEL stands for vertical-cavity surface-emitting laser. It is the tech behind the iPhone’s Face ID, Animoji, and Portrait mode selfies, all of which depend on 3D depth-sensing.
In the case of the iPhone 12, Apple is heavily rumored to be introducing rear time-of-flight sensors. These will help the iPhone make advances in the field of augmented reality. They could also be useful for next-gen photo-taking tech.
Time-of-flight sensors for the iPhone
Rumors of time-of-flight sensors date back to 2019 and have not wavered since. Only the iPhone 12 Pro handsets are expected to include the sensors. The report claimed that there is likely to be “robust demand” for the ToF components.
Alongside the new rear ToF sensors, the new iPhones are reported to boast a redesigned form factor, 5nm A-series chip, other upgraded internals, and Apple’s first 5G compatibility.