There’s something weirdly off-putting about the eye-contact problem with video calling services like FaceTime and Skype.
It happens because users must choose: either look directly at the camera lens and miss what’s happening on screen, or watch the screen and appear to be staring at the other person’s neck.
That’s not ideal for a tool that’s meant to make it seem like you’re having a face-to-face conversation. Fortunately, Apple fixes this shortcoming in iOS 13.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
FaceTime Attention Correction in iOS 13
The new FaceTime Attention Correction feature shows up in the latest iOS 13 developer beta. For now it’s limited to the iPhone XS and iPhone XS Max, but it’s a welcome addition nonetheless.
To pull off this eye-contact trick, Apple employs its vaunted augmented reality platform, ARKit. In essence, it maps a user’s face, then changes the position of their eyes in real time during FaceTime video calls.
In other words, you’re seeing a doctored image designed to look more natural. It’s totally fake, but it shouldn’t be overly noticeable for the majority of users.
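Apple hasn’t published how the effect is built, but ARKit’s face tracking already exposes the pieces the tweet above describes. Here’s a minimal Swift sketch, not Apple’s actual code: it assumes a TrueDepth-equipped iPhone and simply reads the face mesh and per-eye transforms that a hypothetical warping step could use to nudge the eyes toward the camera.

```swift
import ARKit

// Illustrative sketch only: gathers the face-tracking data an eye-contact
// warp could rely on. The warping/rendering step itself is not shown.
final class EyeTrackingSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking needs the TrueDepth camera (e.g. iPhone XS / XS Max).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Position and orientation of each eye, relative to the face anchor.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            // The face mesh supplies the depth/shape information a renderer
            // could use to warp just the eye region of the video frame.
            let mesh = face.geometry
            // ...hand leftEye, rightEye, and mesh to a (hypothetical) warp pass.
            _ = (leftEye, rightEye, mesh)
        }
    }
}
```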
For more information on the other great features found in the latest iOS 13 beta, check out the roundup by my Cult of Mac buddy Charlie.
Via: The Verge