iOS 13 uses ARKit to solve one of the biggest FaceTime complaints | Cult of Mac

2018 iPad Pro Animoji
The eye-line problem is finally fixed. If you own an iPhone XS or XS Max, that is!
Photo: Apple

There’s something weirdly off-putting about the eye-contact problem with video calling services like FaceTime and Skype.

It happens because users must choose between looking directly at the camera lens (and missing what’s happening on screen) or looking at the screen (and appearing to stare at the listener’s neck).

That’s not ideal for a tool that’s meant to make it seem like you’re having a face-to-face conversation. Fortunately, Apple fixes this shortcoming in iOS 13.

FaceTime Attention Correction in iOS 13

The new FaceTime Attention Correction feature is present in the latest iOS 13 developer beta. It’s only available for iPhone XS and XS Max devices, but it’s nonetheless appreciated.

To pull off this eye-contact trick, Apple employs its vaunted augmented reality platform, ARKit. In essence, it maps a user’s face, then changes the position of their eyes in real time during FaceTime video calls.

In other words, you’re seeing a doctored image designed to look more natural. It’s totally fake, but it shouldn’t be overly noticeable for the majority of users.
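Apple hasn’t documented how FaceTime performs the correction, but ARKit’s public face-tracking API already exposes the raw ingredients: per-eye transforms and a gaze estimate from `ARFaceAnchor` (available since iOS 12 on TrueDepth devices). A minimal sketch of reading that data — the warp step at the end is purely hypothetical:

```swift
import ARKit

// Sketch of the face data ARKit exposes that could drive an eye-line
// correction. This only demonstrates the public ARFaceAnchor API;
// FaceTime's actual pipeline is not documented.
class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Position and orientation of each eye, relative to the face anchor.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            // Estimated point the user is looking at, in face-anchor space.
            let gaze = face.lookAtPoint
            // A hypothetical correction step would re-render the eye regions
            // so the gaze vector points at the camera rather than the screen.
            print(leftEye, rightEye, gaze)
        }
    }
}
```

Because the eye transforms update every frame, a warp like this can track the user’s gaze in real time, which matches how the feature behaves in the beta.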

For more information on the other great features found in the latest iOS 13 beta, check out the roundup by my Cult of Mac buddy Charlie.

Via: The Verge