Interacting with an AR headset would be a breeze if it could project virtual controls into the real world. Apple developed a new method for using cameras to accurately track finger movements, enabling someone to toggle these augmented reality switches and buttons.
Wearable cameras closely track your hand movements
Apple today received a patent for “Depth-Based Touch Detection.” It describes a method to “utilize a depth map to identify an object and a surface, and a classifier to determine when the object is touching the surface.” The innovation is that it measures the distance from the user’s hand to the surface itself; previous approaches tracked the distance between the hand and the camera creating the depth map.
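To see why hand-to-surface distance matters, here is a minimal sketch of the idea in Python. It is hypothetical and not Apple’s implementation: the surface estimate, the fingertip pixel, and the 10 mm threshold are all assumptions for illustration. The point is that the touch decision compares the fingertip’s depth against the surface’s depth, not against the camera.

```python
# Hypothetical sketch of depth-based touch detection (not Apple's actual
# method): classify a touch by comparing fingertip depth to the local
# surface depth, rather than raw distance from the camera.
import numpy as np

TOUCH_THRESHOLD_MM = 10.0  # assumed hover distance that counts as a touch


def estimate_surface_depth(depth_map: np.ndarray) -> float:
    """Approximate the surface as the dominant (median) depth in the map."""
    return float(np.median(depth_map))


def is_touching(depth_map: np.ndarray, fingertip_px: tuple) -> bool:
    """Return True when the fingertip is within the threshold of the surface."""
    surface_depth = estimate_surface_depth(depth_map)
    fingertip_depth = float(depth_map[fingertip_px])
    # Distance from finger to surface, not from finger to camera.
    return abs(surface_depth - fingertip_depth) < TOUCH_THRESHOLD_MM


# Toy example: a flat surface 500 mm away, fingertip pixel 5 mm above it.
depth = np.full((240, 320), 500.0)
depth[120, 160] = 495.0  # fingertip hovering 5 mm above the surface
print(is_touching(depth, (120, 160)))  # prints True
```

In a real system, the “classifier” the patent mentions would replace the simple threshold, and the surface would be fit more carefully than a median, but the hand-to-surface comparison is the key shift the patent describes.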
Apple’s tech could allow a pair of AR glasses to virtually project its Settings screens onto a table or other surface. The user could then toggle switches or move sliders with their fingers. A full keyboard is also theoretically possible.
And, of course, knowing exactly where the user’s hands and fingers are would make for easier interactions with virtual objects created by the AR glasses.
Apple hard at work on an AR headset
There is ample evidence that Apple is developing an AR headset, even if there’s been no official announcement. Most recently, code buried in iOS 13 points directly to that type of device.
Unconfirmed reports give more details, and indicate potential buyers will have to wait years. A recent leak suggests Apple will release its first AR headset in 2022, followed by a smaller device, such as AR glasses, in 2023.