Apple invents laser-mapping tech for controlling your Mac

How gesture-tracking tech could work.
Photo: Reemo/YouTube

Future Apple devices could be controlled using gaze tracking or pointing gestures, according to a newly published patent describing Apple’s investigations into 3D depth-mapping technology.

The technology would build on the kind of 3D depth-mapping tech used in the new dual-lens iPhone 7, but would apply it to new ways for users to interact with their iMac or, possibly, Apple TV, using in-air gestures or eye tracking to navigate on-screen menus and content.

The patent gives a broad overview of how this technology might work: the system combines a 3D depth map of the user with a 2D image, then uses the result to work out who it is interacting with.
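The patent stops short of publishing an algorithm, so the following is only a rough, hypothetical sketch of that fusion step: given a depth map and a registered 2D image, find the person nearest the camera. The function name, thresholds, and weighting scheme here are my own assumptions, not Apple's method.

```python
import numpy as np

def locate_nearest_user(depth, image, max_range=2.0):
    """Return the centroid (row, col) of the nearest foreground blob,
    or None if nothing is within range.

    depth: 2D array of distances in metres (0 = no reading)
    image: 2D grayscale array, same shape as depth
    max_range: ignore anything farther than this (metres)
    """
    valid = (depth > 0) & (depth <= max_range)
    if not valid.any():
        return None
    # Treat everything within 0.3 m of the closest reading as one user.
    nearest = depth[valid].min()
    user_mask = valid & (depth <= nearest + 0.3)
    rows, cols = np.nonzero(user_mask)
    # Weight the centroid by 2D image brightness, so texture-rich
    # regions (a face, a raised hand) pull the estimate toward them.
    weights = image[rows, cols].astype(float) + 1e-6
    cy = float(np.average(rows, weights=weights))
    cx = float(np.average(cols, weights=weights))
    return (cy, cx)
```

A real system would track this blob frame to frame and hand its coordinates to a gesture classifier; this sketch only shows why both data sources are useful, with depth doing the segmenting and the 2D image refining the estimate.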

Being a patent, some of the specific use cases Apple describes are pretty broad. For example, it’s not entirely clear how the laser-mapping system and the gesture recognition would work with one another. There are suggestions, however, such as the idea that a user could scroll through a webpage using gaze tracking, or select options on a screen by pointing.
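The gaze-scrolling suggestion is easy to imagine concretely, even though the patent gives no algorithm. A minimal, hypothetical sketch: map the vertical gaze position to a scroll velocity, with a dead zone around the centre of the screen so ordinary reading doesn't move the page. All parameter names and values below are illustrative assumptions.

```python
def gaze_scroll_velocity(gaze_y, dead_zone=0.2, max_speed=100.0):
    """Map a normalized vertical gaze position to a scroll speed.

    gaze_y: 0.0 (top of screen) .. 1.0 (bottom of screen)
    dead_zone: half-width of the central no-scroll band
    Returns lines per second to scroll (negative = scroll up).
    """
    offset = gaze_y - 0.5            # signed distance from screen centre
    if abs(offset) <= dead_zone:
        return 0.0                   # looking near the middle: hold still
    # Scale the remaining range linearly up to max_speed at the edges.
    sign = 1.0 if offset > 0 else -1.0
    magnitude = (abs(offset) - dead_zone) / (0.5 - dead_zone)
    return sign * magnitude * max_speed
```

The dead zone is the interesting design choice: without it, any interface that scrolls wherever you look becomes unusable, because looking at content is indistinguishable from asking for more of it.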

Whether this is something Apple will wind up using as anything more than an R&D testing concept remains to be seen, but — as noted — similar tech is already showing up on the iPhone. As seen with technology like Siri and 3D/Force Touch, Apple has certainly shown its willingness to introduce certain technologies on one device and then expand their functionality or use-cases by bringing them to others.

Apple is also no stranger to this area, having been working on it for a while now: its 3D head-tracking patents first came to light back in 2009. The company took a big step forward in late 2013, when it acquired PrimeSense, the Israel-based company behind the 3D motion tracking in the original Xbox Kinect.

Personally, I think this is very interesting tech. I can see it fitting in particularly well as an accessibility feature for users who are unable to operate a mouse or trackpad, especially in conjunction with Apple’s much-improved speech recognition technology.

Are you excited about the possibilities of this interface element? Leave your comments below.

Source: USPTO

Via: Patently Apple