Apple Glass headset could sense users’ physiological states

By

Jon Prosser
What will Apple Glass have to offer?
Photo: Jon Prosser/Front Page Tech

Apple Watch already features heart rate-tracking technology. But Apple may go even further with its biometric-reading tech — and it could be used to make an eventual Apple Glass head-up display more reactive to users in the process.

Published Thursday, a new Apple patent application describes how the company could turn information such as temperature and brainwave readings into an assessment of the “physiological condition of the user.”

That means Apple is interested in Apple Glass being able to figure out how a user is feeling. The headset could then change content accordingly. The patent application, shared by AppleInsider, notes that:

“[One] or more physiological sensors may be configured to sense physiological conditions in the facial engagement region, which may include force, temperature, moisture, displacement, capacitance, brain activity (e.g., EEG), muscle activity (e.g., via force sensors and/or electromyography (EMG)), and/or heart rate.”

This information could be used in various ways. One might be wellness. For instance, it could point a person toward a hospital if they have an elevated heart rate. It’s equally possible to imagine it determining stress levels and advising you on how to calm down. Apple also suggests it could have applications for tools like ResearchKit, carrying out “multi-person health studies” by, say, showing users different content and then gauging their reaction to it. Apple also mentions “yet-to-be determined uses.” One of the most obvious of these would be gaming — with content that could change based on the physiological response of players.
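To make the idea concrete, here is a purely illustrative sketch of how software might map physiological readings to a content decision. Nothing here comes from Apple’s filing — the function name, inputs and thresholds are all made up for demonstration.

```python
# Toy example: map hypothetical sensor readings to a content-adjustment
# decision, loosely inspired by the patent's examples. All names and
# thresholds are invented for illustration, not taken from the filing.

def adjust_content(heart_rate_bpm: float, stress_score: float) -> str:
    """Pick a content mode from two made-up physiological inputs."""
    if heart_rate_bpm > 120:       # sustained elevated heart rate
        return "suggest-break"     # e.g. prompt the wearer to rest
    if stress_score > 0.7:         # high inferred stress (0-1 scale)
        return "calming"           # e.g. dim visuals, slow the pacing
    return "normal"                # no change to content

print(adjust_content(80, 0.2))   # -> normal
print(adjust_content(130, 0.2))  # -> suggest-break
print(adjust_content(90, 0.9))   # -> calming
```

A real headset would of course fuse many noisy signals (EEG, EMG, temperature, heart rate) over time rather than apply simple thresholds, but the basic shape — sensor readings in, content decision out — is the same.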

AR “Apple Glass” headset is coming. But when?

Apple has been rumored to be working on an AR headset of some kind for several years. In May, Apple leaker Jon Prosser released a video with alleged details about the Apple Glass project. Prosser claimed that Apple’s AR glasses would not feature a front-facing camera, but would sport a lidar sensor for scanning. He also said the glasses would display information inside both lenses, work via gesture controls, and could possibly be shown off by the end of 2020.

But veteran Apple reporter Mark Gurman has pushed back on those rumors. According to Gurman, Apple is currently working on two devices. One is a project code-named N301. This will allegedly combine the “best of” VR and AR in a headset capable of overlaying AR images. The second device, code-named N421, is “a lightweight pair of glasses using AR only.” In a June Bloomberg report, Gurman wrote that Apple could announce the first headset next year and release it in 2022. Meanwhile, Apple’s AR glasses will arrive “by 2023” at the earliest, according to Gurman.

Tech’s interest in emotion-sniffing

This is the first time I’ve heard about a possible Apple Glass containing a physiological state-assessing element. Aside from Prosser’s comment about a lidar sensor, there haven’t been many reports on the sensor set expected for Apple Glass. Once that is reported, it will be easier to assess possible use cases.

Apple’s far from alone in exploring this territory, however. A number of tech companies large and small are interested in “emotion-sniffing” technology. Done right, the idea that a device could sense your mood and make recommendations accordingly sounds almost like magic.

Will Apple take this technology forward and end up releasing it? There’s no guarantee with any of its patents — which is why I’m still waiting for the gaming joystick that pops out of the iPhone’s (now disappeared) Home button. But it’s certainly fascinating research.

You can read Apple’s patent application here.