Facebook and Ray-Ban teamed up on a pair of smart glasses. The result is essentially a camera you wear on your face, a perfect example of what not to do with this type of product. It turns the wearer into a walking, talking privacy violation.
Apple is designing its own smart glasses. These had better not have a camera or they're dead on arrival.
Anyone remember Google Glass?
Google put smart glasses on the map almost a decade ago. There were plenty of problems with Google Glass, but all of them could have been fixed in later versions. The reason there weren't follow-up versions is that, as a society, we rejected letting people wander around pointing cameras in our faces all the time.
Apple does not want to fall into this hole.
We still demand some privacy
It’s not surprising Ray-Ban Stories were created in cooperation with Facebook, a company that profits from violating the privacy of everyone possible.
And that's the crux of the problem. As a society, we've given up a lot of privacy. Both Google and Facebook rake in huge profits by selling the private information of their users to advertisers. While we mostly accept that, the failure of Google Glass shows that letting someone take pictures or video of everyone around them, all the time, is too much.
And the opinions of non-users will determine the success of smart glasses. Usually, it's the number of buyers that makes the success/failure determination. But smart glasses are different. People won't let others wear a camera into their gym locker room. Or a changing room at a store. Co-workers and supervisors will object to having a camera constantly pointed at them in the workplace. Heck, even friends and family will probably find it creepy.
When Google Glass debuted in 2013, many businesses looked into banning its use on their premises because of the built-in camera. That included hospitals, movie theaters and casinos. Even the U.S. Congress did some preliminary investigation into whether these devices should be regulated.
So people can buy smart glasses with a built-in camera, but they won't be able to wear them much.
There are good alternative sensors for Apple smart glasses
It's crystal clear: If the team working on Apple's augmented reality glasses has a camera in its designs, it needs to go back to the drawing board.
This might make the job harder. A camera is a great sensor for discovering what's going on around the user. But it's a non-starter, even if it's only used to find the location of nearby objects. People absolutely will not accept, "Yes, I have a camera pointed at you, but I promise it's not taking pictures of you naked."
Fortunately, there are alternatives. Lidar would let Apple's smart glasses create a 3D map of the area around the wearer without violating anyone's privacy. Its resolution isn't fine enough to identify any specific person.
And that's only the start. GPS can pinpoint the wearer's location. Accelerometers and gyroscopes can determine motion and orientation. With all of these available, a camera isn't necessary.
Apple probably knows this already. The company fights (most of the time) to protect user privacy. It seems vanishingly unlikely that it'll make smart glasses that would immediately be branded a huge privacy violation.
Especially as this is a very important project for the company. CEO Tim Cook reportedly won't leave Apple until the AR glasses are released. It's understandable why: They could replace the iPhone and Apple Watch. They might even someday replace external monitors on all computers. But not if they come with a built-in camera.