Did Apple Just Buy Eyes for Siri?


Apple agreed this week to buy Israel-based PrimeSense for $350 million.

PrimeSense is best known for making the 3D motion-tracking technology inside Microsoft’s Kinect.

Does this mean Apple plans to make its own Kinect? Maybe. But I think Apple may be thinking about something far more interesting.

All the standard caveats apply to Apple’s PrimeSense acquisition.

The fact that Apple buys a company tells you very little about its future products. Sometimes companies buy other companies as a way to hire a team of talented people. Sometimes they want the patents. Sometimes they simply want to eliminate a competitor.

Still, Apple has a long history of basing major new user interface initiatives on an acquisition.

For example, Apple’s iOS user interface was based in part on technology from FingerWorks, which Apple acquired two years before the iPhone hit the market. Siri was acquired in 2010 before being baked right into the iPhone as a core interface.

Because of its association with Microsoft’s Kinect product, it’s easy to assume Apple’s acquisition of PrimeSense will lead to a new in-the-air gesture interface for Apple devices, especially TVs and possibly tablets and phones.

But PrimeSense brings powerful capabilities to Apple for all kinds of things, from Kinect-like gesture interfaces and gaming to indoor mapping for Apple Maps.

In fact, PrimeSense technology can be, or already has been, used by the company’s small group of partners for 3D-printing object scanners, robotics, augmented reality, healthcare and, yes, Kinect-style in-the-air gesture interfaces.

But Apple is not the kind of company that jumps promiscuously into this or that new business. It’s focused on a narrow range of very mainstream businesses that mostly involve content creation, content consumption and all-purpose computing and communication.

One of the coolest and most powerful things Apple could do with PrimeSense technology isn’t a new interface, but a new way for Siri to understand you and your environment in order to offer more relevant help.

The idea that PrimeSense 3D sensors could be used to give vision to Siri has been floated recently, including by Mark Hachman, Shel Israel and others, but remains largely unexplored.

However, if you look at what PrimeSense sensors are capable of — especially PrimeSense’s low-power system-on-a-chip, branded Capri — it’s easy to imagine what Apple could and should do to enhance Siri in mind-blowing ways.

Any Kinect user knows that PrimeSense uses what look like multiple “cameras” pointed at the user, which leads some to falsely assume that it works like human 3D vision.

Humans create 3D in the brain by feeding it two “video feeds” of the same scene from two slightly different vantage points. Vision is the hardware to detect light and convert it into electrical impulses, plus the wetware to make sense of it.
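
To see why two slightly offset views encode depth, here’s a minimal sketch of textbook stereo triangulation. The focal length, baseline and disparity values are made-up examples, not anything PrimeSense-specific:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo: depth = focal length * baseline / disparity.
    The farther an object, the smaller the shift (disparity) between views."""
    return focal_px * baseline_m / disparity_px

# A feature shifted 50 px between two cameras 6.5 cm apart,
# imaged at a 600 px focal length, sits roughly 0.78 m away.
print(stereo_depth(600.0, 0.065, 50.0))  # 0.78
```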

PrimeSense is based on a patented technology trademarked as “Light Coding,” which the company claims enables “sub millimeter accuracy for 3D scanning.” Light Coding bathes a room, person or set of objects in near-infrared light, just outside the range of human vision, then detects the reflected light using a regular, off-the-shelf CMOS sensor. A parallel computational algorithm then analyzes the returned data to figure out exactly how far every visible surface is from the sensor, yielding accurate 3D. That data is then merged with color data, providing an enhanced ability to differentiate objects from one another. And not just objects: PrimeSense sensors can tell the difference between a floor, a wall, a ceiling and a door.
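
Whatever the proprietary details of Light Coding, the final step — turning per-pixel depth readings into 3D geometry — is standard. Here’s a minimal back-projection sketch assuming a pinhole camera model; the intrinsics are hypothetical placeholders, not Capri’s actual values:

```python
import numpy as np

def depth_to_points(depth_m, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into camera-space 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1)  # shape (h, w, 3) point cloud

# Toy 2x2 depth map; fx, fy, cx, cy are made-up example intrinsics.
depth = np.array([[1.0, 1.0],
                  [2.0, 2.0]])
cloud = depth_to_points(depth, fx=525.0, fy=525.0, cx=0.5, cy=0.5)
print(cloud.shape)  # (2, 2, 3)
```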

PrimeSense’s newish Capri product can be understood as Microsoft Kinect’s sensor, but smaller, cheaper, faster and higher-resolution. It’s small and cheap enough to be built into a smartphone without significantly raising the price.

In fact, it’s not hard to imagine Capri-like future sensors being built into both the front and back of iPhones and iPads, and into the fronts of a future iTV and future iMacs and MacBooks, mostly for the benefit of Siri’s contextual awareness.

When you hold up your phone and talk to Siri, it could learn about you as well as your surroundings.

For example, PrimeSense technology could enable Siri to:

* Know your location in the house. By looking out through your iTV, iMac, iPhone or iPad and seeing both you and your surroundings, Siri could know where you are in the house — or if you’re not at home.

* Know who you’re with. Siri could know if you’re alone or with other people. If you’re alone, Siri might talk to you aloud. If you’re in a restaurant, Siri might keep quiet and display its information on screen instead.

* Read your emotions. By scanning your facial expressions, Siri could improve its understanding of what you say, and also gauge how well it’s doing. If a future version of Siri gains Google Now-like pre-emptive suggestions (and it probably will), Siri could learn whether various suggestions are delighting or frustrating you.

* Detect what you’re paying attention to. Siri could become far more polite and sensitive by interacting with you when you’re paying attention, and not interrupting you while you’re focusing on something or someone else. It could also tell what you’re doing — sleeping, eating, listening with earbuds, exercising, etc. — and interact with you accordingly.

* Guess what size clothes you wear. You could simply order clothing without knowing your exact size.

* Keep track of the sizes of your growing children. As your kids walked from time to time in front of your iMac or TV, PrimeSense technology could recognize your kids and monitor their height and clothing sizes. At any time, you could simply ask Siri: “How tall is Jason now?”

* Tell you the sizes of things. By pointing your phone at an appliance or piece of furniture in a store, Siri could tell you whether it would fit in your home, having already seen and measured all the spaces. (A toy sketch of this fit check follows the list.)
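
As a toy illustration of that last fit check, with hypothetical dimensions and nothing PrimeSense-specific: once both the item and the space have been measured, the question reduces to whether the item fits under some orientation.

```python
from itertools import permutations

def fits(item_dims, space_dims):
    """True if an item of the given (w, d, h) dimensions fits inside the
    measured space under some axis-aligned rotation."""
    return any(all(i <= s for i, s in zip(rot, space_dims))
               for rot in permutations(item_dims))

# A 0.9 x 0.6 x 1.8 m wardrobe vs. a measured 2.0 x 1.0 x 0.7 m alcove:
# it fits, but only laid on its side.
print(fits((0.9, 0.6, 1.8), (2.0, 1.0, 0.7)))  # True
```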

Virtual assistants like Siri will get better over time mostly by getting better at the context part — knowing your situation and circumstances and helping you accordingly.

In order for Siri to truly understand what’s going on with you and your surroundings, it’s going to need to look around and have some basic understanding about what it sees. PrimeSense technology is really good at that.

I think Apple may have just bought Siri a really great set of eyes.
