Full-body Animoji? Smart tech uses iPhone camera to track body motion


The future of Animoji?
Photo: Carnegie Mellon

Forget animated avatar faces alone: a team at Carnegie Mellon University wants to bring the world full-body Animoji. As director of the Future Interfaces Group at Carnegie Mellon's Human-Computer Interaction Institute, Chris Harrison's job is to help create the computer features of tomorrow.

In a demo published this week, Harrison's team shows how regular iPhones can do full-body tracking using only the sensors already built into the phone — by estimating what the rest of your body is doing.

It works surprisingly well.


“The smartphone has no way to see the whole body, so instead we have to be clever, combining several sources of sensor data to create an accurate estimate,” Harrison told Cult of Mac. “More specifically, we fuse data from front and back cameras, the user-facing depth camera — what Apple calls a TrueDepth camera — the inertial measurement unit (IMU), and the touchscreen. These different sensors provide different clues as to how the body is posed. For example, [they can tell us] where the user’s arms are, or if they are walking.”

Called "Pose-on-the-Go," the system could give iOS developers the ability to build apps around full-body pose. Rather than requiring a movie-style motion-capture suit, Pose-on-the-Go needs nothing more than the sensors already fitted into today's iPhones. Smart algorithms fuse this information together, and the results can accurately estimate the user's pose — even in motion.
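Pose-on-the-Go's actual algorithms aren't public, but the kind of sensor fusion Harrison describes — blending clues from several noisy sources into one estimate — can be sketched with a simple complementary filter. Everything here (the function name, the torso-pitch example, the 0.8 weighting) is an illustrative assumption, not the researchers' method:

```python
# Illustrative sketch only: Pose-on-the-Go's real fusion pipeline is not public.
# A complementary filter is one basic way to blend two noisy estimates of the
# same quantity -- here, a hypothetical torso pitch angle in degrees.

def fuse_pitch(imu_pitch: float, camera_pitch: float, alpha: float = 0.8) -> float:
    """Blend a fast-but-drifty IMU reading with a slower, steadier
    camera-derived reading. alpha weights the IMU's contribution."""
    return alpha * imu_pitch + (1.0 - alpha) * camera_pitch

# Example: the IMU reports a 12-degree pitch, the camera estimates 8 degrees.
fused = fuse_pitch(12.0, 8.0)
print(round(fused, 2))  # 11.2
```

Real systems typically use far more sophisticated estimators (e.g., Kalman filters or learned models) and fuse many joints at once, but the principle — weighting each sensor by how much you trust it — is the same.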

Body pose
The future of gaming on the iPhone?
Photo: Carnegie Mellon University

From gaming to full-body Animoji

“It’s still an estimate, and the software can be tricked, but overall it’s pretty good,” Harrison said. “We benchmarked our system against a Hollywood-grade motion capture system, and across all joints — head, arms, legs — the average error was [just] 20.9cm.”

In one impressive demo, the team shows a person jogging up to a wall and then ducking down. The tracking doesn't skip a beat.

“I think there are lots of uses, from exercise apps that count your reps and give you quality feedback, to full-body gaming experiences where you have to run, jump and duck,” Harrison continued. “One of the applications we made for ourselves was a fantasy game where you could use your hands to cast spells. Another cute demo was full-body Animoji, where users could wave, dance and walk around.”

A killer app for a future iPhone?

Any chance this gets baked into a future iPhone? There's no word on that, and, for now, this is very much a proof of concept. However, the Future Interfaces Group is one of the research labs that top Silicon Valley companies keep an eye on. Harrison's former students have gone on to work at some of the big tech giants, Apple included.

But even if Apple doesn’t officially adopt this tech, that doesn’t mean you won’t see this on a future iPhone app. “We are planning an open source release for much of our code,” Harrison said. “Given this is all software — no new hardware or dongles needed — this could be enabled on many modern smartphones as a software update.”

The researchers presented Pose-on-the-Go at this week’s ACM Conference on Human Factors in Computing Systems. You can read a paper describing the work here.

Do you have any ideas for how you’d like to see this technology used? Let us know your thoughts in the comments below.


