For augmented reality maps to succeed, just add accuracy

New AR framework makes it easier to build mapping apps. Photo: Dent Reality

The augmented reality revolution sparked by Apple’s ARKit needs a dash of accuracy if it’s really going to catch fire — especially when it comes to mapping.

That’s why London developer Andrew Hart created a location-aware toolkit that uses artificial intelligence to punch up the precision in apps built with ARKit.

“One problem with AR location experiences is the low accuracy of phone GPS and compasses, which makes it difficult to ensure that things line up,” Hart told Cult of Mac. “The toolkit uses computer vision techniques to recognize landscapes from tagged imagery, and then aligns the AR environment upon recognition. It works in different weather conditions, and means you can have really precise experiences.”
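For developers wondering what that kind of image-based alignment might look like in practice, here is a minimal sketch built on ARKit's own image-detection feature rather than Dent Reality's toolkit: the session watches for pre-tagged reference photos of a landmark, and once one is recognized, its real-world transform can be used to snap nearby AR content into place. The `LandmarkAlignedViewController` class and the "Landmarks" asset group are hypothetical names used purely for illustration.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch: use ARKit's built-in image detection to re-anchor AR content
// when a known, pre-tagged landmark photo is recognized in the camera feed.
// This illustrates the general idea of correcting GPS/compass drift against
// recognized imagery; it is not Dent Reality's actual implementation.
final class LandmarkAlignedViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        // "Landmarks" is a hypothetical asset-catalog group of tagged reference photos.
        if let landmarks = ARReferenceImage.referenceImages(inGroupNamed: "Landmarks",
                                                            bundle: .main) {
            configuration.detectionImages = landmarks
        }
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the tagged images in the camera feed.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        // The image anchor carries a precise real-world transform for the landmark,
        // which can be used to snap previously placed AR content into alignment.
        alignContent(to: imageAnchor.transform)
    }

    private func alignContent(to landmarkTransform: simd_float4x4) {
        // Placeholder for the correction step: in a real app, existing nodes
        // would be shifted so they line up with the recognized landmark.
        let position = landmarkTransform.columns.3
        print("Recognized landmark at x:\(position.x) y:\(position.y) z:\(position.z)")
    }
}
```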

ARKit is one of Apple’s most hyped technologies in years. Apple CEO Tim Cook says barely any facet of our lives will remain “untouched” by augmented reality. And yet ARKit hasn’t exactly set the world ablaze. Analysts say that, while developers rushed to adopt ARKit following its September launch, the number of ARKit apps available today remains disappointing.

Augmented reality mapping with ARKit

Hart, who named his startup Dent Reality, thinks the future of augmented reality is mapping. Dent Reality’s AR Location Toolkit is an open-source framework that makes it simpler for developers to add AR experiences to their apps.

It’s built on top of Apple’s ARKit, and uses computer vision technology to identify the surrounding landscape. It allows developers to overlay interactive — and, more importantly, geographically precise — information about different points of interest.
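The core trick behind geographically precise overlays is converting a GPS coordinate into a position in ARKit's world space. Below is a minimal sketch of that conversion, assuming the session is aligned to the compass (`worldAlignment = .gravityAndHeading`, so +X points east, +Y up and -Z true north) and using an equirectangular approximation that holds over street-scale distances. The function names are illustrative, not the toolkit's actual API.

```swift
import ARKit
import CoreLocation
import SceneKit

// Minimal sketch of placing a node at a real-world coordinate, assuming the
// ARKit session is run with worldAlignment = .gravityAndHeading so that
// -Z points true north and +X points east. Illustrative only.
func position(of target: CLLocation, relativeTo user: CLLocation) -> SCNVector3 {
    let metersPerDegreeLatitude = 111_111.0
    let deltaLatitude = target.coordinate.latitude - user.coordinate.latitude
    let deltaLongitude = target.coordinate.longitude - user.coordinate.longitude

    // Equirectangular approximation: fine over the few hundred meters of a street scene.
    let north = deltaLatitude * metersPerDegreeLatitude
    let east = deltaLongitude * metersPerDegreeLatitude *
        cos(user.coordinate.latitude * .pi / 180)
    let up = target.altitude - user.altitude

    // ARKit world space with gravityAndHeading: +X east, +Y up, -Z north.
    return SCNVector3(x: Float(east), y: Float(up), z: Float(-north))
}

// Usage: drop a simple marker node at a point of interest.
func addMarker(for poi: CLLocation, userLocation: CLLocation, in sceneView: ARSCNView) {
    let marker = SCNNode(geometry: SCNSphere(radius: 0.5))
    marker.position = position(of: poi, relativeTo: userLocation)
    sceneView.scene.rootNode.addChildNode(marker)
}
```

Because this math inherits whatever error the GPS fix and compass reading carry, correction steps like the landscape recognition described above are what keep the overlay lined up with the real street.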

Regular AR apps focus on placing objects within the user’s immediate proximity (think of Ikea’s furniture preview app). But Hart’s AR Location Toolkit allows devs to create apps that augment skylines, or help users recognize and annotate objects within AR.

In demos of his technology, Hart’s toolkit draws on-screen arrows along the street to guide users to their destination. In another demonstration, similar arrows stretch toward the horizon to trace a proposed route. There are plenty of other possible use cases, too. Imagine pointing your iPhone at an ancient building and seeing a brief history of the structure pop up on the screen. Or seeing nutritional information floating in front of entrees listed on a restaurant menu.

(Developers can find out more about Hart’s AR Location Toolkit here. If you’re looking to add an augmented reality element into your next app, this is a great resource.)

Big limitations for AR maps right now

Hart, like Cook, thinks augmented reality will fundamentally change the future of apps. However, he acknowledges that AR mapping faces some big limitations right now — starting with the way people use their iPhones.

“Holding up your phone and pointing forward isn’t something people are comfortable with,” Hart said. “It’s awkward for social and physical reasons. As a result, you have to optimize for the use case where people are angling their phone downwards in front of them. You’ve also got to bear in mind that people are only going to be using this for a short time. People may take out their phone to look for some AR directions, but then they’re going to put their phone away again. You shouldn’t expect that people will have their phone out for the whole journey.”

Long-term, Apple is reportedly working on augmented reality glasses that could succeed where the ill-fated Google Glass project failed.

Hart said he expects AR glasses to become mainstream within about five years. The next-gen wearables will put a fantastically useful stream of data right in front of our eyes.

“People are going to look at this technology as being the next level of the internet,” Hart said. “You’ll be able to walk along the street and see a supermarket and immediately see its opening hours. Or you could pick up a book and immediately see information about it online, such as its price or reviews. Or, at the train station, you could see if you’ve got enough time to grab something to eat by glancing at a platform to see when the train will arrive and then how far away you are from a cafe. AR will make information like this far more accessible about the world around you.”

It’s still early days for AR

Many of the ARKit apps Hart has seen so far have been a bit “gimmicky” or downright “lackluster,” he said. But he chalks that up to where we are in the augmented reality adoption cycle rather than any shortcoming of the technology itself.

“We’re really in the early days of mobile AR, so people are still hitting the walls to find out what works and doesn’t work,” he said. “A lot of the use cases we’ve seen fall into the technology demo category. I’m not so interested in that. I want to use AR to present information in a way that can improve the user experience.”

Hart maintains that ARKit will be a game-changer for augmented reality, and that Apple is playing an “industry leading” role in its work. Augmented reality just needs a bit of a helping hand to live up to its potential.

He hopes Dent Reality’s AR Location Toolkit adds the kind of accuracy that can make augmented reality commonplace in the not-so-distant future.
