ARKit 3.5, the new version of Apple’s augmented reality software, can capture a 3D representation of the world in real time. This feature employs the LiDAR scanner that’s already in the 2020 iPad Pro and expected in some of this autumn’s iPhone models.
And the version of ARKit that debuted yesterday in iOS 13.4 is better at allowing virtual objects to pass in front of and behind people in the scene.
ARKit 3.5 adds Scene Geometry
Augmented reality involves placing virtual objects in the real world, as long as the scene is viewed through a phone or tablet screen. It’s important for these virtual objects to be placed realistically: on top of a table, say, not embedded in it.
With this in mind, a highlight of ARKit 3.5 is a Scene Geometry API. According to Apple, this “lets you create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats.”
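In code, adopting Scene Geometry amounts to enabling scene reconstruction on a world-tracking session. A minimal sketch, assuming a LiDAR-equipped device and an `arView` (a RealityKit `ARView`) already on screen:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction only works on LiDAR hardware, so check support first.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
    configuration.sceneReconstruction = .meshWithClassification
}

arView.session.run(configuration)

// ARKit then delivers ARMeshAnchor objects whose mesh faces carry
// classifications such as .floor, .wall, .ceiling, .window, .door, and .seat.
```

The `.meshWithClassification` option requests both the mesh itself and the per-face labels Apple describes; plain `.mesh` skips the labels.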
And this new API enables additional capabilities, like improved people occlusion. Apple says “AR content realistically passes behind and in front of people in the real world, making AR experiences more immersive while also enabling green screen-style effects in almost any environment.”
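People occlusion is exposed as a frame semantic on a world-tracking session. A short sketch, assuming a `session` variable holding the app’s `ARSession`:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Person segmentation with depth is only available on capable hardware,
// so gate it behind a support check.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

session.run(configuration)
```

With the semantic enabled, ARKit composites virtual content behind any people it detects in the camera feed.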
Apple’s software also now offers motion capture. It can scan someone and understand their body position and movement as a series of joints and bones.
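Motion capture uses its own session configuration, and tracked people arrive as body anchors in the session delegate. A sketch, assuming a `session` variable and an object conforming to `ARSessionDelegate`:

```swift
import ARKit

// Body tracking has a dedicated configuration, separate from world tracking.
let configuration = ARBodyTrackingConfiguration()
session.run(configuration)

// In the ARSessionDelegate, each tracked person appears as an ARBodyAnchor
// whose skeleton exposes a transform per joint.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        let skeleton = bodyAnchor.skeleton
        if let headTransform = skeleton.modelTransform(for: .head) {
            // Use the joint transform to drive a rigged character,
            // record motion, and so on.
            _ = headTransform
        }
    }
}
```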
Enhanced people occlusion and motion capture require the LiDAR scanner in the 2020 iPad Pro that debuts today. But reported leaks from Apple’s supply chain indicate that some of the 2020 iPhone models will also have this 3D scanner.
ARKit 3.5 allows an iPhone or iPad to use the front and rear cameras at the same time. So, for example, an augmented-reality game could be controlled entirely by movements of the player’s face.
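The simultaneous-camera setup is expressed by folding front-camera face tracking into a rear-camera world-tracking session. A sketch, assuming a `session` variable holding the app’s `ARSession`:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Combine face tracking (front camera) with world tracking (rear camera),
// where the hardware supports it.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}

session.run(configuration)

// ARFaceAnchor updates describing the player's expression then arrive
// alongside the rear camera's world-tracking data — enough to steer a
// game with facial movement alone.
```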
The update also allows two Apple devices running augmented-reality applications to work together to build an accurate map of the surrounding area.
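Multi-device mapping is handled by collaborative sessions: each device streams its map updates to peers, and the transport between devices is an app-level choice, not part of ARKit. A sketch, assuming a `session` variable and a delegate method wired up; the peer-to-peer send is left abstract:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Each device packages its map updates as ARSession.CollaborationData.
configuration.isCollaborationEnabled = true
session.run(configuration)

// The session delegate hands outgoing collaboration data to the app,
// which forwards it to the other device (e.g. over MultipeerConnectivity).
func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
    // Send `data` to the peer device over the app's chosen transport.
}

// Data received from a peer is fed back into the local session:
//     session.update(with: receivedData)
```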
Installing iOS 13.4 or iPadOS 13.4 is all that’s required to get ARKit 3.5; it’s embedded in Apple’s mobile operating systems.