Turning the Apple Vision Pro headset from an announcement into a shipping product took a big step forward Wednesday with the release of the visionOS SDK. The SDK includes the software tools developers will use to write applications for the AR headset Apple unveiled earlier this month.
Apple also said it will open developer labs around the world soon, giving coders a chance for some hands-on time with Vision Pro, which won’t launch until 2024.
Apple visionOS SDK released to developers
The headset is Apple’s first foray into what the company calls “spatial computing.” Vision Pro focuses on augmented reality, which overlays computer-generated content onto the real world. Rivals like the Meta Quest 3 look similar but focus on virtual reality, which completely replaces the real world. Plus, Apple’s upcoming headset offers features that set it apart, like a front-facing screen that can show the wearer’s face.
Now that the visionOS SDK is available, developers can begin writing the third-party applications that will likely prove important to the headset’s success — or failure.
“Apple Vision Pro redefines what’s possible on a computing platform,” said Susan Prescott, Apple’s vice president of worldwide developer relations, in a statement Wednesday. “Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users. By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment. We can’t wait to see what our developer community dreams up.”
The same, but different
Many of the tools used to create visionOS applications will be familiar to current Apple developers. The SDK makes use of Xcode, SwiftUI, RealityKit, ARKit and TestFlight.
That means not every app must be rewritten for the headset. Apple’s developer notes for the AR headset say, “visionOS supports most of the same technologies as iOS, so many apps built to run on iPad or iPhone can run unmodified on visionOS devices.”
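As a rough illustration of why that works, consider a bare-bones SwiftUI app like the sketch below (the type names here are hypothetical, not from Apple’s sample code). It uses nothing iOS-specific — just `WindowGroup`, `VStack`, `Text` and `Button` — so the same source could, in principle, build for visionOS unchanged, where it would appear as a floating window:

```swift
import SwiftUI

// A minimal SwiftUI app with no platform-specific code.
// Because visionOS supports the same frameworks as iOS,
// a view hierarchy like this should run unmodified on Vision Pro.
@main
struct HelloApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var taps = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, spatial computing")
                .font(.title)
            Button("Tapped \(taps) times") {
                taps += 1
            }
        }
        .padding()
    }
}
```

Apps that lean on iOS-only hardware features, by contrast, would need per-platform adjustments — which is where the new tools come in.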
But an AR headset presents its own special needs and possibilities, of course. The new Reality Composer Pro mentioned by Prescott lets devs preview and prepare 3D models, animations, images and sounds.
Opening Vision Pro developer labs
With the release of Vision Pro still so far away, developers need access to the product to test their applications. The visionOS SDK includes a visionOS simulator, but that’s likely to be of limited use. Apple’s solution is a series of developer labs with the AR headset.
The company committed to putting these in Cupertino, London, Munich, Shanghai, Singapore and Tokyo. Their doors will open in July.