In a groundbreaking development, a brain-computer interface (BCI) company successfully demonstrated the first-ever use of Apple's Vision Pro AR/VR headset (and later an iPad) controlled directly by thought, according to Synchron. The innovation opens up exciting possibilities, perhaps even beyond helping individuals with severe mobility limitations engage with cutting-edge technology: thought control of Vision Pro and iPad could pave the way for hands-free, voice-free use of devices more broadly.
August 4 update: In a new video, Synchron offered proof of the first-ever public demonstration of an individual using an iPad controlled entirely by thought, leveraging Apple's built-in accessibility features and its new Brain-Computer Interface Human Interface Device (BCI HID) protocol, the company said. Watch the video below.
May 13 update: Synchron said it would be the first brain-computer interface (BCI) company to achieve native integration with a new BCI Human Interface Device (BCI HID) profile Apple just rolled out among various accessibility upgrades.
New video shows patient controlling iPad entirely by thought
In the new video from Synchron, a patient with ALS named Mark demonstrates using an iPad by thought alone as company executives explain how it works. As a participant in the company's COMMAND clinical study, Mark uses an implantable BCI to navigate the iPad home screen, open apps and compose text, all without using his hands, voice or eyes, Synchron said. He's the world's first person to do so.
This development comes after Apple rolled out a new BCI Human Interface Device (BCI HID) input protocol in May. It lets Apple’s operating systems use brain signals as a native input method for the first time.
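Apple has not published developer-facing details of the BCI HID protocol, so the short Swift sketch below is purely illustrative. It models decoded brain signals as a stream of movement and selection events that get translated into cursor updates and clicks, roughly the way any native pointer-style input source drives a system. Every type and function name here is a hypothetical stand-in, not part of any Apple or Synchron API.

```swift
import Foundation

// Hypothetical, simplified model of thought-driven input.
// These types are illustrative only; they are not from Apple's BCI HID
// profile or Synchron's software.
enum BCIEvent {
    case move(dx: Double, dy: Double)   // decoded intent to nudge the cursor
    case select                         // decoded intent to "click"
}

struct CursorState {
    var x: Double = 0
    var y: Double = 0
}

// Translate a stream of decoded events into cursor movement and selections,
// analogous to how a native input method would drive the system pointer.
func apply(_ events: [BCIEvent], to cursor: inout CursorState) {
    for event in events {
        switch event {
        case .move(let dx, let dy):
            cursor.x += dx
            cursor.y += dy
        case .select:
            print("select at (\(cursor.x), \(cursor.y))")
        }
    }
}

var cursor = CursorState()
apply([.move(dx: 12, dy: 4), .move(dx: 3, dy: -1), .select], to: &cursor)
```

The point of the sketch is simply that once brain signals are exposed as a first-class input stream, the rest of the system can treat them like any other pointing device.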
“This is the first time the world has seen native, thought-driven control of an Apple device in action,” said Dr. Tom Oxley, CEO and Founder, Synchron. “Mark’s experience is a technical breakthrough, and a glimpse into the future of human-computer interaction, where cognitive input becomes a mainstream mode of control.”
Demonstration shows controlling Vision Pro via thought-driven commands is possible
The previous demonstration involved Mark, a 64-year-old man living with amyotrophic lateral sclerosis (ALS), which has taken away the use of his arms and hands. He could control the Vision Pro's cursor using his thoughts, thanks to Synchron's tiny implanted BCI, which does not require open brain surgery. That allowed Mark to play Solitaire, watch Apple TV and send text messages without the hand gestures that, along with eye movement, are typically required to operate the device. Watch a video of Mark using Vision Pro.
“This is pretty cool, I’ve been wanting to try this for a while now,” Mark said in reaction to using his BCI to watch a video on the Vision Pro. “It’s like watching it in the theater, it really comes to life. Using this type of enhanced reality is so impactful and I can imagine it would be for others in my position or others who have lost the ability to engage in their day-to-day life. It can transport you to places you never thought you’d see or experience again.”
As for how the implant is applied, the BCI is inserted into a blood vessel on the surface of the brain's motor cortex via the jugular vein, through a minimally invasive endovascular procedure, Synchron said. The implant communicates with a small device affixed to the patient's chest, which detects and wirelessly transmits "motor intent" out of the brain, enabling severely paralyzed people to control personal devices with hands-free point-and-click.
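To make that signal chain concrete, here is a rough Swift sketch of the pipeline Synchron describes: implant detects motor intent, a chest-worn unit relays it wirelessly, and the paired device turns a sufficiently confident intent into a click. All names and the confidence threshold are assumptions for illustration, not details of Synchron's actual system.

```swift
import Foundation

// Hypothetical model of the relay chain described above.
struct MotorIntent {
    let confidence: Double   // how strongly the decoder believes a "click" was intended
}

protocol IntentRelay {
    func transmit(_ intent: MotorIntent)
}

// Stand-in for the chest-worn unit that forwards signals out of the body.
struct ChestUnit: IntentRelay {
    let deliver: (MotorIntent) -> Void
    func transmit(_ intent: MotorIntent) { deliver(intent) }
}

// Stand-in for the paired device: intents above an assumed threshold
// become a hands-free "click".
func makeDevice(threshold: Double = 0.8) -> (MotorIntent) -> Void {
    return { intent in
        if intent.confidence >= threshold {
            print("click")
        }
    }
}

let device = makeDevice()
let chestUnit = ChestUnit(deliver: device)
chestUnit.transmit(MotorIntent(confidence: 0.92))   // produces "click"
chestUnit.transmit(MotorIntent(confidence: 0.40))   // ignored
```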
A step forward in human-computer interaction
This integration of BCI technology with the Vision Pro headset represents a significant step forward in accessibility and human-computer interaction. It suggests a future where individuals with paralysis, limited mobility or limited speech can fully immerse themselves in augmented and virtual reality experiences.
“BCI is a platform to re-connect people with injury or disease back to the fast-moving consumer technology landscape. Vision Pro is a powerful system, but it relies on the use of hand gestures to exert control over the UI,” said Tom Oxley, CEO & Founder of Synchron. “We are sending control signals directly from the brain to replace the need for hand gestures. We are moving towards a new Bluetooth standard for Human Computer Interactions that do not require touch or speech. This is a critical unmet need for millions of people with paralysis.”
And Apple is “very supportive” of the Vision Pro integration, Oxley told CNBC.
What else could this mean?
Thought control of Vision Pro for people with disabilities like Mark is a fascinating development. And looking ahead, it's not difficult (and frankly fun) to imagine several wider applications of this kind of innovation.
- Expanded accessibility. This successful integration could lead to further adaptations of AR/VR technology for individuals with physical limitations.
- Medical and therapeutic use. Vision Pro’s immersive capabilities, combined with thought control, could open new avenues for rehabilitation and therapy.
- Enhanced communication. For those unable to speak or use traditional input methods, this technology could provide new ways to interact and communicate in real and virtual environments.
- Educational opportunities. Students with mobility impairments could participate more fully in virtual classrooms and interactive learning experiences.
- Professional applications. The technology could enable people with physical limitations to work remotely and collaborate effectively in virtual environments.
- Entertainment and social interaction. Users could more easily participate in virtual social gatherings and entertainment experiences.
- Further technological integration. Other tech companies could develop similar integrations, potentially leading to a new standard in human-computer interaction that doesn’t rely on touch or speech.
As this technology continues to evolve, it could improve quality of life for many individuals. Widespread adoption will probably depend on factors like cost, ease of use and further clinical trials establishing safety and efficacy. Still, it's a compelling glimpse of a future where the boundaries between thought and digital interaction continue to blur, potentially revolutionizing how we interact with the world.
Source: Synchron
This post first published on July 30, 2024. We republished it on May 13, 2025, and August 4, 2025, with news updates.