Along with the launch of its new AI glasses models, Meta is also looking to open its glasses up to outside developers via its Meta Wearables Device Access Toolkit, which will enable third-party apps to link into Meta’s wearable devices.
That will enable more integrations like the current Garmin and Strava connection options for Meta’s next wave of AI glasses, including the coming Meta Oakley “Vanguard” glasses, which are aimed at athletes.
As explained by Meta:
“Our first version of the toolkit will open up access to a suite of on-device sensors - empowering you to start building features within your mobile apps that leverage the hands-free benefits of AI glasses.”
With this, developers will be able to tap into the sensors built into Meta’s AI glasses to power new experiences, with the initial toolkit enabling access to the camera and audio functionality.
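Meta hasn’t detailed the toolkit’s actual API surface here, but the basic shape is a companion mobile app consuming a sensor stream from the glasses. The sketch below is purely illustrative: the names (GlassesCamera, Frame, runHandsFreeCapture) are hypothetical stand-ins, not the real toolkit interfaces, and the stub exists only so the example runs on its own.

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.runBlocking

// Hypothetical frame payload delivered from the glasses' camera to the phone app.
data class Frame(val timestampMs: Long, val jpegBytes: ByteArray)

// Hypothetical interface standing in for a camera session on a paired set of glasses.
// The real toolkit's types and methods may look nothing like this.
interface GlassesCamera {
    fun frames(): Flow<Frame>   // stream of captured frames
    suspend fun close()         // release the sensor when done
}

// Example consumer: a mobile app feature that hands each frame to its own
// processing pipeline (e.g. an on-device model or an upload queue).
suspend fun runHandsFreeCapture(camera: GlassesCamera, onFrame: (Frame) -> Unit) {
    camera.frames().collect { frame -> onFrame(frame) }
    camera.close()
}

// Tiny stub so the sketch runs standalone, emitting a single fake frame.
fun main() = runBlocking {
    val stub = object : GlassesCamera {
        override fun frames(): Flow<Frame> = flowOf(Frame(0L, ByteArray(0)))
        override suspend fun close() {}
    }
    runHandsFreeCapture(stub) { frame -> println("got frame at ${frame.timestampMs} ms") }
}
```

The point of the pattern is that the heavy lifting (processing, streaming, storage) stays in the developer’s existing mobile app, with the glasses acting as a hands-free sensor source.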
“Early results are promising. Disney’s Imagineering R&D team is working on early prototypes to see how AI glasses could help give guests access to tips while in their parks. Major streaming services like Twitch will enable creators to livestream straight from their glasses - creators could even reach multiple platforms simultaneously via streaming software partners like Logitech’s Streamlabs. And HumanWare is building an integration that gives blind and low-vision people live guidance as they navigate the world.”

And eventually, the toolkit will be expanded to cover full AI glasses capability, with outside developers able to build app add-ons and experiences around the new in-lens display, as well as Meta’s wristband controller tech.

Meta will be launching an initial version of its Wearables Device Access Toolkit as a developer preview, with limited access:
“The developer preview is designed for exploration and early development so we can build the future of this toolkit based on your feedback. During the preview, you’ll be able to access the toolkit, build prototypes, test sensor-based experiences, and distribute to testers using our beta testing platform in the Wearables Developer Center. Publishing will be available to limited audiences in the preview phase so that we can responsibly test, learn, and refine our toolkit.”
This could be a big step in boosting broader adoption by providing a range of AI glasses integrations for popular apps. And as those experiences and options grow, more people will be interested in trying them out, which could help Meta increase sales of its coming devices.
And that’s before we get into full AR capability, with Meta’s next-level glasses, which are scheduled for launch next year.
The more Meta can invite third parties to develop new functionality, the better its chances of tapping into new use cases, and the more likely it is to spark broader consumer interest.
It’s an important step, and one which could end up being a major advantage in its wearables push.
You can learn more about Meta’s Wearables Device Access Toolkit here.