For more than a decade, live creators have pushed technology forward in the service of a simple idea: the closer people are to the moment, the better the experience. Every generation of streaming tools, from desktop broadcasting software to alerts, widgets, mobile apps, and AI-powered production, has been a step forward in audience engagement. The next evolution is making streaming even more immersive by letting creators share their world naturally, through their own perspective.
Streamlabs has received early access to the Meta Wearables Device Access Toolkit for AI glasses, enabling us to prototype and validate a hands-free, first-person livestreaming workflow. The SDK is not yet publicly available; our team is building with early access, and broader availability will come at a later date.
Why this matters
Most live streams are built around a desk setup. IRL streaming goes beyond that, but creators still have to juggle gear, cables, mounts, or a phone in one hand.
- Immersive live content: Viewers see what you see, not a camera angle you had to choose, which draws them directly into the moment.
- No extra gear: With hands-free live streaming, creators can cook, coach, hike, build, compete, and perform without a rig getting in the way.
- Brand consistency across devices: The overlays, alerts, chat tools, and monetization that creators rely on do not disappear just because they step away from the desk.
What changes for creators
A glasses-to-app pipeline unlocks three concrete benefits:
- A simpler IRL stack: Pair once, hit go live, and keep your existing scenes, alerts, and monetization intact.
- A new creative canvas: POV streaming opens the door to formats that feel more personal and immediate, and it works well for cooking, coaching, field reporting, travel, and backstage tours.
- Smarter ergonomics: With less gear to manage and fewer steps to go live, creators can focus on their content instead of their setup.
What changes for the industry
As live streaming expands into wearables, software and hardware ecosystems will need to support creators who move fluidly between devices. Audiences expect a seamless experience whether a stream starts on a PC, shifts to mobile, or transitions to glasses. That means building smooth handoffs and shared signals, such as voice triggers, gestures, and automations, so workflows stay consistent across hardware. Streamlabs is actively developing in this space and will share more at TwitchCon San Diego 2025.
At the same time, standards and safeguards will become table stakes. Features such as consent cues, sensitive-location detection, and transparent labeling of augmented elements will be critical as POV live becomes more common in public spaces.
The near future of “live”
Looking ahead, we see two major trends starting to come together:
- Wearables as cameras. Glasses won’t replace professional rigs, but they will unlock a new category of spontaneous, hands-free video capture that’s woven into everyday life. For creators, that means being able to go live the moment inspiration strikes, without setting up a full rig.
- AI-assisted production. With the launch of AI automations in Streamlabs Desktop, creators can already use features like scene switching, replay capture, and highlight clipping. We envision these same capabilities extending to mobile and wearable devices. When that happens, real-time scene detection and dynamic overlays will handle much of the production work automatically, letting creators focus on narration and audience engagement. The result will be productions that feel lighter, smarter, and more natural than ever.
Logitech and Streamlabs have been long-standing partners of Meta, collaborating on multiple projects over the last decade. Working with Meta on the Wearables Device Access Toolkit is another example of how we are continuing to push the boundaries of human potential.
None of this replaces what creators already do best. Desktop streams will continue to anchor most channels, and mobile will remain a powerful entry point. Hands-free streaming adds another layer to a creator’s toolkit, one that makes live experiences feel immersive, natural, and unfiltered.
The best ideas in streaming have always come from creators. Our role is to make their path easier, safer, and more accessible. Hands-free live is another way to achieve this, and when we get it right, the experience will be so engaging that viewers won’t just say “nice stream.” They’ll say, “I felt like I was there.”
— Ashray
Head of Logitech G’s Streamlabs