
Marketplace Tech: The year in AI wearables
Dec 25, 2025

In a fascinating discussion, Will Gottsegen, a staff writer at The Atlantic, dives into the world of AI wearables and shares his firsthand experience testing Meta’s smart glasses. He highlights the promise of wearables aimed at reducing screen time and enabling more natural interactions. However, he also points out technical limitations, like a reliance on cloud processing that can hinder performance. The conversation touches on the future of AI interfaces, the technology's accessibility potential, and hurdles like privacy concerns standing in the way of mainstream adoption.
Episode notes
A New Third Way Of Computing
- AI wearables aim to sit alongside phones and laptops as a third, more ambient computing interface.
- They promise more fluid context-aware interactions that reduce screen time and surface relevant info in real life.
Hands-On With Meta's AI Glasses
- Will Gottsegen visited Meta's skate-shop–style pop-up to try the newest AI glasses with a tiny display.
- He found gesture control cool but hampered by slow cloud-dependent processing and spotty Wi‑Fi in the demo.
Cloud Limits Real-Time Use
- Many AI wearables rely on cloud computation because on-device models are still too large for tiny hardware.
- That dependence makes performance sensitive to connectivity and can slow real-world use cases like live translation.
