#1550: “EchoVision” Answers the Question ‘What is it like to see like a bat?’ with Mixed Reality, AI, & Haptics
Mar 29, 2025
Jiabao Li, a multi-disciplinary artist and assistant professor at UT Austin, discusses her innovative work 'ECHOVISION.' She reveals how mixed reality and LiDAR technology simulate bat echolocation in an immersive experience. Listeners learn about the artistic exploration of bat communication through AI and haptics, including an interactive haptic couch. Li also shares insights from a field trip to witness a bat colony, emphasizing the significance of understanding human and non-human intelligence and the ecological impact of her projects.
EchoVision uniquely combines mixed reality and haptic elements to replicate bat echolocation, enhancing our understanding of non-human sensory experiences.
The project not only engages participants with immersive technology but also educates them on bat ecology and conservation efforts through real-world exploration.
Deep dives
EchoVision: An Innovative AR Experience
EchoVision is a phone-based mobile AR experience designed to replicate echolocation, akin to how bats navigate their environment. Participants wear a 3D-printed bat mask that holds a phone whose LiDAR scanner maps the immediate surroundings; when users vocalize, shaders render rippling visual representations of their "echolocation" across the scanned space. The experience features a 'balloon cave' where users can explore various pillars while employing sound to interact with their surroundings, simulating how bats perceive their environment. This approach invites users to explore sound dynamics through an embodied, immersive interaction.
Collaborative Video Installation and AI Insights
In addition to its AR component, EchoVision incorporates an interactive video installation built around bat calls that AI algorithms, developed by researchers, have categorized by contextual meaning, such as mating or food-seeking communication. Audio-reactive visualizations accompany the calls to deepen engagement. By sitting on a specially designed haptic couch, participants feel vibrations synchronized to the bat calls, creating a multi-sensory environment that brings the essence of bat communication to life. This collaboration highlights the intersection of art, technology, and biology as part of a comprehensive viewer experience.
Experiencing the Bat Population in Austin
A unique field trip component involves witnessing the urban bat population emerging from the Congress Avenue Bridge in Austin, Texas, home to one of the largest colonies in the U.S. Participants observe millions of bats, witnessing their nocturnal migration while also engaging in educational discussions about bat behavior and their ecological significance. The collaboration with the Austin Bat Refuge allows attendees to learn about bat conservation efforts and the biology of bats through hands-on experiences, fostering a deeper appreciation for these creatures often misunderstood by the public. This immersive adventure serves as a compelling way to bridge scientific knowledge with experiential learning.
The Multimodal Approach to Immersive Storytelling
The project exemplifies a multimodal storytelling approach, engaging audiences through sensory experiences that go beyond traditional storytelling boundaries. By combining visual, auditory, and haptic elements, EchoVision creates a comprehensive sensory experience that fosters empathy with non-human intelligence, encouraging participants to reflect on their perceptions of the natural world. The integration of different technologies, such as AR, video installations, and sound design, showcases the potential of immersive art to raise awareness about ecological issues and animal behaviors. Ultimately, the project aims to cultivate curiosity and inspire action toward a deeper understanding of our relationship with other species.
ECHOVISION is the latest experience from multi-disciplinary artist Jiabao Li, and it has three major parts. The first part is a mobile phone-based mixed reality experience that performs a metaphoric translation of echolocation: it uses LiDAR to detect your immediate physical surroundings and then reveals them with a voice-activated rippling shader. The second part is a video that poetically visualizes different bat calls that AI has identified into different contextual domains, accompanied by a really awesome haptic couch experience. The last part is a field trip to the Congress Avenue Bridge to watch hundreds of thousands of bats emerge at night to feed. I caught up with Li to hear more about all of her interdisciplinary and interspecies collaborations on this piece.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality