How to See ‘Where’ Through Low-Power Event Cameras
Oct 27, 2023
Professor Guillermo Gallego, an expert in algorithms for event-based vision at the Technical University of Berlin, discusses the cutting-edge world of bio-inspired, event-driven cameras. He explains how these cameras mimic human vision and operate across a wide range of lighting conditions, a capability vital for autonomous robotics. The conversation covers motion estimation, spatial AI, and unifying frameworks for visual processing. Gallego also shares insights on the historical evolution of cellular neural networks and their role in modern imaging systems.
Chapters
Intro
00:00 • 2min
Innovations in Event-Driven Vision Technology
01:50 • 6min
Visual Pathways and Motion Estimation in Robotics
07:31 • 2min
Understanding Motion Compensation in Event-Driven Cameras
10:00 • 2min
Understanding Contrast and Motion Estimation in Vision
12:28 • 2min
Enhancing Visual Processing Through a Unifying Framework
14:04 • 2min
Diving into Event-Driven Cameras
16:07 • 12min
Evolution and Insights of Cellular Neural Networks
28:21 • 16min