

Channel Gating for Cheaper and More Accurate Neural Nets with Babak Ehteshami Bejnordi - #385
Jun 22, 2020
Babak Ehteshami Bejnordi, a Research Scientist at Qualcomm AI Research, dives deep into conditional computation for optimizing neural networks. He discusses how channel gating improves efficiency and accuracy while reducing computational cost. The conversation explores methods for multitask learning and challenges like catastrophic forgetting in continual learning. Babak also shares insights into practical applications of his research, showing how these advances transition from the lab to real-world use.
AI Snips
Conditional Computation
- Conditional computation selectively activates parts of a neural network depending on the input.
- This cuts compute cost, since units that are unnecessary for a given input are skipped entirely (see the sketch below).
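
To make the idea concrete, here is a minimal PyTorch-style sketch of a block whose body runs only when an input-dependent gate fires. This is an illustration under our own assumptions, not the method discussed in the episode; `GatedBlock` and its gating head are hypothetical names.

```python
import torch
import torch.nn as nn

class GatedBlock(nn.Module):
    """Hypothetical residual block that executes its body only when an
    input-dependent gate fires, skipping the computation otherwise."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
        )
        # Tiny gating head: global-pool the input, then predict a
        # single on/off logit for the whole block.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Hard inference-time decision (one decision per batch, for
        # brevity). Training would need a differentiable relaxation,
        # e.g. a Gumbel-softmax, which is omitted here.
        if torch.sigmoid(self.gate(x)).mean() < 0.5:
            return x  # gate off: identity path, body never executed
        return x + self.body(x)
```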
Channel Gating
- Gating individual channels offers finer-grained control than gating whole layers.
- This brings greater flexibility, representational power, and interpretability (see the sketch below).
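
A hedged sketch of what per-channel gating might look like, again in PyTorch. The squeeze-then-predict gating head is one common design, not necessarily the one Babak describes; `ChannelGate` is a hypothetical name.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Hypothetical per-channel gate: squeeze the feature map, predict
    one logit per channel, and zero out channels whose gate is off."""

    def __init__(self, channels: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        logits = self.fc(self.pool(x).flatten(1))  # (batch, channels)
        if self.training:
            # Soft, differentiable gates while training.
            gates = torch.sigmoid(logits)
        else:
            # Hard binary gates at inference; a fused kernel could
            # skip the convolutions for zeroed channels entirely.
            gates = (logits > 0).float()
        return x * gates.unsqueeze(-1).unsqueeze(-1)
```

Unlike squeeze-and-excitation-style soft scaling, the point of hard binary gates is that pruned channels can be skipped outright at inference, which is where the compute savings come from.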
Training Challenges
- Initial attempts to train gated networks collapsed to a trivial solution.
- The gates stayed permanently on, so the network behaved like an ordinary, ungated one (a common remedy is sketched below).
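
One standard remedy, sketched below under the assumption that gate probabilities are available during training, is to regularize the average gate activity toward a target execution rate, so the always-on solution is no longer free. The function name, target rate, and loss weight are illustrative.

```python
import torch

def gate_sparsity_loss(gate_probs: torch.Tensor,
                       target_rate: float = 0.5) -> torch.Tensor:
    """Penalize deviation of the mean gate activation from a target
    execution rate, so the trivial always-on solution incurs a cost."""
    return (gate_probs.mean() - target_rate) ** 2

# Hypothetical usage during training:
# total_loss = task_loss + 0.1 * gate_sparsity_loss(gate_probs)
```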