

Neural Ordinary Differential Equations with David Duvenaud - #364
Apr 9, 2020
Join David Duvenaud, an Assistant Professor at the University of Toronto, as he shares his insights on Neural Ordinary Differential Equations (ODEs). He discusses how ODEs could revolutionize neural networks by offering continuous-depth modeling and handling complex dynamics. David dives into their application to irregularly sampled medical time-series data, emphasizing how continuous-time models make prediction more efficient. He also touches on the balance between specialization and exploring diverse research interests, making this conversation a fascinating blend of theory and real-world application.
AI Snips
Neural ODEs for Bigger Models
- Neural Ordinary Differential Equations (ODEs) can train larger models than traditional neural networks, because backpropagating through the solve with the adjoint method keeps memory cost constant in depth.
- ODE networks have the potential to replace the backbone of current architectures, such as residual networks; a minimal training sketch follows below.
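A minimal sketch of that setup, assuming PyTorch plus the torchdiffeq library (the reference implementation released with the Neural ODE paper); the two-layer dynamics function here is a hypothetical stand-in for a real model. odeint_adjoint backpropagates through the solve via the adjoint method, so memory cost does not grow with the effective depth of the computation, which is what leaves room for larger models.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint  # adjoint-method backprop

class ODEFunc(nn.Module):
    """Learned dynamics f(t, h): the continuous analogue of a residual block."""
    def __init__(self, dim=2, hidden=50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, h):
        return self.net(h)

func = ODEFunc()
h0 = torch.randn(16, 2)       # a batch of input states
t = torch.tensor([0.0, 1.0])  # integrate from t=0 to t=1

hT = odeint(func, h0, t)[-1]  # hidden state at t=1
hT.pow(2).sum().backward()    # gradients via the adjoint ODE; memory use
                              # stays constant in integration "depth"
```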
ODE Networks vs. Residual Networks
- Residual networks and ODE networks share the same structure: the state is built up by adding contributions from smaller functions, h_{t+1} = h_t + f(h_t) in the discrete case and dh/dt = f(h, t) in the continuous one.
- ODE networks let an adaptive solver determine the number of function evaluations, offering flexibility that fixed-layer networks lack (see the sketch after this list).
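A toy illustration of the parallel, using NumPy and SciPy rather than anything specific from the episode; tanh is an illustrative stand-in for a learned block f:

```python
import numpy as np
from scipy.integrate import solve_ivp

def f(h):
    # Stand-in for a learned residual block; a small neural net in practice.
    return np.tanh(h)

h0 = np.array([0.5, -0.3])

# Residual network: a fixed stack of discrete updates, h_{t+1} = h_t + f(h_t).
h = h0.copy()
for _ in range(4):  # depth fixed at 4 "layers"
    h = h + f(h)
print("residual net output:", h)

# ODE network: read the same update as an Euler step of dh/dt = f(h), then
# hand the dynamics to an adaptive solver, which decides for itself how many
# times to evaluate f.
sol = solve_ivp(lambda t, h: f(h), t_span=(0.0, 4.0), y0=h0)
print("ODE net output:", sol.y[:, -1])
print("evaluations chosen by the solver:", sol.nfev)
```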
Adaptive Computation in ODE Networks
- ODE networks can adaptively adjust the number of function evaluations based on problem complexity.
- Because the solver's error tolerance is a tunable knob, this allows a dynamic trade-off between accuracy and computational cost during training, illustrated below.
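One concrete way to see that trade-off, again with SciPy's adaptive Runge-Kutta solver standing in for the solver inside a Neural ODE: sweeping the error tolerance changes how many function evaluations the solve costs.

```python
import numpy as np
from scipy.integrate import solve_ivp

def f(t, h):
    # Same stand-in dynamics as above; a learned network in practice.
    return np.tanh(h)

h0 = np.array([0.5, -0.3])

# Loose tolerances -> few evaluations (cheap, rough); tight tolerances ->
# many evaluations (expensive, accurate). A Neural ODE exposes this knob
# both during training and at test time.
for rtol in (1e-2, 1e-5, 1e-8):
    sol = solve_ivp(f, (0.0, 4.0), h0, rtol=rtol, atol=rtol * 1e-2)
    print(f"rtol={rtol:.0e}: nfev={sol.nfev}, h(T)={sol.y[:, -1]}")
```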