

Hyperparameter Optimization through Neural Network Partitioning with Christos Louizos - #627
May 1, 2023
Christos Louizos, an ML researcher at Qualcomm Technologies, dives into cutting-edge topics like hyperparameter optimization and federated learning. He discusses innovative techniques for speeding up transformers and optimizing computational graphs. You'll learn about effective methods for adapting models during distribution shifts and the significance of data partitioning. Louizos also highlights the challenges in federated learning and its implications for data privacy and efficiency, setting the stage for future advancements in the field.
Hyperparameter Optimization in Federated Learning
- Hyperparameter optimization in federated learning presents unique challenges.
- Multiple training runs are undesirable due to communication costs, device constraints, and privacy concerns.
Neural Network Partitioning
- The proposed approach partitions both the neural network and the dataset into K parts.
- Each subnetwork trains on its own subset of the data, yielding an out-of-sample objective that approximates the marginal likelihood and enables efficient hyperparameter learning.
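The idea above can be illustrated with a toy sketch. Here "subnetworks" are stand-ins (simple ridge models rather than actual network partitions), and the averaged held-out loss plays the role of the marginal-likelihood proxy used to select a hyperparameter; this is an assumption-laden simplification of the approach discussed in the episode, not the paper's implementation.

```python
import numpy as np

def partition_indices(n, k, rng):
    """Randomly split n example indices into k disjoint shards."""
    return np.array_split(rng.permutation(n), k)

def ridge_fit(X, y, lam):
    """Closed-form ridge regression; lam is the hyperparameter to tune."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def partitioned_heldout_loss(X, y, k, lam, seed=0):
    """Train k 'subnetworks' (toy ridge models), each on all shards but
    one, and average their losses on the held-out shards. This average
    acts as a marginal-likelihood proxy for choosing lam."""
    rng = np.random.default_rng(seed)
    shards = partition_indices(len(y), k, rng)
    losses = []
    for j, held in enumerate(shards):
        train = np.concatenate([s for i, s in enumerate(shards) if i != j])
        w = ridge_fit(X[train], y[train], lam)
        losses.append(np.mean((X[held] @ w - y[held]) ** 2))
    return float(np.mean(losses))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

# Pick the regularization strength with the lowest held-out proxy loss:
# heavy regularization (lam=10) should be rejected on this easy problem.
best = min([1e-3, 1e-1, 10.0],
           key=lambda lam: partitioned_heldout_loss(X, y, k=4, lam=lam))
```

Averaging over all K held-out shards means every example contributes to the hyperparameter objective in a single training run, which is the appeal in federated settings where repeated runs are costly.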
Dataset and Augmentation Experiments
- The approach used datasets like CIFAR-10 and MNIST, including rotated versions.
- Optimizing affine augmentation parameters allowed the model to recover the rotation that had been applied to the data, demonstrating the method's effectiveness.
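A minimal sketch of "recovering the applied rotation": data is rotated by an unknown angle, and we search for the affine (rotation) augmentation parameter that best undoes it. The paper instead learns such parameters by gradient descent on its held-out objective; the grid search and 2-D point-cloud data here are simplifying assumptions for illustration.

```python
import numpy as np

def rotate(points, theta):
    """Apply a 2-D rotation, i.e. one affine augmentation parameter."""
    c, s = np.cos(theta), np.sin(theta)
    return points @ np.array([[c, -s], [s, c]]).T

rng = np.random.default_rng(0)
clean = rng.normal(size=(50, 2))
applied = np.pi / 6                     # the rotation hidden in the data
rotated = rotate(clean, applied)

# Search over augmentation angles for the one that best undoes the
# rotation; the best inverse angle reveals the applied rotation.
grid = np.linspace(-np.pi, np.pi, 721)  # half-degree steps
errors = [np.mean((rotate(rotated, t) - clean) ** 2) for t in grid]
recovered = -grid[int(np.argmin(errors))]
```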