

Automated Model Tuning with SigOpt - #324
Dec 9, 2019
Scott Clark, Co-founder and CEO of SigOpt, dives into automated model tuning and its transformative potential for AI applications. He showcases SigOpt's platform with a live demo, highlighting how tailored solutions drive efficiency in enterprise modeling. The discussion covers the importance of customization, the need for effective experimentation platforms, and strategies to navigate machine learning optimization. Clark also explains Bayesian optimization techniques for parameter tuning, emphasizing a holistic approach to balancing multiple performance metrics.
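The parameter-tuning workflow Clark describes follows a suggest/observe loop: the tuning service proposes a configuration, the modeler trains and reports a metric, and the service uses the accumulated observations to propose the next configuration. The sketch below is a hypothetical, self-contained illustration of that loop, not SigOpt's actual API; where a real service would back `suggest()` with a Bayesian model, this toy stand-in samples uniformly at random.

```python
import random

# Hypothetical stand-in for a tuning service: a real Bayesian optimizer
# would use past observations to pick promising points; this toy version
# samples uniformly so the sketch stays self-contained.
class RandomSuggester:
    def __init__(self, bounds):
        self.bounds = bounds   # {param_name: (low, high)}
        self.history = []      # list of (params, metric) pairs

    def suggest(self):
        """Propose the next configuration to evaluate."""
        return {name: random.uniform(lo, hi)
                for name, (lo, hi) in self.bounds.items()}

    def observe(self, params, metric):
        """Record the metric achieved by a configuration."""
        self.history.append((params, metric))

    def best(self):
        """Return the best (params, metric) seen so far."""
        return max(self.history, key=lambda h: h[1])

# Toy objective standing in for a model's validation metric,
# maximized at lr=0.1, reg=0.01.
def validation_score(params):
    return -(params["lr"] - 0.1) ** 2 - (params["reg"] - 0.01) ** 2

tuner = RandomSuggester({"lr": (0.001, 1.0), "reg": (0.0, 0.1)})
for _ in range(50):
    params = tuner.suggest()
    tuner.observe(params, validation_score(params))

best_params, best_score = tuner.best()
```

The point of the loop structure is that the optimizer only ever sees (configuration, metric) pairs, so the same interface works whether the suggestions come from random search, grid search, or the Bayesian techniques discussed in the episode.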
AI Snips
Buy vs. Build in AI Platforms
- Customize data pipelines and model deployment based on specific application needs.
- Leverage standardized open-source tools for model development and experimentation.
Automated Model Tuning Considerations
- AutoML and automated hyperparameter tuning address basic model development needs.
- Differentiated models require customized tools, domain expertise, and advanced optimization.
Importance of Experimentation History
- Track and document all experimentation steps, including failed attempts.
- This history aids reproducibility, knowledge transfer, and regulatory compliance.
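The tracking practice described above can be sketched as an append-only run log. The format below (one JSON object per line, with failed runs recorded alongside successes) is an assumed illustration, not a SigOpt feature; the file path and field names are hypothetical.

```python
import json
import os
import tempfile
from datetime import datetime, timezone

def log_run(path, params, status, metric=None, note=""):
    """Append one run record, including failures, to a JSON-lines log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "params": params,
        "status": status,    # "succeeded" or "failed"
        "metric": metric,    # None for failed runs
        "note": note,        # e.g. why a run failed
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def load_history(path):
    """Read the full experimentation history back for review or audit."""
    with open(path) as f:
        return [json.loads(line) for line in f]

log_path = os.path.join(tempfile.gettempdir(), "experiment_log.jsonl")
open(log_path, "w").close()  # start with an empty log for the demo
log_run(log_path, {"lr": 0.1}, "succeeded", metric=0.91)
log_run(log_path, {"lr": 5.0}, "failed", note="loss diverged")
history = load_history(log_path)
```

Because the log is append-only and keeps failed attempts, it preserves exactly the history the snip calls for: later readers can see what was tried, in what order, and why it was abandoned.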