
Coercing LLMs to Do and Reveal (Almost) Anything with Jonas Geiping - #678
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Advancing Hyperparameter Optimization
This chapter examines the evolution of hyperparameter optimization techniques, comparing traditional methods like random search to advanced strategies such as Bayesian optimization. It highlights the innovative G2G algorithm, which merges random search with gradient-based approaches, and discusses the implications of these advancements for optimizing large models.
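The G2G algorithm itself is not detailed here, but the random-search baseline it builds on is easy to sketch. Below is a minimal, hedged illustration of random search over a hyperparameter space; the objective `toy_loss`, the parameter names, and the search bounds are all hypothetical stand-ins, not anything described in the episode.

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Sample hyperparameters uniformly at random and keep the best trial.

    space maps each parameter name to a (low, high) range.
    Returns the best parameter dict and its objective value.
    """
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = objective(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical stand-in for a validation loss: a smooth bowl whose
# minimum sits at log_lr = -3.0, dropout = 0.2.
def toy_loss(p):
    return (p["log_lr"] + 3.0) ** 2 + (p["dropout"] - 0.2) ** 2

space = {"log_lr": (-6.0, 0.0), "dropout": (0.0, 0.5)}
best, score = random_search(toy_loss, space, n_trials=200)
```

Gradient-based and Bayesian strategies improve on this baseline by using information from past trials to pick the next configuration, rather than sampling blindly.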