
Revenue Search: Inside Bittensor Subnet Session with Wandering Weights from Gradients: SN56
Jul 15, 2025

Wandering Weights, also known as Chris, is a machine learning practitioner and founder of the Gradients subnet (SN56), dedicated to making fine-tuning accessible and cost-effective. He shares how Gradients disrupts the post-training market with $250 fine-tuned models that rival far more expensive AWS services. Chris discusses how Gradients can replace the work of multiple ML engineers while improving productivity, outlines plans for confidential execution aimed at enterprise privacy, and describes interesting use cases, including bespoke models for personal and business applications.
AI Snips
From Academia To Practical ML
- Chris left academia because he felt his work wasn't solving real-world problems, and moved into industry and consultancy.
- That practical itch led him to join early Bittensor work and to design Gradients to automate tedious ML training tasks.
AutoML Automation Beats Cloud Costs
- Gradients automates AutoML-style hyperparameter and training choices, delivering better performance than the major cloud providers.
- Chris's experiments show consistently higher task performance at much lower cost than naive cloud training.
Go After Cost-Sensitive And Repeat Clients
- Target customers who either can't afford ML engineers or want to reduce headcount by automating training.
- Focus on enterprises that will iterate frequently and prefer repeatable, lower-cost model-retraining workflows.
