

S2E8 - Subnet 56 Gradients.io w/ wanderingweights
Mar 28, 2025
The discussion kicks off with Rayon Labs' no-code training subnet, which simplifies AI model building. Listeners learn about the dynamic trends in Bittensor's subnets and the competitive landscape of machine learning training. The team dives into validator operations and the critical role of synthetic data in maintaining trust within the network. They emphasize the power of collaboration in creating innovative models and the strategic application of AI to address challenges. Flexibility and client-specific solutions are key takeaways as the technology continues to advance.
AI Snips
Using Logprobs for Verification
- Verifying miner output in log-probability space, rather than by comparing output text, overcomes sampling non-determinism.
- This allows consistent model verification even though the sampled tokens vary from run to run.
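A minimal sketch of the idea: sampling is stochastic, so two runs of the same model can emit different tokens, but the log probability a model assigns to a *fixed* token sequence is deterministic. A validator can therefore re-score a miner's completion and compare per-token logprobs within a tolerance instead of demanding an exact text match. The function name, tolerance, and the example logprob values below are illustrative assumptions, not the subnet's actual implementation.

```python
import math

def verify_logprobs(reference_logprobs, miner_logprobs, tol=1e-3):
    """Compare per-token log probabilities instead of sampled text.

    reference_logprobs: validator's re-scoring of the miner's completion
    miner_logprobs: logprobs the miner reported for the same tokens
    Returns True when every token's logprob agrees within `tol`.
    """
    if len(reference_logprobs) != len(miner_logprobs):
        return False
    return all(
        math.isclose(r, m, abs_tol=tol)
        for r, m in zip(reference_logprobs, miner_logprobs)
    )

# Hypothetical per-token logprobs for one completion.
reference = [-0.12, -1.85, -0.40, -2.10]
honest_miner = [-0.1200, -1.8501, -0.4000, -2.0999]   # tiny float noise
dishonest_miner = [-0.12, -0.95, -0.40, -2.10]        # different weights

print(verify_logprobs(reference, honest_miner))      # True
print(verify_logprobs(reference, dishonest_miner))   # False
```

The key design point is that the check is done against the token sequence the miner actually produced, so it tolerates stochastic decoding while still detecting a mismatched model.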
Optimal Mining Pool Size
- Use competing mining pools of 4 to 16 miners for each training task to balance quality and efficiency.
- Avoid having too few or too many miners on the same task to prevent poor results or wasted compute.
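The 4-to-16 heuristic above can be expressed as a simple clamp when assigning miners to a training task. This helper and its bounds are an illustrative sketch of the guideline, not code from the subnet.

```python
def pool_size(available_miners, min_size=4, max_size=16):
    """Clamp a task's competing mining pool to the suggested 4-16 range.

    Too few miners risks poor results (no real competition);
    too many wastes compute on redundant training runs.
    """
    if available_miners < min_size:
        raise ValueError("not enough miners to form a competitive pool")
    return min(max_size, available_miners)

print(pool_size(6))    # 6  -- within range, use them all
print(pool_size(40))   # 16 -- cap to avoid wasted compute
```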
Client's Personal LLM Wish
- A client once wanted to train a language model on their own data before they died.
- They hoped their grandchildren could one day interact with a personalized model of them, illustrating the deeply personal applications of LLMs.