

Language Model Chat Provider (aka BYOK) API with Logan Ramos
Sep 8, 2025
Logan Ramos, a Senior Software Engineer on the VS Code team, dives into the Bring Your Own Key (BYOK) feature, which lets users plug their own language models into VS Code. He unpacks the trade-offs between cloud and local models regarding privacy and performance. Logan explains the new extensible API that lets model providers publish their own marketplace extensions. Additionally, he shares insights on early adopter feedback and the team's ‘dogfooding’ experiences, revealing how they’re refining model selection for users.
AI Snips
Origin Of 'Bring Your Own Key'
- Logan named the feature "bring your own key" early in development and later clarified that its official name is the Language Model Chat Provider API.
- He explains that BYOK lets users add any model to GitHub Copilot and use it via chat and agents as if it were built in; the sketch below shows how such models surface to extension code.
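
For a sense of how a model added this way becomes usable like a built-in one, here is a minimal sketch using the public `vscode.lm.selectChatModels` API; the empty selector and the hypothetical vendor id mentioned in the comments are illustrative, not values from the episode.

```typescript
import * as vscode from 'vscode';

// Illustrative only: pick whichever chat model the user has configured
// (built-in or added via a language model chat provider) and send a request.
async function askSelectedModel(prompt: string, token: vscode.CancellationToken) {
  // An empty selector returns all available models; a selector such as
  // { vendor: 'my-byok-provider' } (hypothetical id) would narrow the list.
  const [model] = await vscode.lm.selectChatModels();
  if (!model) {
    throw new Error('No language model is available.');
  }

  const messages = [vscode.LanguageModelChatMessage.User(prompt)];
  const response = await model.sendRequest(messages, {}, token);

  // The response streams back as text fragments.
  let result = '';
  for await (const fragment of response.text) {
    result += fragment;
  }
  return result;
}
```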
Why BYOK Exists
- VS Code can't ship every model because each new model increases backend GPU fragmentation and cost.
- BYOK shifts that burden to external providers and lets users pick niche or cutting-edge models themselves.
Choose BYOK For Access Or Privacy
- Use BYOK when you need access to a specific model or stricter privacy guarantees, including local models for regulated data.
- Try local providers like LM Studio or Ollama for on-device, privacy-first inference; a rough provider sketch follows below.
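
As a rough sketch of what a marketplace provider extension could look like under the API discussed in the episode, the snippet below forwards chat requests to a local Ollama server. The class shape, method names, and the `registerLanguageModelChatProvider` call are approximations of the provider API rather than its exact surface, and the endpoint and model id are placeholders.

```typescript
import * as vscode from 'vscode';

// Sketch only: a provider that proxies chat requests to a local Ollama server.
// The method names approximate the Language Model Chat Provider API shape
// discussed in the episode and may not match the shipped signatures exactly.
class OllamaChatProvider {
  // Advertise the locally available models to the model picker.
  async provideLanguageModelChatInformation() {
    return [
      { id: 'llama3.1', name: 'Llama 3.1 (local)', maxInputTokens: 8192, maxOutputTokens: 4096 },
    ];
  }

  // Forward the conversation to Ollama's /api/chat endpoint and report the reply.
  async provideLanguageModelChatResponse(
    modelId: string,
    messages: { role: 'user' | 'assistant' | 'system'; content: string }[],
    progress: vscode.Progress<{ value: string }>,
    token: vscode.CancellationToken
  ): Promise<void> {
    const res = await fetch('http://localhost:11434/api/chat', {
      method: 'POST',
      body: JSON.stringify({ model: modelId, messages, stream: false }),
    });
    if (token.isCancellationRequested) {
      return;
    }
    const body = (await res.json()) as { message?: { content?: string } };
    progress.report({ value: body.message?.content ?? '' });
  }
}

export function activate(context: vscode.ExtensionContext) {
  // Hypothetical registration call; the real API ties the provider to a vendor
  // id declared in the extension's package.json contribution point.
  const registration = (vscode.lm as any).registerLanguageModelChatProvider(
    'ollama-sample',
    new OllamaChatProvider()
  );
  context.subscriptions.push(registration);
}
```

A real provider would typically stream partial tokens through `progress` as they arrive instead of reporting one final chunk, which is what makes chat responses feel as responsive as the built-in models.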