How AI Is Built

#049 BAML: The Programming Language That Turns LLMs into Predictable Functions

May 20, 2025
In this discussion, Vaibhav Gupta, co-founder of Boundary, dives into BAML, a programming language designed to streamline AI pipelines. He emphasizes treating large language model (LLM) calls as typed functions, which enhances reliability and simplifies error handling. The podcast explores concepts like Schema-Aligned Parsing and the drawbacks of traditional JSON constraints. Vaibhav also discusses the importance of simplicity in programming and how BAML facilitates better interactions between technical and non-technical users, ensuring robust AI solutions.
INSIGHT

LLMs Require Fault-Tolerant Design

  • Unlike traditional backend systems, which are engineered to be reliable, AI pipelines must be designed for fault tolerance.
  • Treat LLM calls as unreliable function calls that need fallbacks and error handling to build robust apps.
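The fallback idea above can be sketched in a few lines of Python. This is a minimal illustration of the principle, not BAML's actual mechanism (BAML configures retries and fallbacks declaratively); the client names and stub functions here are hypothetical stand-ins for real LLM providers.

```python
from typing import Callable

def call_with_fallback(prompt: str, clients: list[Callable[[str], str]]) -> str:
    """Try each client in order; return the first successful result."""
    errors: list[Exception] = []
    for client in clients:
        try:
            return client(prompt)
        except Exception as exc:
            errors.append(exc)
    raise RuntimeError(f"all {len(clients)} clients failed: {errors}")

# Hypothetical stubs standing in for real model providers.
def flaky_primary(prompt: str) -> str:
    raise TimeoutError("primary model timed out")

def stable_fallback(prompt: str) -> str:
    return f"answer to: {prompt}"
```

Calling `call_with_fallback("hi", [flaky_primary, stable_fallback])` survives the primary's timeout and returns the fallback's answer, which is the fault-tolerant behavior the snip describes.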
ADVICE

Treat Prompts as Code or Data

  • Store and manage prompts either as code or as parameters in a database, so they can be experimented with and promoted to production.
  • Keep prompt management flexible enough to support your team's workflow, with robust versioning.
INSIGHT

LLMs as Typed Functions

  • View LLMs purely as functions that transform inputs into outputs, not as magic.
  • This abstraction simplifies orchestration and tool calling, because everything becomes a typed function call.
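The "typed function" framing can be sketched as a wrapper that takes text in and returns a typed value out, tolerating some of the loose formatting LLMs produce (a simplified nod to Schema-Aligned Parsing, not its real implementation). Everything here, including the `fake_model` stub, is a hypothetical illustration.

```python
import dataclasses
import json
import re
from typing import Callable

@dataclasses.dataclass
class Sentiment:
    label: str
    score: float

def classify(text: str, model: Callable[[str], str]) -> Sentiment:
    """Treat the LLM as a typed function: text in, Sentiment out."""
    raw = model(f"Classify the sentiment of this text as JSON: {text}")
    # Tolerate markdown code fences around the JSON, a common LLM quirk.
    cleaned = re.sub(r"^```(json)?|```$", "", raw.strip(), flags=re.M).strip()
    data = json.loads(cleaned)
    return Sentiment(label=str(data["label"]), score=float(data["score"]))

# Stub model returning fenced JSON, as real models often do.
def fake_model(prompt: str) -> str:
    return '```json\n{"label": "positive", "score": 0.93}\n```'
```

From the caller's point of view, `classify` is just a function returning a `Sentiment`; the prompt, the model call, and the lenient parsing are all hidden behind the type signature, which is the simplification the snip is pointing at.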