

#23 - Engineering Intelligence: How to Build LLM Applications at Scale with Marc Klingen of Langfuse
Apr 17, 2025
Marc Klingen, CEO of Langfuse, shares insights on the journey from startup to a leading open-source LLM engineering platform. He discusses the pivotal challenges of building LLM applications, including the importance of monitoring key performance indicators. The conversation dives into tracing user interactions within AI applications and emphasizes collaboration among diverse roles in AI engineering. Klingen also highlights how observability tools can enhance project success and the benefits of open-source software in fostering community-driven innovation.
Langfuse's Early Code Red
- Langfuse hit a Code Red after neglecting version 2 while building version 3, which led to degraded SLOs and customer churn.
- Some customers mitigated the impact themselves by forking Langfuse, improving it, and self-hosting their fork.
Langfuse's Y Combinator Pivot
- Langfuse started from a pivot during Y Combinator after seeing the power of GPT-3 APIs.
- They quickly grew from demo projects to an open-source LLM engineering platform used by large brands.
Traces Tailored for LLM Apps
- Tracing an LLM application means recording every step of a request, including security checks, retrievals, and multiple LLM calls.
- This structured tracing enables detailed evaluation beyond HTTP status codes, which is crucial for debugging and improving such apps.
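The step-by-step tracing described above can be sketched in plain Python. This is an illustrative data model, not the actual Langfuse SDK: the `Trace` and `Span` types and all names here are hypothetical, chosen only to show how one request fans out into individually inspectable steps (security check, retrieval, LLM calls) instead of a single status code.

```python
# Illustrative sketch of structured tracing for an LLM app.
# NOTE: Trace/Span here are hypothetical stand-ins, not the Langfuse SDK.
import time
from dataclasses import dataclass, field
from typing import Any, List, Optional


@dataclass
class Span:
    """One step of a request, with its own input, output, and timing."""
    name: str
    input: Any = None
    output: Any = None
    start: float = field(default_factory=time.time)
    end: Optional[float] = None


@dataclass
class Trace:
    """A full request: an ordered list of spans, one per step."""
    name: str
    spans: List[Span] = field(default_factory=list)

    def span(self, name: str, input: Any = None) -> Span:
        s = Span(name=name, input=input)
        self.spans.append(s)
        return s


trace = Trace(name="chat-request")

# Step 1: security / guardrail check on the user prompt.
guard = trace.span("security-check", input="user prompt")
guard.output = "ok"
guard.end = time.time()

# Step 2: retrieval of supporting documents.
retrieval = trace.span("retrieval", input="query embedding")
retrieval.output = ["doc-1", "doc-7"]
retrieval.end = time.time()

# Step 3: the LLM call itself, with the retrieved context.
llm = trace.span("llm-call", input="prompt + retrieved context")
llm.output = "model answer"
llm.end = time.time()

# Each step can now be evaluated or debugged individually.
print([s.name for s in trace.spans])
```

Because every step carries its own input and output, an evaluation can score the retrieval and the model answer separately, which is the kind of per-step visibility a single status code cannot provide.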