

From Data Pipelines to Agentic Applications: Deploying LLM Apps That Actually Work
May 1, 2025
Spencer Cook, Lead Solutions Architect at Databricks, helps financial services firms apply cloud solutions to real-world challenges. He discusses the journey from data hype to practical AI, focusing on real-time data pipelines and vector search, and explains how to minimize AI hallucinations while maximizing business value through clean data and sound governance. The conversation highlights Retrieval-Augmented Generation (RAG) as a way to improve AI accuracy, and the critical role of data management in effective LLM deployment and customer experience.
Data is the Key to AI Value
- Enterprises achieve AI business value mainly through information retrieval and coding assistance use cases.
- The key challenge is organizing and cleaning data to reliably feed advanced AI models.
Avoid LLM Hallucinations
- Certify AI answers with human validation to keep hallucinated responses out of production (a minimal sketch of such a certification loop follows this list).
- Track and augment training data with LLMOps techniques to improve accuracy across domains.
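To make the "certified answers" idea concrete, here is a minimal, hypothetical Python sketch of a human-in-the-loop answer log. The names (Answer, AnswerLog, certify) are illustrative assumptions, not a Databricks or specific LLMOps API; the point is only that model responses get recorded, reviewed by a person, and that the certified pairs can then be reused as evaluation or augmentation data.

```python
# Hedged sketch of a human-validation ("certified answer") loop.
# All class and method names are hypothetical, not a specific LLMOps API.
from dataclasses import dataclass

@dataclass
class Answer:
    question: str
    response: str
    certified: bool = False
    reviewer: str | None = None

class AnswerLog:
    def __init__(self):
        self._log: list[Answer] = []

    def record(self, question: str, response: str) -> Answer:
        # Every model response is logged before it is trusted.
        a = Answer(question, response)
        self._log.append(a)
        return a

    def certify(self, answer: Answer, reviewer: str) -> None:
        # A human reviewer validates the answer before it is served broadly.
        answer.certified = True
        answer.reviewer = reviewer

    def certified_answers(self) -> list[Answer]:
        # Certified question/answer pairs can feed evaluation or fine-tuning sets.
        return [a for a in self._log if a.certified]
```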
Build Reliable AI Data Pipelines
- Build reliable, low-latency data layers, such as feature stores and vector stores, to power AI applications (see the sketch after this list).
- Apply the same established data engineering practices to document and multimedia data as to structured data.
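As a rough illustration of the "vector store as a low-latency data layer" idea, below is a toy, self-contained Python sketch. The bag-of-words embedding and in-memory VectorStore class are stand-ins chosen so the example runs without external services; a real deployment would use a proper embedding model and a managed vector store, fed by the same governed pipelines used for structured data.

```python
# Toy in-memory vector store for retrieval-augmented generation (RAG).
# Illustrative only; a production system would use a real embedding model
# and a managed, low-latency vector store.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Bag-of-words stand-in for an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class VectorStore:
    def __init__(self):
        self.docs: list[tuple[str, Counter]] = []

    def add(self, text: str) -> None:
        self.docs.append((text, embed(text)))

    def search(self, query: str, k: int = 3) -> list[str]:
        # Return the k documents most similar to the query.
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

# Usage: retrieve governed, up-to-date documents to ground an LLM answer.
store = VectorStore()
store.add("Q1 revenue grew 12% year over year.")
store.add("The refund policy allows returns within 30 days.")
print(store.search("what is the refund window?", k=1))
```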