

#264 Amr Awadallah: Vectara’s Mission To Make AI Hallucination-Free & Enterprise Ready
Jun 22, 2025
Amr Awadallah, CEO and co-founder of Vectara, is on a mission to make AI accurate and secure. He discusses why AI hallucinations happen and the strategies Vectara uses to tackle them. Awadallah highlights the importance of data quality, real-time hallucination detection, and a collaborative approach to improving AI performance. He also covers why trust and transparency matter for enterprise AI. This conversation offers valuable insights for anyone interested in reliable, production-ready AI.
Vectara’s Core AI Focus
- Vectara focuses on building AI agents that are accurate and do not hallucinate.
- They also emphasize security against prompt attacks and explainability for enterprise trust.
RAG and Real-Time Fact Checking
- RAG (Retrieval-Augmented Generation) is essential for grounding AI responses in facts to reduce hallucinations.
- Vectara uses a real-time hallucination evaluation model that fact-checks responses in milliseconds (see the sketch after this list).
Real-World AI Hallucination Risks
- A lawyer who relied on ChatGPT was sanctioned by the court for citing fabricated case references.
- Air Canada lost a court case over a ticket offer its chatbot hallucinated, priced at $1.