
#147 Federal Bid Protests and the Unexpected Twists of AI

Let's Get Surety


How Often Do Legal AIs Hallucinate?

David explains hallucination rates in legal LLMs, noting that even legal-specific models still produce significant error rates.

Snippet starts at 18:20.
