How to Avoid Hallucinations in Important Output
There was a lawyer who got in quite a bit of trouble for using an LLM to create documents that cited rulings that didn't exist. How do we avoid hallucinations in important output? And that is the right question to ask.

Jonathan: The solution is not to ask the model for creative output, but instead to constrain it with retrieval-augmented generation so it doesn't make things up.
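
A minimal sketch of the retrieval-augmented pattern Jonathan describes: retrieve real sources first, force the model to answer only from them with citations, and refuse when nothing relevant is found. The corpus, scoring method, and prompt wording below are illustrative assumptions, not anything from the episode.

```python
# Sketch of retrieval-augmented generation (RAG): ground the answer in
# retrieved sources and refuse when no grounding exists.
from dataclasses import dataclass


@dataclass
class Document:
    source: str  # e.g. a case citation the answer can point back to
    text: str


# Toy corpus standing in for a real legal database (hypothetical contents).
CORPUS = [
    Document("Smith v. Jones, 2019", "The court held that the contract was void."),
    Document("Doe v. Acme Corp., 2021", "Summary judgment was denied for lack of evidence."),
]


def retrieve(query: str, corpus: list[Document], k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = [(len(q_terms & set(d.text.lower().split())), d) for d in corpus]
    scored = [(s, d) for s, d in scored if s > 0]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for _, d in scored[:k]]


def build_prompt(query: str, docs: list[Document]) -> str:
    """Constrain the model: it may only use and cite the retrieved sources."""
    context = "\n\n".join(f"[{d.source}]\n{d.text}" for d in docs)
    return (
        "Answer using ONLY the sources below and cite each claim by its label. "
        "If the sources do not contain the answer, say 'I don't know.'\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )


def answer(query: str) -> str:
    docs = retrieve(query, CORPUS)
    if not docs:
        # No grounding available: refuse rather than let the model improvise.
        return "I don't know: no relevant sources were retrieved."
    # A call to any LLM would go here with the constrained prompt; the point
    # is that generation is grounded in, and cited against, retrieved text.
    return build_prompt(query, docs)


if __name__ == "__main__":
    print(answer("Why was summary judgment denied?"))
```

The design choice that matters is the refusal path: when retrieval comes back empty, the system says "I don't know" instead of asking the model to invent an answer, which is exactly the failure mode that produced the nonexistent rulings.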