
Using LLMs to Build AI Co-pilots for Knowledge Workers

The Data Exchange with Ben Lorica

How to Avoid Hallucination in Messages

The AI can just tell you immediately: follow these three steps and then you're done. A human can't remember all that. So the fact that you're actually building an application that is a full-fledged solution, I would lean into that, and hallucination is just one aspect of it. But don't emphasize hallucination too much, for the following reason: it will become a confusing term that everyone will claim to handle, the way everyone says "we do penetration testing." Then when you go somewhere, everyone claims they are dealing with hallucination. And if you have status, they will pick up on that and treat you nicely.
