Examining Hallucinated Cases and Citations in Legal Databases
This chapter examines the phenomenon of hallucinated cases and citations in legal databases, drawing on a replicated study involving LexisNexis and Thomson Reuters. The speakers discuss instances of hallucinated answers, including citations to cases dated in the future, and emphasize the need for transparency about the limitations of AI in legal databases.