With so much focus on the use of large language models in law practice, the Kelvin Large Language Model – or KL3M (pronounced CLEM) for short – stands out for two reasons. For one, it is the first LLM built entirely from scratch specifically for the legal market. In addition, it is the first LLM in any domain to be trained entirely on clean, legally permissible data and to be certified as such by the organization Fairly Trained.
To discuss how KL3M was developed and why this built-from-scratch, domain-specific LLM is significant for the legal industry, our guest for today’s LawNext is Jillian Bommarito, chief risk officer at 273 Ventures, the company that developed KL3M. Not only was Bommarito involved in developing KL3M and the data set used to train it, but she also oversaw the process of earning KL3M its Fairly Trained certification.
Regular listeners of LawNext may remember my interview last year with two of the other principals of 273 Ventures, CEO Michael Bommarito, who is Jillian’s husband, and Chief Science Officer Daniel Katz, who were on this show just after they conducted the first experiment in having GPT take the bar exam. All three, along with Katz’s wife Jessica Katz, had previously founded the legal AI and consulting company LexPredict, which was acquired in 2018 by the global law company Elevate.
In today’s conversation, Bommarito talks about what went into developing the model and creating the dataset used to train it, what it offers the legal market, and why and how a law firm might choose this model over others that are commercially available.
Thank You To Our Sponsors
This episode of LawNext is generously made possible by our sponsors. We appreciate their support and hope you will check them out.
If you enjoy listening to LawNext, please leave us a review wherever you listen to podcasts.