In a surprising turn of events, a massive lawsuit against Spotify over unlicensed songs ended up benefiting the streaming company. Xiyin Tang, a former entertainment lawyer and now a UCLA law professor, explains the potential silver lining of such lawsuits. The case began when songwriter Melissa Ferrick and musician David Lowery filed separate lawsuits that were later consolidated. Even though Spotify had streamed countless unlicensed songs, the company, with Xiyin Tang on its defense team, navigated the legal battle to an unexpectedly favorable outcome.
When best-selling thriller writer Douglas Preston began playing around with OpenAI's new chatbot, ChatGPT, he was, at first, impressed. But then he realized how much in-depth knowledge GPT had of the books he had written. When prompted, it supplied detailed plot summaries and descriptions of even minor characters. He was convinced it could only pull that off if it had read his books.
Large language models, the kind of artificial intelligence underlying programs like ChatGPT, do not come into the world fully formed. They first have to be trained on incredibly large amounts of text. Douglas Preston and 16 other authors, including George R.R. Martin, Jodi Picoult, and Jonathan Franzen, were convinced that their novels had been used to train GPT without their permission. So, in September, they sued OpenAI for copyright infringement.
This sort of thing seems to be happening a lot lately: one giant tech company or another "moves fast and breaks things," exploring the edges of what might or might not be allowed without first asking permission. On today's show, we try to make sense of what OpenAI allegedly did by training its AI on massive amounts of copyrighted material. Was that good? Was it bad? Was it legal?
Help support Planet Money and get bonus episodes by subscribing to Planet Money+ in Apple Podcasts or at plus.npr.org/planetmoney.

Learn more about sponsor message choices: podcastchoices.com/adchoices

NPR Privacy Policy