

AI Is Moving Fast. We Need Laws that Will Too.
Sep 13, 2024
Casey Mock, Policy Director at the Center for Humane Technology, dives into the urgent need for a legal framework to regulate AI. With companies racing ahead, he discusses the importance of holding AI developers accountable for harms caused by their products. Mock highlights real-world examples, such as deepfake scams, to emphasize the risks of unregulated AI. He advocates for innovative yet flexible regulations that balance consumer protection with the need for innovation, exploring how to design laws that keep pace with fast-moving technology.
Episode notes
Policy Team Incentives
- Tech company policy teams are incentivized to maximize shareholder value.
- This limits their ability to offer good-faith policy input.
AI as a Product
- AI and social media should be treated under liability law as manufactured products, not services.
- This would hold companies responsible for harms caused by defects in their products.
Airline Chatbot Fiasco
- An airline's chatbot gave a customer incorrect information about its bereavement fare policy.
- When the airline refused to honor the chatbot's incorrect guidance, it was held liable.