"And from an applied ethics perspective, I think the most important thing is: if we want to minimize suffering in the world, and if we want to minimize animal suffering, we should always err on the side of caution; we should always be on the safe side."
Should we advocate for a moratorium on the development of artificial sentience? What might that look like, and what would be the challenges?
Thomas Metzinger was a full professor of theoretical philosophy at the Johannes Gutenberg Universität Mainz until 2022 and is now a professor emeritus. Before that, he was president of the German Cognitive Science Society from 2005 to 2007, president of the Association for the Scientific Study of Consciousness from 2009 to 2011, and has been an adjunct fellow at the Frankfurt Institute for Advanced Studies since 2011. He is also a co-founder of the German Effective Altruism Foundation, president of the Barbara Wengeler Foundation, and a member of the advisory board of the Giordano Bruno Foundation. In 2009, he published a popular book, The Ego Tunnel: The Science of the Mind and the Myth of the Self, which addresses a wider audience and discusses the ethical, cultural, and social consequences of consciousness research. From 2018 to 2020, Metzinger served as a member of the European Commission's High-Level Expert Group on Artificial Intelligence.
Topics discussed in the episode:
- 0:00 Introduction
- 2:12 Defining consciousness and sentience
- 9:55 What features might a sentient artificial intelligence have?
- 17:11 Moratorium on artificial sentience development
- 37:46 The case for a moratorium
- 49:30 What would a moratorium look like?
- 53:07 Social hallucination problem
- 55:49 Incentives of politicians
- 1:01:51 Incentives of tech companies
- 1:07:18 Local vs. global moratoriums
- 1:11:52 Repealing the moratorium
- 1:16:01 Information hazards
- 1:22:21 Trends in thinking on artificial sentience over time
- 1:39:38 What are the open problems in this field, and how might someone work on them with their career?
Resources discussed in the episode are available at https://www.sentienceinstitute.org/podcast