

Computer Says Maybe
Alix Dunn
Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.
Episodes

Jul 25, 2025 • 51min
After the FAccT: Labour and Misrepresentation
Did you miss FAccT? We interviewed some of our favourite session organisers!

More like this: Part One of our FAccT roundup: Materiality and Militarisation.

Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

In part two we look at how AI is used to misrepresent people, through things like image generation, and even care labour. These are conversations about AI misrepresenting hidden identities, care work becoming data work, how pride and identity are tied to labour — and how labour organisers are building solidarity and movement around all of this.

Who features in this episode:

Priya Goswami brought a multimedia exhibition to FAccT: Digital Bharat. It explores the invisibilised care work and manual labour of women in India, and how their day-to-day lives have become mediated by digital public infrastructures.

Kimi Wenzel organised Invisible by Design? Generative AI and Mirrors of Misrepresentation, which invited users to confront generated images of themselves and discuss issues of representation within these systems.

Alex Hanna and Clarissa Redwine ran the AI Workers Inquiry, which brought people together to share how AI has transformed their work, identify common ground, and potentially begin building resistance.

Further reading & resources:

Circuit Breakers — tech worker conference organised by Clarissa Redwine
Kimi Wenzel’s research
Buy The AI Con by Alex Hanna and Emily Bender

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Jul 23, 2025 • 14min
Short: Musk: Reanimating Apartheid w/ Nic Dawes
In May, Grok couldn’t stop talking about white genocide. This injection of right-wing South African politics prompted a conversation with a Musk contemporary, Nic Dawes.

In this short, Nic shares his perspective on how white South African communities have dealt with apartheid’s end — and how Musk is essentially seeking out an information environment that can recreate the apartheid information system. Grok is just an extension of a media ecosystem designed to soothe guilt and stoke resentment.

Computer Says Maybe Shorts cover recent news with an expert in our network. If there is a news story you want us to cover, please email pod@themaybe.org

Nic is Executive Director at THE CITY, a news outlet serving the people of New York through independent journalism that holds the powerful to account, deepens democratic participation, and helps make sense of the greatest city in the world. He has led news and human rights organizations on three continents, and was previously Deputy Executive Director of Human Rights Watch, Chief Content Officer of Hindustan Times in Delhi, and Editor-in-Chief of South Africa's Mail & Guardian newspaper.

Jul 18, 2025 • 1h 4min
After the FAccT: Materiality and Militarisation
Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

In part one we explore the depth of AI’s hidden material impacts, including its use in military applications and in aiding genocide. One of our interviewees talked about why they spoke up at the town hall — questioning why FAccT, the biggest AI ethics conference there is, accepts sponsorship from those same military contractors.

Who we interviewed for Part One:

Charis Papaevangelou co-organised a CRAFT session called The Hidden Costs of Digital Sovereignty. Greece is trying to position itself as a central digital hub by building data centres and participating in the ‘fourth industrial revolution’ — but what does this actually mean for the people and infrastructure of Greece?

Georgia Panagiotidou ran a session on The Tools and Tactics for Supporting Agency in AI Environmental Action — offering some ideas on how the community can get together and meaningfully resist extractive practices.

David Widder discussed his workshop on Silicon Valley and The Pentagon, and his research on the recent history of DoD funding for academic papers — is it ever worth taking military money, even for basic research?

Tania Duarte offered something very different: a demonstration of two workshops she runs for marginalised groups, to better explain the true materiality of AI and build knowledge that gives people more agency over the dominant narratives and framings in the industry.

Further reading & resources:

Recording of Charis’s CRAFT session: The Hidden Cost of Digital Sovereignty
Cloud hiding undersea: Cables & Data Centers in the Mediterranean crossroads by Theodora Kostaka
Basic Research, Lethal Effects: Military AI Research Funding as Enlistment and Why ‘open’ AI systems are actually closed and why this matters, both by David Widder
The video that David quoted the Carnegie Mellon professor from — David was paraphrasing in the episode!
We and AI & Better Images of AI
More on Georgia Panagiotidou’s work and resources from her session

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Jul 11, 2025 • 38min
Making Myths to Make Money w/ AI Now
AI Now have just released their 2025 AI Landscape report — Artificial Power. Alix sat down with two of its authors, Amba Kak and Sarah Myers West, for a light unpacking of the themes within.

This report isn’t a boring survey of what AI Now have been doing this year; it’s a comprehensive view of the state of AI, and the concentrated powers that prop it up. What are the latest AI-shaped solutions that the hype guys are trying to convince us are real? And how can we reclaim a positive agenda for innovation — and unstick ourselves from a path towards pseudo-religious AGI?

Further reading & resources:

Read the AI Now 2025 Landscape Report: Artificial Power

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Amba Kak has spent the last fifteen years designing and advocating for technology policy in the public interest, across government, industry, and civil society roles — and in many parts of the world. Amba brings this experience to her current role co-directing AI Now, a New York-based research institute where she leads on advancing diagnosis and actionable policy to tackle concerns with artificial intelligence and concentrated power. She has served as Senior Advisor on AI to the Federal Trade Commission and was recognized as one of TIME’s 100 Most Influential People in AI in 2024.

Sarah Myers West has spent the last fifteen years interrogating the role of technology companies and their emergence as powerful political actors on the front lines of international governance. Sarah brings this depth of expertise to policymaking in her current role co-directing AI Now, with a focus on addressing the market incentives and infrastructures that shape tech’s role in society at large and ensuring it serves the interests of the public. Her forthcoming book, Tracing Code (University of California Press), draws on years of historical and social science research to examine the origins of data capitalism and commercial surveillance.

Jul 4, 2025 • 55min
Is Computer Science Made for Dudes? w/ Felienne Hermans
Felienne Hermans calls herself an ‘involuntary ethnographer of computer science’. She studies the culture behind programming, and challenges the dominant idea that learning to program has to be painful.

Alix and Felienne chat about the history of programming and how it went from multidisciplinary and inclusive to masochistic and exclusive. They also dig into all the ways it excludes women and people who do not speak English.

Further reading & resources:

Scratch — a high-level programming language aimed at kids
Hedy — the programming language that Felienne designed
Join in and help out with Hedy!
GenderMag by Margaret Burnett — how to ensure more gender inclusiveness in your software
Elm — an easy and kind browser-based programming language
A Case for Feminism in Programming Language Design by Felienne Hermans & Ari Schlesinger
A Framework for the Localization of Programming Languages by Felienne Hermans & Alaaeddin Swidan

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Felienne is the creator of the Hedy programming language, a gradual and multilingual programming language designed for teaching. She is the author of “The Programmer’s Brain”, a book that helps programmers understand how their brain works and how to use it more effectively. In 2021, Felienne was awarded the Dutch Prize for ICT research. She also has a weekly column on BNR, a Dutch radio station.

Jun 27, 2025 • 46min
The Elephant in the Algorithm: Live from ZEG Fest in Tbilisi
Smart people focused on the politics of technology get it. We trade helpful high-level concepts like surveillance capitalism, automated inequality, and enshittification. But even as some of these ideas make it into the mainstream, normies aren’t getting the message. We need stories for that. But how? How do we take the technical jargon and high-level concepts that dominate tech narratives and instead create stories that are personal, relatable, and powerful? And how do we combat the hero-god narratives of Silicon Valley without reinforcing them?

Alix went to storytelling festival ZEG Fest in Tbilisi to chat with three amazing storytellers about that challenge:

Armando Iannucci, creator of Veep and The Thick of It, discusses how to use humour and satire to keep things simple — and how stories are not ‘made up’, but rather a way to relay a series of facts and concepts that are complex and difficult to process.

Chris Wylie, Cambridge Analytica whistleblower, on how the promise of superintelligence and transhumanism is basically a religious prophecy. His new show Captured explores the stories that tech elites are telling us about our utopian AI future.

Adam Pincus, producer of The Laundromat and Leave No Trace, shares his frustrations with the perceived inevitability of AI in his day-to-day, and tells us more about his podcast series ‘What Could Go Wrong?’, in which he explores writing a Contagion sequel with screenwriter Scott Burns.

Further reading & resources:

Captured: The Secret Behind Silicon Valley’s AI Takeover — limited podcast series featuring Chris Wylie
‘Contagion’ Screenwriter Scott Z. Burns Asks AI to Write a Sequel to Pandemic Film in Audible Original Series ‘What Could Go Wrong?’ — Variety article
What Could Go Wrong? — limited podcast series by Scott Burns

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Jun 20, 2025 • 38min
Is Digitisation Killing Democracy? w/ Marietje Schaake
There has been an intentional and systematic narrative push telling governments that they are not good enough to provide their own public infrastructure, or to regulate the tech companies that provide it for them.

Shocking: these narratives stem from large tech companies, and they represent what Marietje Schaake calls a Tech Coup — also the title of her book (which you should buy!).

The Tech Coup refers to the inability of democratic policymakers to provide oversight, regulation, and even visibility into the structural systems that big tech is building, managing, and selling. Marietje and Alix discuss what happens when you have a system of states whose knowledge and confidence have been gutted over decades — hindering them from providing good services, and from understanding how to meaningfully regulate the tech space.

Further reading & resources:

Buy The Tech Coup by Marietje Schaake

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Marietje Schaake is a non-resident Fellow at Stanford’s Cyber Policy Center and at the Institute for Human-Centered AI. She is a columnist for the Financial Times and serves on a number of not-for-profit boards, as well as the UN's High-Level Advisory Body on AI. Between 2009 and 2019 she served as a Member of the European Parliament, where she worked on trade, foreign, and tech policy. She is the author of The Tech Coup.

Jun 13, 2025 • 1h
AI in Gaza: Live from Mexico City
This episode contains descriptions of torture methods, automated human targeting by machines, and psychological warfare throughout.

Last week Alix hosted a live show in Mexico City right after REAL ML. Four panellists discussed a hugely important topic, one wrongfully deemed taboo by other conferences: the use of AI and other technologies to support the ongoing genocide in Palestine.

Here’s a preview of what the four speakers shared:

Karen Palacio, AKA kardaver, gave us an overview of Operation Condor — a program of psychological warfare that ran in the late 20th century in South America to suppress activist voices.

Marwa Fatafta explains how these methods are still used today against Palestinians: coordinated surveillance projects make Palestinian citizens feel they are living in a panopticon, with the granular data storage and processing facilitated by AWS, Google, and Azure.

Matt Mahmoudi describes how these surveillance projects have crystallised into sophisticated CCTV and facial recognition networks through which Palestinians are continuously dehumanised via face-scanning and arbitrary checks that restrict their movement.

Wanda Muñoz discusses how fully autonomous weapons obviously violate human rights in all kinds of ways — yet ‘AI ethics’ frameworks never make any considerations for machines that make life-or-death decisions.

Further reading & resources:

The Biometric State by Keith Breckenridge — where the phrase ‘automated apartheid’ was conceived
COGWAR Report by Karen Palacio, AKA kardaver

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Wanda Muñoz is an international consultant with twenty years of experience in the design, implementation, and evaluation of programs and policies on human rights, gender equality, inclusion, and the rights of people with disabilities. Wanda has worked for international NGOs and UN organizations in Asia, Africa, Europe, and Latin America. She became involved in the field of artificial intelligence in 2017, initially through the analysis of its intersection with International Humanitarian Law on the issue of autonomous weapons systems, and later focusing on the intersection between human rights and AI. In 2020, she was nominated by the Ministry of Foreign Affairs of Mexico as an independent expert at the Global Partnership on Artificial Intelligence (GPAI), where she contributed to various publications and panels, and led the design of the research “Towards true gender equality and diversity in AI”, which is currently being implemented. In 2020, Wanda Muñoz was recognized by the Nobel Women's Initiative as "a peacebuilder working for peace, justice and equality" and by UNLIREC as one of Latin America's "forces of change, working for humanitarian disarmament, non-proliferation and arms control". Wanda also recently won the DEI Champion of the Year Award from Women in AI.

Karen Palacio, aka kardaver, is an interdisciplinary digital artist, industrial programmer specialized in AI, and data scientist from Córdoba, Argentina. She researches and creates through iterative loops of implementation and reflection, aiming to understand what it means to articulate artistic-technological discourses from the Global South. Her performances, installations, and audiovisual works engage critically and rootedly with the depths of computation, the histories of computing and archives, freedom of knowledge, feminisms, and the pursuit of technological sovereignty. She develops and works with Free Software in her processes, resemanticizing technologies she knows from her background as an industrial programmer.

Dr Matt Mahmoudi is Assistant Professor in Digital Humanities at the University of Cambridge, and a Researcher/Advisor on Artificial Intelligence and Human Rights at Amnesty International. Matt’s work has examined AI-driven surveillance from the NYPD’s surveillance machine to Automated Apartheid in the occupied Palestinian territory. Matt is the author of Migrants in the Digital Periphery: New Urban Frontiers of Control (University of California Press, February 2025), and co-editor of Resisting Borders & Technologies of Violence (Haymarket, 2024), together with Mizue Aizeki and Coline Schupfer.

Marwa Fatafta leads Access Now’s policy and advocacy work on digital rights in the Middle East and North Africa (MENA) region. Her work spans a number of issues at the nexus of human rights and technology, including content governance and platform accountability, online censorship, digital surveillance, and transnational repression. She has written extensively on the digital occupation in Palestine, and focuses on the role of new technologies in armed conflicts and humanitarian contexts and their impact on historically marginalized and oppressed communities. Marwa is a Policy Analyst at Al-Shabaka: The Palestinian Policy Network, an advisory board member of the Tahrir Institute for Middle East Policy, and an advisory committee member for Bread&Net. Marwa was a Fulbright scholar in the US and holds an MA in International Relations from the Maxwell School of Citizenship and Public Affairs, Syracuse University, and a second MA in Development and Governance from the University of Duisburg-Essen.

Jun 6, 2025 • 44min
Logging Off w/ Adele Walton
Adele Walton, a British Turkish journalist and online safety advocate, shares her journey from a youth consumed by social media to a campaigner for safer online spaces. She candidly discusses the emotional toll of her sister's loss due to online harms and her motivations for writing *Logging Off*. The conversation delves into the burdens of online perfectionism, the need for trauma-informed design in technology, and the pressing demand for accountability in digital policy, emphasizing the importance of genuine human connections amidst a complex digital landscape.

Jun 4, 2025 • 21min
Short: Sam Altman’s World w/ Billy Perrigo
Sam Altman is doing another big infrastructure push with World (previously Worldcoin) — the universal human verification system.

We had journalist Billy Perrigo on to chat about what’s what with World. Is Sam Altman just providing a solution to a problem that he himself caused with OpenAI? Do we really need human verification, or is this just a way to sidestep the AI content watermarking issue?

Further reading & resources:

The Orb Will See You Now by Billy Perrigo
The ethical implications of AI agents by DeepMind

Computer Says Maybe Shorts bring in experts to give their ten-minute take on recent news. If there’s ever a news story you think we should bring in expertise on for the show, please email pod@saysmaybe.com

Billy Perrigo is a correspondent at TIME, based in the London bureau. He covers the tech industry, focusing on the companies reshaping our world in strange and unexpected ways. His investigation ‘Inside Facebook’s African Sweatshop’ was a finalist for the 2022 Orwell Prize.