
Kate Crawford
Co-founder of the AI Now Institute and a leading researcher on the social and ethical implications of AI.
Top 10 podcasts with Kate Crawford
Ranked by the Snipd community

34 snips
Apr 18, 2023 • 1h 10min
The Discord Leaks, San Francisco Safety, and Kate Crawford on AI
This week’s guest is Kate Crawford, a Principal Researcher at Microsoft Research and a Professor at USC Annenberg, who delves into the societal impacts of AI. She discusses the urgent need for AI literacy and the complexities of online behavior, especially among younger generations. The conversation also touches on the implications of recent classified document leaks and the role of technology in urban safety. Crawford raises critical points about the transformative potential of AI while highlighting the regulatory measures needed to mitigate its risks.

17 snips
May 12, 2024 • 51min
Kate Crawford: A Leading Scholar and Conscience for A.I.
Kate Crawford, a leading scholar of AI, discusses the environmental impacts of large AI systems, including their energy consumption and resource extraction; the societal biases embedded in AI datasets; the exploitation of human labor and working conditions in AI development; the value of interdisciplinary collaboration in AI research; and the ethical concerns surrounding AI applications in medicine and healthcare.

8 snips
Aug 29, 2023 • 27min
Charting the true cost of AI
Guests include academic researcher Kate Crawford on the environmental cost of AI and how policy and cultural expression shape its development, reporter Chris Vallance on DeepMind's watermark system for AI-generated images, Mansoor Hamayun on a smart cooking-gas canister valve that makes fuel more affordable in Rwanda, and Fu’ad Lawal on preserving Nigerian history by digitizing the country's newspapers.

4 snips
Apr 18, 2023 • 27min
To Solve the AI Problem, Rely on Policy, Not Technology
Leading scholar Kate Crawford discusses the potential harms of AI for society, emphasizing the need for policy, not just technology. The conversation explores demystifying AI, addressing bias, and the regulation needed to ensure safety, accountability, and transparency in AI systems.

Jun 22, 2020 • 47min
Interdependence 9: Kate Crawford (AI Now)
Kate Crawford, co-founder of the AI Now Institute and a leading voice on the political implications of AI, dives into the hidden complexities of artificial intelligence. She discusses how platforms like Amazon Echo exploit human and ecological resources and critiques the power dynamics of platform capitalism. The conversation also covers data-driven hiring biases, the impact of algorithmic feedback on communication, and the socio-economic disparities amplified by technology. Crawford's insights provide a thought-provoking look at the intersection of AI, society, and political power.

Oct 16, 2025 • 49min
Leading AI Professor: We Must Address AI's Climate Impact Before It’s Too Late | Kate Crawford
In a riveting discussion, Kate Crawford, a leading AI researcher and author of Atlas of AI, dives into the profound environmental and ethical implications of artificial intelligence. She reveals how AI's massive infrastructure consumes significant resources, challenging the myth that AI is immaterial. Crawford warns that the AI race among nations increases both climate risk and societal harm. With insights on the need for sustainable practices and the dangers of concentrated corporate power, she emphasizes that systemic change is essential for a greener future.

Oct 1, 2025 • 52min
Brené Brown and Kate Crawford on Artificial Intelligence and the Human Spirit
Brené Brown, a research professor known for her work on courage and empathy, joins Kate Crawford, an AI scholar and author, to explore the profound impacts of artificial intelligence on human connection. They discuss how AI, driven by exploitation and material extraction, can lead to feelings of emptiness. Brené critiques reassuring platitudes that oversimplify human traits, while Kate highlights the risks of cognitive outsourcing. Together, they advocate for authentic connections and the importance of discernment in navigating an AI-driven world.

Mar 6, 2025 • 57min
Campus podcast: Why we need interdisciplinarity in teaching and research
Join Gabriele Bammer, a leading voice in interdisciplinary sciences, and Kate Crawford, a pivotal scholar on AI's societal impacts, as they explore the crucial need for breaking down academic silos. They discuss the challenges faced in integrating interdisciplinary methods into education and how collaboration can address complex global issues. Bammer emphasizes effective teamwork and communication, while Crawford highlights the inconsistencies in AI development, advocating for a diverse approach to tackle ethical concerns around technology.

Jun 18, 2024 • 47min
The Real World Cost of AI
Kate Crawford, co-founder of the AI Now Institute and author of Atlas of AI, dives into the complex realities of artificial intelligence. She debunks the myth that AI is immaterial, exposing its vast physical and environmental footprint. Crawford discusses the hidden human labor behind AI systems and the significant biases encoded in massive datasets. She emphasizes the urgency of addressing the planetary costs of AI and advocates for moral restraint in its expansion. Collective action, she argues, is key to shaping AI's future responsibly.

Apr 8, 2019 • 1h 3min
Recode Decode: Meredith Whittaker and Kate Crawford
Meredith Whittaker and Kate Crawford, co-founders of the AI Now Institute, dive deep into the societal implications of artificial intelligence. They discuss the dangers of 'dirty data' and biased search results that can skew AI conclusions. The importance of diversity in tech is highlighted, alongside the ethical concerns of technologies like facial recognition. They also critique current AI self-regulation efforts and explore international approaches, notably China's social credit system. Their insights underscore the need for transparency, accountability, and a more inclusive tech landscape.