
Sasha Luccioni

Computer scientist pioneering research on AI's environmental impact; featured on the BBC 100 Women list.

Top 10 podcasts with Sasha Luccioni

Ranked by the Snipd community
43 snips
Jun 8, 2023 • 43min

AI: Is It Out Of Control?

Artificial Intelligence seems more human-like and capable than ever before — but how did it get so good so quickly? Today, we're pulling back the curtain to find out exactly how AI works. And we'll dig into one of the biggest problems that scientists are worried about here: the ability of AI to trick us. We talk to Dr. Sasha Luccioni and Professor Seth Lazar about the science. This episode contains explicit language. There's also a brief mention of suicide, so please take care when listening. Here are some crisis hotlines:

United States: US National Suicide Prevention Lifeline 1-800-273-TALK (8255) (Online chat available); US Crisis Text Line: text "GO" to 741741
Australia: Lifeline 13 11 14 (Online chat available)
Canada: Canadian Association for Suicide Prevention (see link for phone numbers listed by province)
United Kingdom: Samaritans 116 123 (UK and ROI)
Full list of international hotlines here

Find our transcript here: https://bit.ly/ScienceVsAI

In this episode, we cover:
(00:00) 64,000 willies
(05:13) A swag pope
(06:36) Why is AI so good right now?
(09:06) How does AI work?
(17:43) Opening up AI to everyone
(20:42) A rogue chatbot
(27:50) Charming chatbots
(29:42) A misinformation apocalypse?
(33:16) Can you tell me something good?!
(36:08) Citations, credits, and a special surprise…

This episode was produced by Joel Werner, with help from Wendy Zukerman, Meryl Horn, R.E. Natowicz, Rose Rimler, and Michelle Dang. We're edited by Blythe Terrell. Fact checking by Erica Akiko Howard. Mix and sound design by Jonathon Roberts. Music written by Bobby Lord, Peter Leonard, Emma Munger, So Wylie and Bumi Hidaka. Thanks to all the researchers we spoke to including Dr Patrick Mineault, Professor Melanie Mitchell, Professor Arvind Narayanan, Professor Philip Torr, Stella Biderman, and Arman Chaudhry. Special thanks to Katie Vines, Allison, Jorge Just, the Zukerman Family and Joseph Lavelle Wilson. Science Vs is a Spotify Original Podcast. Follow Science Vs on Spotify, and if you wanna receive notifications every time we put out a new episode, tap the bell icon! Learn more about your ad choices. Visit podcastchoices.com/adchoices
34 snips
Oct 31, 2023 • 11min

AI is dangerous, but not for the reasons you think | Sasha Luccioni

Sasha Luccioni, an AI ethics researcher, discusses the current negative impacts of AI, such as carbon emissions, copyright infringement, and biased information. She offers practical solutions to regulate AI for inclusivity and transparency. Topics include environmental costs, bias in facial recognition, and addressing biases in AI systems.
22 snips
Nov 14, 2024 • 30min

Dr Sasha Luccioni: The climate cost of AI

Dr. Sasha Luccioni, the AI and Climate Lead at Hugging Face, discusses the hidden carbon costs of generative AI technologies like ChatGPT. She sheds light on the significant energy consumption of data centers and the collaborative efforts to analyze AI's full lifecycle. Luccioni shares her journey from AI research to environmental advocacy, emphasizing the ethical implications of tech companies' climate commitments. The conversation also tackles AI's potential role in mitigating climate change while highlighting the need for transparency and accountability in the industry.
11 snips
Jun 3, 2024 • 48min

Energy Star Ratings for AI Models with Sasha Luccioni - #687

Sasha Luccioni, AI and Climate lead at Hugging Face, discusses the energy consumption of AI models. She compares the efficiency of pre-trained models vs. task-specific models, highlighting the implications and challenges. Sasha introduces Energy Star Ratings for AI Models as a system to select energy-efficient models. The discussion explores the environmental impact, challenges in evaluation, and the importance of documentation standards for AI models.
10 snips
Jul 18, 2024 • 60min

Generative AI is a Climate Disaster w/ Sasha Luccioni

Sasha Luccioni discusses the environmental impact of generative AI, highlighting companies' rising emissions and struggles to meet climate goals. The conversation explores the energy consumption differences between traditional AI and generative AI, and the push for general AI models driven by major tech companies. It also addresses the lack of transparency in obtaining data on GPU energy usage and the tension between profit-driven tech companies and environmental responsibility.
9 snips
Jan 10, 2024 • 34min

Managing AI’s Carbon Footprint

Sasha Luccioni, an AI researcher and climate lead at Hugging Face, joins Azeem Azhar to discuss the environmental impact of AI, including energy consumption and carbon emissions. They explore the challenges of categorizing AI's climate impact and the importance of setting standards for generative AI models. They also touch on the challenges of AI infrastructure, existential risks, and how the focus on speculative AI risks can distract from more immediate harms.
8 snips
Apr 18, 2024 • 1h 3min

Sasha Luccioni: Connecting the Dots Between AI's Environmental and Social Impacts

Sasha Luccioni, AI and Climate Lead at Hugging Face, discusses the environmental impact of AI systems, quantifying emissions, efficient hardware, power-hungry processing, and biases in AI models. The episode explores the challenges of mitigating the carbon footprint of AI experimentation, hardware utilization, energy consumption, and the lifecycle assessment of AI models, as well as the difficulty of measuring AI ethics and societal impacts.
May 10, 2024 • 50min

Our Tech has a Climate Problem: Here's how we solve it

AI researchers and climate researchers discuss the environmental impact of AI models, mining for essential minerals, reimagining transportation for sustainability, battery recycling, and using satellites and AI to monitor greenhouse gas emissions. They also explore the challenges of space debris collisions and harnessing AI for wind energy and electricity supply forecasting to mitigate climate change.
Dec 15, 2023 • 13min

AI is dangerous, but not for the reasons you think | Sasha Luccioni

Sasha Luccioni, an AI ethics researcher, discusses the current negative impacts of AI, including carbon emissions, copyright infringement, and biased information. She offers practical solutions for regulation to ensure inclusivity and transparency.
Nov 8, 2023 • 1h 1min

Episode 19: The Murky Climate and Environmental Impact of Large Language Models, November 6 2023

AI researchers Emma Strubell and Sasha Luccioni discuss the environmental impact of large language models, addressing carbon emissions, water use, and energy consumption. They emphasize the need for education, transparency, and awareness within the AI community. The episode also covers AI's negative effects on dating apps, ethical concerns in relationship advice, debunking misconceptions about AI capabilities, and the potential of large language models to generate hateful content.