What Haven't We Covered, Bryant?
Bryant: The status quo is not perfect, the technologies aren't going to be a panacea. And grappling with that, I think, really would improve discourse on the issues. We have lots of prior examples that, like any analogy, are useful until they're not. Any technology needs to be understood in terms of broader societal change, not in isolation under the erroneous assumption that everything else remains fixed. That's really scary. I think it's really exciting to imagine a much fuller future.
Autonomous vehicles hardly live up to their name. The goal of true “driverlessness” was originally hyped in the 1930s but keeps getting kicked further and further into the future as the true complexity of driving comes into ever-sharper and more daunting focus. In 2022, even the most capable robotic cars aren’t self-determining agents; they are linked into swarms and act as the tips of a vast and hidden web of design, programming, legislation, and commercial interest. Infrastructure is more than streets and signs: it includes licensing requirements, road rules, principles of product liability, and many other features that form the landscape to which driverless cars continue to adapt, and which they will increasingly alter.
While most ethical debates about them seem to focus on the so-called “Trolley Problem” of how to teach machines to make decisions that minimize human casualties, there are many other wicked problems to consider:
Is automated driving a technological solution or a policy solution? Should policymakers have the same expectations for automated and conventional driving? How safe must an automated vehicle be for deployment? Should humans or computers have ultimate authority over a given action? Should harm that a human could have prevented somehow outweigh harm that a human caused? Given that a hacker could infect entire fleets, maps, or real-time communication between cars, how much new risk are we willing to take to reduce the more traditional safety hazards with which we are familiar? And, perhaps most surreally: How do you ticket a robot, and who should pay?
Welcome to COMPLEXITY, the official podcast of the Santa Fe Institute. I’m your host, Michael Garfield, and every other week we’ll bring you with us for far-ranging conversations with our worldwide network of rigorous researchers developing new frameworks to explain the deepest mysteries of the universe.
This week on Complexity, we speak to Bryant Walker Smith (Twitter) at the University of South Carolina School of Law and The Center for Internet and Society at Stanford, whose work centers on the ethics of autonomous vehicles. We link up to explore the myriad complexities — technological, regulatory, and sociocultural — surrounding the development and roll-out of new mobility platforms that challenge conventional understandings of the boundaries between person, vehicle, institution, and infrastructure. Buckle up and lean back for a dynamic discussion on the ever-shifting loci of agency, privacy and data protection, the relationship between individuals, communities, and corporations…
If you value our research and communication efforts, please subscribe to Complexity Podcast wherever you prefer to listen, rate and review us at Apple Podcasts, and/or consider making a donation at santafe.edu/give.
Thank you for listening!
Join our Facebook discussion group to meet like minds and talk about each episode.
Podcast theme music by Mitch Mignano.
Follow us on social media:
Twitter • YouTube • Facebook • Instagram • LinkedIn
Discussed:
• Ethics of Artificial Intelligence in Transport
• Who is driving driverless cars?
• From driverless dilemmas to more practical commonsense tests for automated vehicles
• Who’s Responsible When A Self-Driving Car Crashes?
• How Do You Ticket a Driverless Car?
• Controlling Humans and Machines
• Regulation and the Risk of Inaction
• Government Assessment of Innovation Shouldn’t Differ for Tech Companies
• New Technologies and Old Treaties
• It’s Not The Robot’s Fault! Russian and American Perspectives on Responsibility for Robot Harms
Mentioned:
Melanie Mitchell - Artificial Intelligence: A Guide for Thinking Humans + Complexity ep. 21
Kathy Powers & Melanie Moses on The Complexity of Harm, Complexity ep. 75
Cris Moore on Algorithmic Injustice, Complexity ep. 51
Luis Bettencourt on Urban Networks, Complexity ep. 4
Sabine Hauert on Swarming Robots, Complexity ep. 3
Kevin Kelly - Out of Control
Emergent Engineering
Cory Doctorow
Jake Harper (formerly of Zoox)
InterPlanetary Festival
Jorge Luis Borges
W. Brian Arthur - The Nature of Technology + Complexity ep. 13
Ricardo Hausmann
Amazon Prime Video - Upload
Charles Stross - Halting State
Doyne Farmer on Market Malfunction, Complexity ep. 56
Marten Scheffer on Autocorrelation & Collapse, Complexity ep. 33