Terms of Service Podcast

Mary Camacho
Nov 26, 2025 • 59min

Changing Minds and Making Space: Curiosity, Emotion, and Democracy with Dr. Sarah Stein Lubrano

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Dr. Sarah Stein Lubrano—author of Don’t Talk About Politics: How to Change 21st Century Minds—about what it takes to think, connect, and persuade in a time of rapid technological and cultural disruption. Drawing from her background in philosophy, psychology, and political theory, Sarah explores how emotions shape our cognition, why curiosity is a democratic virtue, and how design and technology can either open or close off possibilities for shared understanding. Together, they examine how modern systems—from social media to AI agents—can reduce nuance, flatten emotional range, and reward performance over reflection. This conversation invites us to think more deeply about how we encounter difference—and what it takes to stay open when the world feels overwhelming.

Key Takeaways
Changing minds isn’t about winning arguments. It starts with curiosity, emotional intelligence, and building the cognitive space for reflection.
Democracy requires mental infrastructure. That means not just freedom of speech, but the psychological and social capacity to listen, consider, and evolve.
AI and social platforms risk “flattening” cognition. Speed and frictionless interaction can reduce the emotional and epistemic range of public discourse.
Design can support or inhibit dignity. How we architect systems of learning, debate, or health shapes what kinds of people and conversations they enable.
We don’t need agreement to coexist. But we do need structures that protect space for difference—both in ideas and identities.

Topics Covered / Timestamped Sections
02:10 – Sarah’s intellectual path: from Oxford and Harvard to emotional epistemology and political learning.
04:24 – Why she wrote Don’t Talk About Politics and what “changing minds” really involves.
13:30 – How certain academic and tech cultures mistake argument for insight, and why more discussion doesn’t necessarily lead to understanding or change.
17:50 – The tension between emotional speed and civic depth — what technology amplifies, and what it erodes.
24:06 – Designing for reflection: what it takes to build platforms that support empathy, not outrage.
39:11 – Bringing emotional education into institutions, policymaking, and design.
46:01 – Reflections on where we go from here — cultivating the emotional capacity democracy requires.

Guest Bio and Links
Dr. Sarah Stein Lubrano – Researcher, educator, and author focused on the psychology of political learning and epistemic humility. She holds a doctorate from Oxford and is the author of Don’t Talk About Politics: How to Change 21st Century Minds.
Sarah’s Website
Don’t Talk About Politics – Book Link

Resources Mentioned
The School of Life – Where Sarah developed emotional learning content.
Trauma-informed pedagogy – Educational design that recognizes emotional safety and regulation.
Patient experience research – How listening and context shape clinical outcomes.
AI as cognitive scaffolding – The potential and risks of AI agents in deliberative thinking.

Further Reading / Related Episodes
Episode 6: "Emotional Intelligence in the Age of AI: A Conversation with Marisa Zalabak"
Episode 7: "Who Watches the Watchers? Privacy Law, AI, and Power with William McGeveran"

Call to Action
How do we create room for real thought—and for each other—in an age of constant noise? Dr. Sarah Stein Lubrano offers a thoughtful and hopeful path forward, grounded in emotion, curiosity, and civic design.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Dr. Sarah Stein Lubrano
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Sep 19, 2025 • 55min

You Don’t Own It If You Can’t Fix It: The Fight for the Right to Repair

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Gay Gordon-Byrne, Executive Director of the Digital Right to Repair Coalition, about how manufacturers are rewriting the rules of ownership in the digital age. Drawing on decades of experience in enterprise computing and leasing, Gay shares how restrictive repair policies—hidden behind software locks, proprietary tools, and legal fine print—are quietly eroding our rights as consumers. From absurd real-world examples to legislative progress across the U.S., this conversation reveals what’s at stake when we lose the ability to fix the things we own—and how the Right to Repair movement is pushing back.

Key Takeaways
Repair is a right, not a loophole. Companies have used copyright law, contracts, and DRM to block basic repairs—redefining ownership in the process.
You don’t void your warranty by repairing your own device. Under U.S. law, that’s been protected since the 1970s.
Tractors, phones, and dishwashers now run on software. That means repair is increasingly a legal and digital issue, not just a mechanical one.
Fixing things is a cultural practice. It’s being squeezed out by design, but it offers economic, environmental, and emotional benefits.
State-level legislation is gaining traction. While federal regulators stall, local organizing and public pressure are driving change.

Topics Covered / Timestamped Sections
04:30 – Understanding Right to Repair: from leasing and enterprise sales to grassroots repair advocacy.
08:10 – The slow erosion of repair rights through software and service bundling.
10:50 – What “Right to Repair” actually means—and what it doesn’t.
12:52 – The shift in consumer expectations.
13:50 – The economics of repairability.
15:40 – Legal implications of ownership.
19:00 – Tractors, cars, and consumer electronics: software as the new lock.
24:00 – The global perspective on repair culture.
27:26 – The Magnuson-Moss Warranty Act and the myth of “voided” warranties.
28:00 – Legislative changes and consumer power.
31:25 – Antitrust and tying agreements: the legal dimension of forced service.
35:05 – The role of consumers in advocacy: France’s repairability index and global momentum for consumer rights.
45:38 – Stories from the field: absurd repair scenarios and growing public awareness.

Guest Bio and Links
Gay Gordon-Byrne – Executive Director of the Digital Right to Repair Coalition (Repair.org). With decades of experience in the computer leasing industry, Gay has spent the past decade fighting to restore ownership and repair rights for consumers and independent businesses across the U.S.
Repair.org
Gay Gordon-Byrne on LinkedIn

Resources Mentioned
Magnuson-Moss Warranty Act (FTC.gov) – Protecting U.S. consumers from deceptive warranty practices.
France’s Repairability Index – Labeling systems that inform buyers on repair potential.
iFixit – Repair guides, community support, and advocacy.

Further Reading / Related Episodes
Episode 3: “Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business”

Call to Action
What if you couldn’t fix your own tools, car, or phone—even when it’s a simple repair? Listen to Gay Gordon-Byrne explain why the right to repair is about more than gadgets—it’s about autonomy, sustainability, and democratic accountability.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Gay Gordon-Byrne
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Aug 6, 2025 • 1h 6min

Designing Privacy You Can Feel: Smooth, Supportive, Empowering

Eriol Fox, a designer and product manager at Superbloom, tackles complex challenges like sustainable food systems, while Molly Willson leads design research focused on public interest technology. They discuss how privacy isn't just about compliance but requires emotional and relational understanding. The conversation highlights the Privacy Experience Heuristics, aimed at creating intuitive tools that prioritize marginalized users. They emphasize that supportive design can empower users, bridging the gap between secure systems and real human needs.
Jul 2, 2025 • 46min

Mission, Complexity, and Crisis: Leading in a Rapidly Changing World

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Dr. David Bray, a seasoned leader who has served in senior roles across the U.S. government, tech, and civil society. From bioterrorism response at the CDC to digital transformation efforts in national intelligence, Bray brings a unique perspective on leadership in complexity. They explore how institutions can adapt in times of disruption, why trust is a critical infrastructure, and how positive change agents can build bridges across sectors—even in polarized environments. With a deep systems lens, Bray challenges us to align technological innovation with human values and long-term mission.

Key Takeaways
Mission-driven leadership matters most in times of complexity and crisis. Leaders must be able to hold contradictions, listen deeply, and navigate uncertainty with clarity of purpose.
Trust is infrastructure. Societal systems—especially in democracies—depend on mutual trust, and technology can either degrade or strengthen that foundation.
The U.S. is structured for stalemate, not for rapid transformation. But transformation is still possible—especially in crises—through coalitions and adaptive strategies.
Cross-sector collaboration is essential. Government, civil society, and private enterprise must learn to speak a shared language of values and resilience.
We must redesign metrics for success. Quarterly profits aren’t the only or best measure; we need frameworks that value long-term human and ecological well-being.

Topics Covered / Timestamped Sections
02:52 – The Neutrality of Technology and Its Implications
06:30 – Agency in the Age of AI and Information
09:32 – Policy Evolution in the Face of Rapid Technological Change
13:43 – Building Trust Across Divided Sectors
14:25 – When institutions break down: adaptive leadership and finding windows of possibility
18:10 – Personal Journeys and Motivations in Leadership
22:48 – Advice for Leaders Amidst Polarization
23:20 – Navigating polarized environments with shared values and pluralist frames
28:10 – Decision-Making Frameworks for Leaders
30:15 – Fostering Healthy Tension in Leadership
32:09 – Empowering Others and Agency in Leadership
39:18 – The Ethics of Power and Responsibility

Guest Bio and Links
Dr. David Bray is a strategist and transformation leader working at the intersection of technology, policy, and complex change. Currently Distinguished Chair of the Accelerator at the Stimson Center and Principal at LeadDoAdapt Ventures, he’s led efforts ranging from bioterrorism preparedness to countering disinformation for U.S. Special Operations. A former FCC CIO and Executive Director for bipartisan national commissions, David has advised 12 startups, worked globally on the future of tech and data, and earned honors including the National Intelligence Exceptional Achievement Medal and CIO 100 Awards. He’s also served as Executive-in-Residence at Harvard and was named one of Business Insider’s “24 Americans Changing the World.”
David Bray on LinkedIn
CXO TALK
Stimson Center – Bray’s Profile

Resources Mentioned
People-Centered Internet Coalition – Dr. Bray served as Executive Director for this initiative co-founded by Vint Cerf. It promotes digital infrastructure that empowers people.
Edelman Trust Barometer – He references the 2025 edition of the Edelman Trust Barometer, particularly noting statistics on global grievance and willingness to justify violence.
Rousseau’s Theory of Pluralities – Bray refers to Rousseau’s idea that democracies require civic responsibility from at least 20% of people to function well—a power-law principle still relevant today.

Further Reading / Related Episodes
Episode 11: "Who Watches the Watchers? Privacy Law, AI, and Power"
Episode 8: "The Great Disruption: Building Human-Centered Digital Futures"
Episode 5: "Regenerating Social Fabric & Innovating Governance"
Episode 4: "Dynamics of Digital Spaces: Rethinking Democracy Online"

Call to Action
How do we lead with courage and clarity when everything is changing? This conversation with Dr. David Bray offers a roadmap for leadership in uncertain times—grounded in systems thinking, public service, and a deep respect for human agency.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Dr. David Bray
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Jun 3, 2025 • 54min

Who Watches the Watchers? Privacy Law, AI, and Power

Episode Summary
In this episode of Terms of Service, Mary Camacho sits down with William McGeveran—Dean of the University of Minnesota Law School and author of a leading privacy law casebook—to explore the evolving landscape of data protection, surveillance, and individual rights. With deep insights into both U.S. and European frameworks, McGeveran breaks down where current laws fall short, why consent alone doesn’t protect privacy, and how legal systems can (and should) evolve to meet the challenges posed by AI, big tech, and systemic data collection.

Key Takeaways
Most of the world—including the EU—follows a “data protection” model that assumes personal data must be protected on behalf of individuals. This gives people broad rights to know, limit, and contest how their data is collected and used. In contrast, the U.S. lacks a unified data protection framework. Instead, companies are largely free to collect and use personal data unless a specific law prohibits it—prioritizing institutional autonomy over individual rights.
Consent is an inadequate foundation for privacy protection. Relying on individuals to understand and agree to complex data practices shifts responsibility away from those in power and undermines meaningful control.
Legal design matters. Structural choices—like creating intentional silos for data—can strengthen protections rather than limit innovation.
Data breaches are no longer unusual—they’re inevitable. But legal standards still play a critical role in enforcing accountability and incentivizing better security practices.
Younger generations see privacy not as a personal failure but as a systemic issue. And they're looking for collective, enforceable solutions—not just more terms of service.

Topics Covered / Timestamped Sections
01:39 – From Capitol Hill to privacy casebooks: McGeveran’s path into data law.
02:48 – The wild west of the early internet and Lessig’s “Code”.
04:32 – Silos in surveillance and the importance of intentional data separation.
08:00 – Privacy law vs. data protection law: U.S. and EU’s contrasting assumptions.
11:04 – Why California's privacy laws are stronger—but still fundamentally U.S. in approach.
14:11 – Why it’s not “all over”: What legal protections still matter.
17:33 – Aggregation harms and why individuals can’t foresee long-term data consequences.
24:03 – How digital-native students view privacy today—and what gives them hope.
27:00 – Why privacy policies can’t be read, and how AI can help interpret them.
35:30 – GDPR’s global ripple effects and Max Schrems' legal victories.
40:00 – Casebooks, case studies, and how law students are shaping future data policy.
41:45 – Data breaches, legal gaps, and the human side of cybersecurity.
50:35 – AI is both revolutionary and familiar—and requires caution, not panic.

Guest Bio and Links
William McGeveran – William McGeveran was named the twelfth dean of the University of Minnesota Law School in 2024. He originally joined the faculty of Minnesota Law in 2006 and previously served as the interim dean and the associate dean for academic affairs. Dean McGeveran’s research focuses on information law, with particular focus on data privacy and trademark law. His scholarship in trademark law considers the balance between prevention of harmful consumer confusion and protection of valuable speech including parody, commentary, and comparative advertising. McGeveran is also the sole author of a casebook, Privacy and Data Protection Law, used by instructors at dozens of U.S. law schools.

Dean McGeveran has been a resident fellow at the University of Minnesota Institute of Advanced Study, a visiting professor at University College Dublin School of Law, and an instructor in the Notre Dame Law School London Programme. He frequently speaks to the media, submits amicus briefs, works with policymakers, and teaches continuing legal education courses in his specialty areas. Dean McGeveran earned a J.D., magna cum laude, from New York University and a B.A., magna cum laude, in political science from Carleton College. While an undergraduate he spent one year as a nonmatriculated visiting student at Worcester College, Oxford. Prior to joining Minnesota Law, he was a resident fellow at the Berkman Center for Internet and Society at Harvard Law School. He previously clerked for Judge Sandra Lynch on the United States Court of Appeals for the First Circuit and practiced as an intellectual property litigator at Foley Hoag LLP in Boston. Before law school, Dean McGeveran worked in national politics for seven years, primarily as a senior legislative aide to then-Rep. Charles Schumer.
Follow William McGeveran on LinkedIn
Faculty Profile – University of Minnesota Law

Resources Mentioned
GDPR (General Data Protection Regulation) – Europe’s landmark data privacy law.
California Consumer Privacy Act (CCPA) – A leading example of enhanced U.S. state-level regulation.
Max Schrems and NOYB – Strategic litigation challenging EU-U.S. data sharing agreements.
Carnegie Mellon – The Cost of Reading Privacy Policies Study – Analysis of time required to read all privacy policies.
Privacy and Data Protection Law (University Casebook Series)

Further Reading / Related Episodes
Episode 1: "From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World"
Episode 3: “Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business”

Call to Action
Privacy isn't dead—but it is under pressure. If you're tired of shrugging at every “accept cookies” pop-up, this episode will help you rethink what’s possible through law, accountability, and systemic reform. Listen to Dean William McGeveran on how to reclaim digital dignity.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: William McGeveran
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
May 13, 2025 • 47min

When Alexa Says Sorry: What We Risk When AI Sounds Human

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Marisa Zalabak, an AI ethicist and psychologist who explores how our relationships with artificial intelligence impact emotional intelligence, learning, communication, and mental health. With a rich background in education, social justice, psychology, and theater arts, Marisa offers deep insights into the emotional and ethical implications of anthropomorphizing AI, the risks of synthetic empathy, and the importance of slowing down to ask better questions. Together, they unpack how emotional and cognitive habits are being shaped by our daily interactions with machines—and what it means for our shared future.

Key Takeaways
Anthropomorphizing AI—treating machines as if they are human—is natural but dangerous, especially when synthetic empathy (like chatbots saying “I’m sorry”) reinforces emotional trust in non-human systems.
Marisa emphasizes the importance of asking better questions about the tools we use, why we use them, and what long-term effects they may have.
Research shows people increasingly treat AI systems as coworkers or even confidants, which can affect trust, mental health, and social connection.
Systems like Alexa and humanoid AIs often reinforce gender bias, particularly when defaulted to women’s voices.
Encouraging digital literacy, slow learning, and psychological grounding helps individuals—and especially children—build healthy habits with technology.

Topics Covered / Timestamped Sections
01:55 – Marisa’s unconventional journey from performing arts to educational psychology to AI ethics.
05:48 – Discovering AI and contributing to one of the first IEEE standards on human well-being in AI design.
08:27 – First deep AI encounter: conversing with the humanoid BINA48 and the psychology of human-machine interaction.
13:22 – Synthetic empathy and the blurry boundaries of trust in conversational AI.
18:10 – How politeness and pronouns affect human habits and communication patterns.
21:45 – Designing meaningful research on emotional and psychological effects of AI.
23:14 – Children and AI: the real impacts of early and normalized interaction with synthetic personalities.
33:31 – Gendered AI voice assistants and their unintended social consequences.
37:40 – Why education should be an invitation to inquiry, not a race toward certainty.
42:05 – Breaking down complexity through “Aunt Dorothy” explanations and slow, focused inquiry.

Guest Bio and Links
Marisa Zalabak is an AI ethicist, psychologist, and thought leader specializing in responsible AI, education, sustainability, and human well-being. Her talks emphasize adaptive leadership, ethical innovation, and climate action through sustainable practices. A two-time TEDx and international keynote speaker, Marisa has contributed to global forums such as Stratcom, the UN Summit of the Future, and AI House in Davos during the World Economic Forum. As Co-Founder of GADES (Global Alliance for Digital Education and Sustainability), Resident Fellow with The Digital Economist Center of Excellence, and faculty member at the Trocadéro Forum Institute, Marisa champions education aligning responsible technology with regenerative design for human and planetary flourishing. Chairing IEEE's AI Ethics Education and Planet Positive 2030 initiatives, Marisa has co-authored ethical AI standards for human well-being with AI technologies. Collaborating across sectors with organizations like Microsoft, SAP, and Stanford University, Marisa addresses emerging issues in AI for a sustainable future.
Marisa’s Website
Marisa’s LinkedIn
Marisa’s Instagram
Marisa’s Facebook
Marisa’s TEDx Talk
IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems

Resources Mentioned
BINA48 – One of the first advanced humanoids trained for human interaction and space exploration.
Synthetic Emotion in AI – IEEE working group focused on standards for AI that emulates human emotions.
Digital Assistants & Bias – Ongoing research into how voice assistants perpetuate societal norms and stereotypes.

Further Reading / Related Episodes
Episode 5: “Regenerating Social Fabric & Innovating Governance”

Call to Action
How are your emotional habits being shaped by the tools you use every day? Marisa Zalabak invites us to slow down, ask better questions, and reimagine AI as a tool for well-being—not just productivity. Listen now and rethink the terms of service we accept in our digital lives.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Marisa Zalabak
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Apr 22, 2025 • 41min

Beyond the Dataset: Building Human-Centered Research

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Elizabeth Eagen, Deputy Director of the Citizens and Technology (CAT) Lab at Cornell University. Elizabeth shares how her work in human rights led her to explore the impact of emerging technologies on civil society, and how citizen science can be used to shape better digital spaces. Together, they discuss algorithmic bias, data ownership, community-driven research, and how regulation often lags behind both the harm and the science. With sharp insights and powerful stories, Elizabeth unpacks the complex dynamics of platform accountability, participatory research, and digital equity.

Key Takeaways
The CAT Lab supports communities in generating their own research questions and data to investigate online life, shifting power dynamics away from institutions.
Effective regulation of technology (e.g. AI hiring algorithms) requires better alignment between legal timelines and scientific inquiry.
Community-led research can influence both platform behavior and public policy—while honoring participant agency.
The human cost of data loss or mismanagement is real—whether in human rights documentation or everyday digital life.
Human values like mutual aid and accountability must be embedded into technological systems and policy frameworks to ensure equity and resilience.

Topics Covered / Timestamped Sections
00:48 – Introduction to Elizabeth Eagen and the CAT Lab’s mission: “citizen science for the internet.”
02:53 – Redefining research through community participation and shifting who owns the data.
05:39 – Institutional barriers to community-led science, including IRB processes and timeline mismatches.
08:50 – Real-world issues CAT Lab explores: algorithmic hiring bias, content moderation, and digital inclusion.
11:20 – Community impact: research results go first to participants, enabling operational improvements.
12:52 – Case study: Local Law 144 and AI auditing for hiring discrimination in NYC.
21:00 – Regulation is often slower than technological impact—but still faster than science.
23:40 – Elizabeth’s journey from Human Rights Watch to building the Emerging Tech portfolio at Open Society Foundations.
25:00 – The responsibility to protect data as people—not just points.
29:48 – Tradeoffs in data ownership, portability, and government involvement.
35:16 – Digital identity and the folklore of “messifying” databases for privacy and security.
37:08 – The risks of corporate donations of tech tools to civil society—and the ethics of mutual dependency.

Guest Bio and Links
Elizabeth Eagen is Deputy Director of the Citizens and Technology Lab at Cornell University, which works with communities to study the effects of technology on society and test ideas for changing digital spaces to better serve the public interest, so that digital power is guided by evidence and accountable to the public. She was also a 2022-23 Practitioner Fellow at Stanford University’s Digital Civil Society Lab. Previously, she established and led the Emerging Technology portfolio at the Open Society Foundations’ Information Program. This initiative funded the use of emergent technologies in evidence and advocacy, building the role of knowledge management, and the use of data visualization tools, data science, statistics, and new media tactics by civil society and policymakers. She founded the Human Rights Data Initiative, and led the Urbanization Working Group, which explored urbanization and open society through programming, research, and debate. She holds an MA/MPP in Public Policy and Russian and Eastern European Studies from the University of Michigan, and a BA from Macalester College.
Citizens and Technology Lab
Elizabeth on LinkedIn
Elizabeth on Bsky

Resources Mentioned
Local Law 144 – NYC legislation requiring audits for AI-based hiring tools.
IRB (Institutional Review Board) – Systems for ethical oversight of academic research.
Data and Society Research Institute – Partner in algorithmic accountability research.
Open Society Foundations – Supporting rights-based civil society through tech.

Further Reading / Related Episodes
Episode 3: "Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business"
Episode 5: "Regenerating Social Fabric & Innovating Governance"
Episode 7: "Breaking the Binary: Rethinking Law, Power, and Possibility"

Call to Action
Want to know what it means to shift power in research, regulation, and digital life? Don’t miss this conversation with Elizabeth Eagen, and learn how citizen science can create more inclusive and accountable tech systems.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Elizabeth Eagen
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Apr 1, 2025 • 41min

The Great Disruption: Building Human-Centered Digital Futures

Episode Summary
In this episode of Terms of Service, host Mary Camacho sits down with Mei Lin Fung, co-founder of the People-Centered Internet and a pioneer in customer relationship management (CRM). Mei Lin shares her personal journey from Singapore to MIT, her role in shaping the CRM industry, and her commitment to ensuring that technology remains a tool for human flourishing. Together, they discuss the current “Great Disruption” brought on by digital transformation, the importance of community-driven technology, and why feedback and inclusion are key to resilient societies.

Key Takeaways
Human collaboration is the foundation of societal survival and progress, magnified now by digital technologies.
The “Great Disruption” refers to the rapid changes in how humans interact due to digital transformation, with both challenges and opportunities.
Mei Lin’s career, including early CRM development and co-founding People-Centered Internet, emphasizes technology’s potential to empower rather than exploit.
Singapore’s example shows how numerate, inclusive governance and long-term investment can enable societies to thrive.
Effective leadership today requires participation, feedback systems, and learning from all voices—not just the powerful few.

Topics Covered / Timestamped Sections
01:30 – Introduction to Mei Lin Fung’s background and lifelong mission.
03:30 – Growing up in Singapore and the impact of investing in education and infrastructure.
08:56 – Early CRM days: shaping the industry through community and engagement.
12:30 – Why she answers hundreds of questions on Quora: listening to people as a leadership practice.
13:15 – Defining the “Great Disruption” and how human collaboration is evolving.
18:30 – Lessons from Singapore’s digital governance and health policy success.
21:00 – Founding the People-Centered Internet: technology as a tool for inclusion and equity.
31:20 – Building networks of communities to experiment and grow together.
36:00 – Policy leadership and shaping digital equity at the G7 level.
38:15 – Mei Lin’s magic wand wish: making participation and feedback essential to future societies.

Guest Bio and Links
Mei Lin Fung – Co-founder, with Vint Cerf, of the People-Centered Internet in 2015, Mei Lin is a leading voice in digital public infrastructure for opportunity and community resilience. An early pioneer of Customer Relationship Management and ERP systems, she worked at Oracle with Marc Benioff, now CEO of Salesforce, and with Tom Siebel, who later sold Siebel Systems, the first CRM company, to Oracle. She served as socio-technical lead for the U.S. Federal Health Futures and was honored to be a business partner of Internet pioneer Douglas Engelbart. Mei Lin organized the 2024 UN Science Summit Digital Governance Series, which included the UN AI Report and a rare plenary by Turing Award winner Dr. Alan Kay. For the 50th anniversary of the Internet, she served as catalyst, initiator, and speaker for celebrations in Palo Alto, California (with IEEE), in London (with the Royal Society), and in Brussels (with I50Y, the youth celebration). Today Mei Lin is Chair of the Technical Committee on Sustainability for the IEEE Society on the Social Implications of Technology and serves as the liaison to the IEEE Industry Engagement Committee for IEEE-USA.
People-Centered Internet
Follow Mei Lin on Quora

Resources Mentioned
Douglas Engelbart’s concept of Networked Improvement Communities
Nobel Prize-winning research on social institutions and economic development

Further Reading / Related Episodes
Episode 4: “Dynamics of Digital Spaces: Rethinking Democracy Online”
Episode 5: “Regenerating Social Fabric & Innovating Governance”

Call to Action
Curious about how to navigate the Great Disruption and build people-centered digital futures? Listen to Mei Lin Fung’s inspiring insights and learn how inclusion, feedback, and collaboration can reshape our digital society.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Mei Lin Fung
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Mar 11, 2025 • 55min

Breaking the Binary: Rethinking Law, Power, and Possibility

Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Helen Slottje, an award-winning attorney and co-founder of the Regenerative Law Institute. Helen shares her journey from corporate law to leading a groundbreaking legal movement against fracking, which earned her the Goldman Environmental Prize. They discuss the deeper patterns of power and control in legal systems, how governance structures enforce extractive models, and the need for transformative legal frameworks that align with natural systems. Helen’s work challenges conventional legal thinking, moving beyond fixing broken systems to designing entirely new paradigms for governance and community resilience.

Key Takeaways
Beyond Extractive Systems: Legal and governance structures often reinforce power imbalances, prioritizing control over coherence.
The Fracking Fight as a Model for Change: Helen’s legal strategy helped shift an "inevitable" industry into an impossible one, leading to New York’s fracking ban.
Predator-Prey Dynamics in Law: Legal systems replicate extraction-based power structures, often reinforcing historical violence rather than challenging it.
Regenerative Law vs. Sustainability: The goal is not just to sustain broken systems but to design new legal structures that support thriving, decentralized communities.
Reframing Ownership and Control: From nonprofit governance to alternative currencies, emerging models challenge the idea that control must always be centralized.

Topics Covered / Timestamped Sections
01:19 - Introduction to Helen Slottje and her shift from corporate law to environmental law.
03:19 - The fight against fracking in New York: How legal strategy led to a statewide ban.
07:42 - The Power of Patterns & Systemic Control: How binary thinking limits our ability to create real change.
10:07 - The Role of Narratives in Governance & Social Change: The importance of redefining governance beyond the current nation-state model.
13:45 - Challenging the Predator-Prey Dynamic: How society reinforces power imbalances and why we must shift toward mutual thriving.
22:38 - Hope, Community Building, & Mobilizing Change: The lessons from the fracking fight on organizing and redefining what’s possible.
33:09 - The Evolutionary Leap: Shifting Consciousness: Why we need new frameworks to break conventional thinking and evolve.
35:50 - The Process of Creating Transformational Change: The importance of starting with small groups before scaling change.
40:40 - The Future of Law & Governance: Quantum Thinking: Applying nature, physics, and alternative governance models to drive systemic change.
42:33 - Helen’s "magic wand" wish: Creating language and legal tools to make transformative governance accessible.

Guest Bio and Links
Helen Slottje is a Harvard-educated lawyer and a recipient of the Goldman Environmental Prize (‘Green Nobel’). As the founder of the Regenerative Law Institute, Helen helps leaders navigate high-pressure challenges with coherence and emergent design rather than brute force. At the core of her work is the conviction that real solutions emerge by leveraging pressure as a catalyst, embracing coherence with nature’s patterns, and making quantum leaps beyond the limits of conventional thinking.
Regenerative Law Institute
Goldman Environmental Prize
LinkedIn
IG

Resources Mentioned
Fracking and Local Bans - The legal strategy that led to New York’s statewide fracking prohibition.
Alternative Governance Models - Expanding democratic decision-making structures.
Rene Girard’s Mimetic Theory - Understanding power through hidden cycles of violence and control.

Further Reading / Related Episodes
Episode 1: "From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World"
Episode 3: "Dynamics of Digital Spaces: Rethinking Democracy Online"
Episode 5: "Regenerating Social Fabric & Innovating Governance"

Call to Action
Can legal systems evolve beyond extractive models? Listen to Helen Slottje’s transformative insights on law, governance, and power—and explore how regenerative law might shape the future.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Helen Slottje
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
Feb 18, 2025 • 43min

Constitutional Rights, Tech Governance, and Power Structures

Episode Summary
In this episode of Terms of Service, host Mary Camacho welcomes Nora Mbagathi, Executive Director of the Katiba Institute, for a deep dive into constitutional rights, technology governance, and power dynamics in Kenya and beyond. They explore how constitutions function as the "terms of service" of a society, shaping citizen rights and responsibilities. Nora highlights the risks posed by centralized digital identity systems, the role of transnational corporations in shaping the digital landscape, and the importance of grassroots activism in defending constitutional protections.

Key Takeaways
Constitutions as Societal Contracts: Just like digital terms of service, constitutions define the relationship between citizens and power structures.
Kenya’s 2010 Constitution: A strong rights-based document that emerged from political unrest, yet faces implementation challenges due to literacy gaps and power imbalances.
Tech Governance in the Global South: Digital ID systems, centralized data collection, and lack of local tech solutions create unique vulnerabilities.
Extractive Tech Models: Nairobi is often called "the Silicon Valley of Africa," but many systems prioritize corporate interests over community empowerment.
Listening as a Solution: Instead of imposing external solutions, policymakers and tech companies need to engage meaningfully with affected communities.

Topics Covered / Timestamped Sections
00:49 - Introduction to Nora Mbagathi and her journey from human rights law to constitutional implementation.
05:33 - The role of constitutions in protecting citizens and the Katiba Institute’s mission.
07:46 - Kenya’s 2010 Constitution: A turning point in governance after election violence.
12:31 - Constitutional literacy: Why some citizens benefit while others remain unaware of their rights.
16:23 - The intersection of constitutional rights and technology governance.
20:25 - The role of centralized digital ID systems and their risks.
25:14 - The myth of Nairobi as the "Silicon Valley of Africa"—who really benefits?
30:53 - The dangers of centralization vs. the potential of decentralized identity solutions.
36:04 - The importance of designing technology with privacy, transparency, and equality at its core.
40:25 - Building international coalitions to challenge corporate and governmental overreach.

Guest Bio and Links
Nora Mbagathi is the Executive Director at Katiba Institute in Kenya. She is a qualified lawyer in multiple jurisdictions and has worked in human rights campaigning and strategic litigation for over ten years. Nora has participated in cases relating to digital ID, platform accountability, criminal justice, and the right to nationality in Kenya. Prior to joining Katiba Institute, Nora was a senior lawyer with the Open Society Justice Initiative, based in London.
Nora Mbagathi on X
Katiba Institute Website
Katiba Institute on X

Resources Mentioned
Kenya’s 2010 Constitution - A landmark rights-based document.
GovZero - A movement in Taiwan promoting citizen-driven government accountability.
Digital ID Systems - Centralized identity databases and their risks in Kenya.

Further Reading / Related Episodes
Episode 4: "Dynamics of Digital Spaces: Rethinking Democracy Online"
Episode 5: "Regenerating Social Fabric & Innovating Governance"

Call to Action
How can we ensure technology serves citizens rather than undermining their rights? Listen to this thought-provoking conversation with Nora Mbagathi and join the discussion on tech governance, rights, and digital power structures.
🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Nora Mbagathi
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent and Sonor Lab
Co-Producers: Nicole Klau Ibarra & Mary Camacho
