Is There Any Leverage to the Decision Makers?
Tristan: We all walk around with some kind of shadow, some way that our ego profits when we put someone else down or when we feel better than someone else. And to recognize that, to become aware of that, is to go through a form of grief. But the flip side of it is that you get to love yourself more. So to return to the question of the moral or philosophical underpinnings, it's the who we will be. We want technology that has that kind of care, and I guess that's my deepest why.
Welcome to our first-ever Ask Us Anything episode. Recently we put out a call for questions… and, wow, did you come through! We got more than 100 responses from listeners to this podcast from all over the world. It was really fun going through them all, and really difficult to choose which ones to answer here. But we heard you, and we’ll carry your amazing suggestions and ideas forward with us in 2023.
When we created Your Undivided Attention, the goal was to explore the incredible power technology has over our lives, and how we can use it to catalyze a humane future. Three years and a global pandemic later, we’re more committed than ever to helping meet the moment with crucial conversations about humane technology - even as the tech landscape constantly evolves and world events bring more urgency to the need for technology that unites us, invests in democratic values, and enhances our well-being.
We’ve learned from our guests alongside all of you. Sixty-one episodes later, the podcast has over 16 million unique downloads! That’s a lot of people who care about the promise of humane technology and are working to build a more humane version of technology in their own lives, their families’ lives, and their communities and society at large. We’re a movement!
Thank you to everyone who submitted questions and comments for us. We loved doing this, and we’re looking forward to doing it again!
Correction:
When discussing DeepMind’s recent paper, Aza said the premise was four people entering their views and opinions, with AI finding the commonality between all of those viewpoints. It was actually three people entering their views and opinions.
RECOMMENDED MEDIA
CHT’s Recommended Reading List:
Foundations of Humane Technology
Our free, self-paced online course for professionals shaping tomorrow’s technology
The Age of Surveillance Capitalism by Shoshana Zuboff
Foundational reading on the attention economy
Algorithms of Oppression by Safiya Umoja Noble
Seminal work on how algorithms in search engines replicate and reinforce bias online and offline
Amusing Ourselves to Death by Neil Postman
Written in 1985, Postman’s work shockingly predicts our current media environment and its effects
The Attention Merchants by Tim Wu
A history of how advertisers capture our attention
Doughnut Economics by Kate Raworth
A compass for how to upgrade our economic models to be more regenerative and distributive
Thinking in Systems by Donella Meadows
This excellent primer shows us how to develop systems thinking skills
What Money Can’t Buy: The Moral Limits of Markets by Michael Sandel
Sandel explores how we can prevent market values from reaching into spheres of life where they don’t belong
Essay: Disbelieving Atrocities by Arthur Koestler
Originally published January 9, 1944 in The New York Times
Humane Technology reading list
A comprehensive list for those who want to geek out
ORGANIZATIONS TO EXPLORE
Integrity Institute
Integrity Institute advances the theory and practice of protecting the social internet, powered by their community of integrity professionals
All Tech Is Human job board
All Tech Is Human curates roles focused on reducing the harms of technology, diversifying the tech pipeline, and ensuring that technology is aligned with the public interest
Denizen brings together leaders across disciplines to accelerate systemic change
New_Public is a place for thinkers, builders, designers, and technologists to meet and share inspiration
Psychology of Technology Institute
PTI is a non-profit network of behavioral scientists, technology designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologies
RadicalxChange (RxC) is a social movement for next-generation political economies
The School for Social Design
The School for Social Design offers three courses on articulating what’s meaningful for different people and how to design for it at smaller and larger scales
TechCongress is a technology policy fellowship on Capitol Hill
RECOMMENDED YUA EPISODES
An Alternative to Silicon Valley Unicorns
https://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicorns
A Problem Well-Stated is Half-Solved
https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
Digital Democracy is Within Reach
https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_