There is a sense in which we're being used as experimental subjects. There's no way you can give informed consent if you don't know what experiments are going to be performed on you ahead of time. And the other thing is that when we're being manipulated, there isn't a clear sense of how it's impacting us. Even if you can't get people's explicit consent for a particular change you want to make, somebody else needs to be vetting it before using people in this way. We need a proportionally scaled amount of responsibility: sort of, in your terms, up-front commitments to understanding, and demonstrating understanding of, the complexity and scale of impact.
How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul.
Paul uses the prospect of becoming a vampire to illustrate the conundrum: suppose Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences should you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires?
In this episode with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.