In Reality

Latest episodes

Jul 20, 2022 • 39min

The elite’s blind spots and the illusion of truth with Gillian Tett

In this episode of In Reality, host Eric Schurenberg sits down with Gillian Tett, Chair of the Editorial Board and Editor-at-Large for the Financial Times, US. Gillian is also trained as an anthropologist, which gives her a unique perspective on the tribal divides within American society. If you believe that your grasp of reality is the only legitimate one, prepare to be challenged.

Anthropologists, Gillian explains, view sub-cultures as self-contained. The belief in conspiracies may seem incomprehensible to most In Reality listeners, but it makes sense to groups who feel abandoned and belittled by elites. All of us have trouble seeing our biases as anything other than ground truths. For example, elites in media, government, entertainment, academe, and so on, regard command of language as an indisputable sign of seriousness and status. For other tribes in America, articulateness is irrelevant. What matters instead is loyal adherence to the tribe’s fears and grievances. For members of those groups, the facts presented by institutions like the media and legal system are suspect on their face. The only information that is really trustworthy is what’s conveyed by other members of the tribe.

Gillian and Eric take the anthropologist’s view of a wide range of contemporary news events: why the best way to understand Trump supporters is to attend professional wrestling; what Trump’s use of the neologism “bigly” reveals about professional media’s blind spots; and why whistleblowers are disproportionately women. Listen, and prepare to confront your own blind spots.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

Jun 30, 2022 • 40min

On the front lines of the disinformation fight with Áine Kerr

In the fight against disinformation, the last line of defense between audiences and malicious falsehoods is the “trust and safety” teams, also known as content moderators. Some of them are employed by social media platforms like Facebook and Spotify, but increasingly the platforms outsource the work of identifying and countering dangerous lies to fact-checking organizations like the fast-growing Irish company, Kinzen.

In this episode of In Reality, host Eric Schurenberg sits down with Áine Kerr, co-founder and COO of Kinzen. Áine is a serial risk-taker with extensive experience at the intersection of journalism and technology, most recently as the global head of journalism partnerships at Facebook. Kinzen helps platforms, policymakers, and other defenders “get ahead and stay ahead” of false and hateful content on video, podcast, and text platforms. The company uses artificial intelligence to sniff out objectionable content and then, when needed, invites human readers to judge for context and nuance. What Kinzen calls “human in the loop” technology minimizes errors while still allowing for fact-checking at social media scale.

In the recent Brazilian elections, for example, Áine explains that disinformation actors came to realize that phrases like “election fraud” and “rigged election” were alerting content moderators who could take down their false claims. So the actors began substituting seemingly innocuous phrases like “we are campaigning for clean elections.” Kinzen’s human moderators spotted the changes and helped authorities intercept the false messages.

Áine and Eric also dive into the many reasons that someone may participate in sharing harmful content online, ranging from sheer amoral greed to ideological commitment. She ends with a warning that the spreaders of disinformation currently have the upper hand. It is always easier to spread lies than to counteract them. The allies of truth–researchers, social media platforms, entrepreneurs, and fact-checking organizations like hers–need to get better at coordinating their efforts to fight back, or democracy will remain at existential risk around the world.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

Jun 14, 2022 • 57min

How to Build True Public Spaces Online with Eli Pariser

In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Eli Pariser, co-director of New Public and former president of MoveOn.org. Pariser has long championed healthy communities online, and he now advocates for reimagining the Internet as a trustworthy public space analogous to local parks or public libraries.

It’s an appealing analogy. Pariser notes that public spaces are critical for holding democratic societies together: spaces where people come together and work through conflict, raise concerns and demands, and share experiences. A key element of physical public spaces is that they are local in scale. Some digital spaces share some of that “local” flavor. Reddit, for example, fosters local discussions and debates through multiple domains and communities that have their own moderation. That stands in contrast to platforms like Facebook and Twitter, where there is no visible moderation and information is global in nature, making it hard to develop a sense of community.

Moderation alone isn’t quite enough, though. Another key element of healthy public spaces is self-governance, which depends on collaboration. Wikipedia is an example of a digital space that offers contributors power checked by governing principles and steered by collaborative norms.

Digital “parks” and “libraries” are a far cry from the barely controlled chaos that has characterized digital spaces to date. But as our civic lives increasingly move online, the need for them is clear.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

May 31, 2022 • 53min

Truth and Trade-offs amid a Polarized Pandemic with Dr Leana Wen

The Covid pandemic has created the kind of situation in which misinformation thrives. Public health authorities met surging demand for knowledge about how to protect against Covid with inconsistent or inadequate guidance. Misinformation rushed in to fill the gap.

In this episode of In Reality, Dr Leana Wen, emergency physician and public health professor at George Washington University, joins co-hosts Eric Schurenberg and Joan Donovan to discuss how health misinformation spreads and how public health institutions can regain trust.

Dr Wen explains that much of the mistrust of public health agencies during the pandemic arose because the agencies continually changed guidance. This is a normal, even desirable reaction to new research and evolving risk assessments, but many in the public regarded the shifting guidance as a sign that authorities didn’t really know the truth or had a hidden agenda.

Dr Wen distributes blame for health misinformation liberally. She explains how the major news media covering the baby formula shortage encouraged frightened parents to hoard formula, depleting stocks of the product in stores and worsening the situation. As trust in public health authorities shrinks, Dr Wen explains, people are more likely to absorb information from sources like their neighbors rather than from qualified sources such as pediatricians and public health organizations. This is understandable–but potentially dangerous.

To combat mistrust, Dr Wen says, public health authorities must not be afraid to give nuanced advice. Authorities should be willing to admit that they don’t always have the answers and that guidance will inevitably change as new information comes to light. It’s also essential to meet people “where they are”–meaning that authorities should default to the information platforms (including social media) that audiences consume and to local (as opposed to national) authorities that they are more likely to trust.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

May 18, 2022 • 1h 3min

How We Know What’s True with Jonathan Rauch

Truth and the institutions that defend it are under attack. What can the rest of us do? In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Jonathan Rauch, a Senior Fellow at the Brookings Institution and author of ‘The Constitution of Knowledge: A Defense of Truth’. In this captivating discussion, Jonathan unpacks what is best described as a crisis of knowledge in Western culture, the result of a multi-front challenge to citizens’ ability to distinguish fact from fiction and elevate truth above falsehood.

What has always bound Western societies together in a shared sense of reality, Rauch explains, is a commitment not to a set of pre-ordained beliefs but to a process of constantly testing claims against objective experience to determine which claims are true. Rauch calls this process ‘The Constitution of Knowledge’ because, like the US Constitution, it relies on a system of checks and balances to prevent the truth from being defined only by those in power. Up to this point, we have implicitly trusted institutions like science, medicine, government and media–what Rauch calls “the reality-based community”–to safeguard the process.

Social media, however, has short-circuited all of this. Social media makes no attempt to test the claims that appear in its content, and instead revels in broadcasting claims to millions online at Internet speed, without regard to whether they are true or not. Social media exalts popularity over expertise, speed over reflection and division over consensus. It’s no surprise that trust in the reality-based community is crumbling, and many citizens are no longer sure where to turn for truth.

By the interview’s end, though, Rauch expresses cautious optimism. At the moment, fake news, misinformation and extremist propaganda (from both sides) seem to have the upper hand. But truth has a singular advantage: It describes the world as it really is. It works–while falsehoods inevitably collide with reality and fail. The reality-based community–and reasonable citizens outside those institutions–have their work cut out for them, Rauch says. But in the end, they will win.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

May 3, 2022 • 41min

A new definition of making America great again with Kathleen Belew

In this episode of In Reality, Kathleen Belew, University of Chicago historian and author of ‘Bring The War Home: The White Power Movement and Paramilitary America’, joins co-hosts Eric Schurenberg and Joan Donovan. In a fascinating conversation, Belew outlines how social media and the tactics of disinformation energized the white power movement that reached a watershed moment in the violent attack on the U.S. Capitol on January 6th.

Belew traces the current white supremacist surge to a movement that took root among veterans returning from the Vietnam War. The movement is made up of a number of loosely affiliated groups whose ideology and goals have changed little over the past 45 years. Indeed, the storming of the U.S. Capitol eerily recalled a similar event in the 1978 neo-Nazi handbook ‘The Turner Diaries’. Belew explains how these groups opportunistically latched on to the economic and racial resentments that brought Donald Trump to power and then used social media to communicate, organize and radicalize members.

Belew explains that white power movements have no intention of “making America great again” and instead agitate for the overthrow of democracy. To really make America great, she concludes, Americans need a better understanding of our government and our imperfect history. We can then address questions of what has made America great in the past and what remains to be done to make it great again.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

Apr 19, 2022 • 1h 2min

From optimism to doubt (and back) at Facebook with Joaquin Quiñonero Candela

In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan sit down with Joaquin Quiñonero Candela, technical fellow for AI at LinkedIn and a former distinguished technical lead for responsible AI at Facebook. Before this, Joaquin led the Applied Machine Learning team at Facebook, creating the algorithms that made Facebook advertising so effective. It’s safe to say Facebook would not be the profit behemoth it is today without the innovations he introduced.

2011 saw the broad public adoption of social media and the democratization of public voice that it enabled. The benefits for democracy were immediately apparent in movements like the Arab Spring, which held special meaning for Joaquin as a native of Morocco. After the 2016 election in the US and the 2018 Cambridge Analytica data scandal, however, Joaquin realized that the tools he helped create could be misused and began to devote himself to AI ethics and responsible use of the technology at Facebook, a mission that he carries on at LinkedIn.

You could say that the arc of Joaquin’s career parallels that of society’s evolving relationship to social media. The optimism that defined social media’s early adoption has been replaced by an alarmed awareness that its obvious benefits come with consequences–a polluted information stream, political polarization and erosion of the institutions needed to uphold democracy. Joaquin is now deeply involved in leading efforts to minimize the harms that social media can unleash. “We’ve come to realize that anything open will be exploited,” sums up In Reality co-host Joan Donovan, “and it is time for us to take the measure of that.”

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com

Apr 5, 2022 • 58min

The reboot that can save social media with Rob Reich

In the first episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Rob Reich, Professor of Political Science and Philosophy at Stanford University and author of ‘System Error: Where Big Tech Went Wrong and How We Can Reboot’.

At its birth, social media promised to be a tool to promote democracy. Instead, it has become the accelerant to a firestorm of lies and, far from democratizing power, has concentrated it among a few social media giants. “Mark Zuckerberg is now the unelected mayor of three billion people,” says Rob Reich. “That is unacceptable.”

How did things go so wrong? Reich blames what he calls the “engineering mindset” of social media’s inventors and the financial ecosystem that supports them. Along with co-authors Mehran Sahami and Jeremy M. Weinstein, Reich teaches a class on technology and ethics at Stanford University, the high temple of the engineering mindset. He knows what he is talking about! Engineers seek to “optimize” for a specific, measurable outcome without regard to social ramifications. Thus, for example, algorithms designed to give social media users engaging content wind up loading news feeds or search results with content that triggers outrage, hatred or fear. Engagement, measured by clicks or time spent on the site, climbs exponentially as a result, but at an enormous social cost.

Reich believes that the solutions lie in tempering the optimization mindset with regulations that weigh a technology’s social costs against its effectiveness, much as stop signs moderate optimal traffic flow in the interests of safety. Listen and judge for yourself. His ideas require political resolve to execute, to be sure. But the need is urgent. Democracy is at stake.

Website - free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien: soundsapien.com
Alliance for Trust in Media: alliancefortrust.com
