Kamil Galeev, a London-based historian and former classmate of Jordan's at Peking University, explores the intricacies of military coups and political power dynamics in Russia. He discusses the Wagner Group's failed rebellion in 2023 and how it reshaped Kremlin views on internal threats. Kamil draws intriguing parallels between historical coups and modern political tensions, emphasizing the influence of victimhood mentality on foreign policy decisions. He also addresses the barriers facing China's rise to hegemony, making for a compelling analysis of global power struggles.
How does Russia prevent uprisings, and what can other authoritarians learn from Moscow’s methods of coup control?
For the second anniversary of the Wagner uprising, ChinaTalk interviewed London-based historian Kamil Galeev, who was also a classmate of Jordan’s at Peking University.
We discuss…
Why the Wagner Group rebelled in 2023, and why the coup attempt ultimately failed,
How Wagner shifted the Kremlin’s assessment of internal political challengers,
Similarities between post-Soviet doomerism and the American right,
Historical examples of foreign policy influenced by a victimhood mentality,
Barriers to Chinese hegemony.
Outro Music: Султан Лагучев - Любовь беда (YouTube Link)
Today’s post is brought to you by 80,000 Hours, a nonprofit that helps people find fulfilling careers that do good. 80,000 Hours — named for the average length of a career — has been doing in-depth research on AI issues for over a decade, producing reports on how the US and China can manage existential risk, scenarios for potential AI catastrophe, and the concrete steps you can take to help ensure AI development goes well.
Their research suggests that working to reduce risks from advanced AI could be one of the most impactful ways to make a positive difference in the world.
They provide free resources to help you contribute, including:
Detailed career reviews for paths like AI safety technical research, AI governance, information security, and AI hardware,
A job board with hundreds of high-impact opportunities,
A podcast featuring deep conversations with experts like Carl Shulman, Ajeya Cotra, and Tom Davidson,
Free, one-on-one career advising to help you find your best fit.