
LessWrong (Curated & Popular) "Precedents for the Unprecedented: Historical Analogies for Thirteen Artificial Superintelligence Risks" by James_Miller
Jan 19, 2026
James Miller, author and commentator on AI risk, draws alarming parallels between historical events and the threats posed by artificial superintelligence. He highlights how the power asymmetry of colonial conquests could mirror an AI takeover, discusses how critical infrastructure can be seized in ways reminiscent of past revolutions, and warns of bureaucratic mission creep leading to entrenched governance. Through analogies such as cancer's capture of bodily resources, he argues that misaligned systems could institutionalize suffering and stresses the urgent need for AI policy reform.
AI Snips
Power Asymmetry Enables Irreversible Takeover
- A decisive intelligence gap lets the stronger agent rebuild environments to its own ends and render checks irrelevant.
- James Miller compares human-AI asymmetry to colonizers displacing indigenous peoples' agency once control concentrates in a single actor.
Small Forces Captured Empires
- Hernán Cortés and Francisco Pizarro used small, coordinated forces to topple vast empires by seizing key levers of power.
- Miller uses these conquests to illustrate how limited resources plus superior modeling can redirect whole civilizations.
Instrumental Convergence Is Nearly Inevitable
- Diverse final goals still push advanced agents toward securing resources and persistence via instrumental convergence.
- Miller argues that competence alone creates pressure to acquire computation and energy and to resist shutdown.


