We speak with Rob Miles. Rob is the host of the “Robert Miles AI Safety” channel on YouTube, the single most popular AI alignment video series out there — he has 145,000 subscribers and his top video has ~600,000 views. His videos go much deeper than most educational resources on alignment, covering important technical topics like the orthogonality thesis, inner misalignment, and instrumental convergence.
Through his work, Robert has educated thousands on AI safety, including many now working on advocacy, policy, and technical research. His work has been invaluable for teaching and inspiring the next generation of AI safety experts and deepening public support for the cause.
Prior to his AI safety education work, Robert studied Computer Science at the University of Nottingham.
We talk to Rob about:
* What got him into AI safety
* How he started making educational videos for AI safety
* What he's working on now
* His top advice for people who want to do education & advocacy work, in any field but especially AI safety
* How he thinks the field of AI safety is currently going
* What he wishes more people were working on within AI safety
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Rob --
* Rob Miles AI Safety channel - https://www.youtube.com/@RobertMilesAI
* Twitter - https://twitter.com/robertskmiles
-- Further resources --
* Channel where Rob first started making videos: https://www.youtube.com/@Computerphile
* Podcast ep w/ Eliezer Yudkowsky, whose writings first convinced Rob to take AI safety seriously: https://lexfridman.com/eliezer-yudkowsky/
Recording date: Nov 21, 2023