
EA Forum Podcast (Curated & popular)

“Where I Am Donating in 2024” by MichaelDickens

Dec 7, 2024
01:51:48

Summary

It's been a while since I last put serious thought into where to donate. Well, I'm putting thought into it this year, and I'm changing my mind on some things.

I now put more priority on existential risk (especially AI risk), and less on animal welfare and global priorities research. I believe I previously gave too little consideration to x-risk for emotional reasons, and I've managed to reason myself out of those emotions.

Within x-risk:

  • AI is the most important source of risk.
  • There is a disturbingly high probability that alignment research won't solve alignment by the time superintelligent AI arrives. Policy work seems more promising.
  • Specifically, I am most optimistic about policy advocacy for government regulation to pause/slow down AI development.

In the rest of this post, I will explain:

  1. Why I prioritize x-risk over animal-focused [...]

---

Outline:

(00:04) Summary

(01:30) I don't like donating to x-risk

(03:56) Cause prioritization

(04:00) S-risk research and animal-focused longtermism

(05:52) X-risk vs. global priorities research

(07:01) Prioritization within x-risk

(08:08) AI safety technical research vs. policy

(11:36) Quantitative model on research vs. policy

(14:20) Man versus man conflicts within AI policy

(15:13) Parallel safety/capabilities vs. slowing AI

(22:56) Freedom vs. regulation

(24:24) Slow nuanced regulation vs. fast coarse regulation

(27:02) Working with vs. against AI companies

(32:49) Political diplomacy vs. advocacy

(33:38) Conflicts that aren't man vs. man but nonetheless require an answer

(33:55) Pause vs. Responsible Scaling Policy (RSP)

(35:28) Policy research vs. policy advocacy

(36:42) Advocacy directed at policy-makers vs. the general public

(37:32) Organizations

(39:36) Important disclaimers

(40:56) AI Policy Institute

(42:03) AI Safety and Governance Fund

(43:29) AI Standards Lab

(43:59) Campaign for AI Safety

(44:30) Centre for Enabling EA Learning and Research (CEEALAR)

(45:13) Center for AI Policy

(47:27) Center for AI Safety

(49:06) Center for Human-Compatible AI

(49:32) Center for Long-Term Resilience

(55:52) Center for Security and Emerging Technology (CSET)

(57:33) Centre for Long-Term Policy

(58:12) Centre for the Governance of AI

(59:07) CivAI

(01:00:05) Control AI

(01:02:08) Existential Risk Observatory

(01:03:33) Future of Life Institute (FLI)

(01:03:50) Future Society

(01:06:27) Horizon Institute for Public Service

(01:09:36) Institute for AI Policy and Strategy

(01:11:00) Lightcone Infrastructure

(01:12:30) Machine Intelligence Research Institute (MIRI)

(01:15:22) Manifund

(01:16:28) Model Evaluation and Threat Research (METR)

(01:17:45) Palisade Research

(01:19:10) PauseAI Global

(01:21:59) PauseAI US

(01:23:09) Sentinel rapid emergency response team

(01:24:52) Simon Institute for Longterm Governance

(01:25:44) Stop AI

(01:27:42) Where I'm donating

(01:28:57) Prioritization within my top five

(01:32:17) Where I'm donating (this is the section in which I actually say where I'm donating)

The original text contained 58 footnotes which were omitted from this narration.

The original text contained 1 image which was described by AI.

---

First published:
November 19th, 2024

Source:
https://forum.effectivealtruism.org/posts/jAfhxWSzsw4pLypRt/where-i-am-donating-in-2024

---

Narrated by TYPE III AUDIO.

---

