Clean energy research, and advocacy for it, is a particularly valuable approach to tackling climate change. Investing in clean energy technologies can change the global decision calculus by making renewable energy more accessible and affordable than fossil fuels. The gap between current investment in clean energy research and its potential benefits and existing successes points to strong potential for impact in this area.
There is great value in focusing on scientific and technological research for addressing climate change. By attending to areas that are less politically salient and leveraging the resources of governments and organisations, we can advance clean energy technologies, develop innovative solutions, and overcome climate-related challenges. Investing in research and development to find more sustainable and efficient energy sources can have substantial long-term benefits for mitigating climate change.
Advocacy for policy change is another crucial aspect of addressing climate change. By influencing governments and policymakers, we can prioritize climate action, promote renewable energy initiatives, and implement effective regulations and incentives. Aligning political will and public support is pivotal in driving transformative changes and accelerating the transition to a low-carbon economy.
Nuclear energy has the potential to provide carbon-free power and help combat climate change, but strict regulatory burdens and safety standards are making new plants unaffordable to build. Current regulations demand levels of safety far beyond those applied to polluting fossil fuels, resulting in continued reliance on dirtier energy sources. To use nuclear energy effectively in fighting climate change, it is crucial to shift public opinion and anti-nuclear activist groups towards a more pro-climate, pro-nuclear stance, and to find ways to address the regulatory barriers.
The risk of nuclear weapons has evolved over the years, with early concerns ranging from atmospheric ignition to radioactive fallout. Fear of a global nuclear holocaust was prominent during the early stages of the Cold War, but the risk has since become better understood and managed. Despite the dangers, historical evidence suggests that nuclear deterrence has largely succeeded in preventing nuclear war, with instances of de-escalation observed among various nuclear powers. However, the risk of accidents and misunderstandings remains, emphasising the ongoing need for global efforts to reduce nuclear arsenals and promote disarmament.
The risks associated with biosecurity and bioweapons are significant, particularly with advances in technology and potential accidental releases. History reveals instances of accidental leaks from bioweapons programs and laboratories, which have led to localized outbreaks of diseases. The risks associated with gain-of-function research, where new viruses are engineered, further highlight the need for robust safety measures. It is crucial to prioritize comprehensive lab safety and containment protocols, improve surveillance for early detection of new pathogens, and focus on research avenues that directly address pandemic response and prevention, such as genetic sequencing and vaccine development.
Through the analysis of past threats and potential risks, it is evident that there have been cases where false alarms or exaggerated claims have been made about the likelihood and severity of certain disasters or threats. This highlights the need for a more discerning approach in evaluating the relative magnitude of different risks and prioritizing interventions accordingly.
The exploration of historical threats and discoveries reveals that there has been a relatively stable set of major risks identified over time, including nuclear weapons, biological weapons, climate change, and civilization destabilization. While there have been some advancements in understanding and certain risks have been ruled out, the core concerns have remained consistent, validating the importance of ongoing efforts in these areas.
The analysis also highlights the uncertainty and challenges in assessing the risks and impacts of emerging technologies, such as artificial intelligence. While the general outline of the risks associated with AI has been recognized, further research and understanding are needed to more accurately determine the likelihood and potential severity of these risks. This underscores the ongoing need for careful study and measurement of emerging technologies.
This century is particularly pivotal because it is expected to see an outsized share of technological progress. The rate of technological advancement has been accelerating, raising the probability that transformative technologies, such as advanced artificial intelligence (AI), will be developed. These technologies could cause enormous changes, ranging from human extinction to the creation of a stable society that sets the course for the long-term future. Preemption also plays a role: early development of these technologies can foreclose later opportunities. Given the expected rate of technological progress and the potential for transformative events, this century is therefore deemed especially important in shaping the future.
The large population expected in the future also contributes to the importance of this century. With more people alive later, per capita influence decreases, making it harder for any individual to have a significant impact; acting now therefore maximises one's influence over the future. In addition, this century is expected to resolve many uncertainties about humanity's long-term prospects: accelerating technological progress and potential transformative events will shed light on the possibilities and constraints ahead. Taking strategic measures in this century is thus vital in anticipation of these resolutions.
While some argue that we may not be living at an especially pivotal time in history, there are responses to such objections. One possibility is that comparable advancements will occur in subsequent centuries instead. However, the empirical evidence and analysis point to a higher probability of transformative events happening this century, given the rapid rate of technological progress, the potential for preemption, and the expected resolution of uncertainties. Stagnation or collapse could forestall transformative events, but the evidence suggests these scenarios are less likely. This century therefore emerges as an important and influential period in shaping the future.
Rebroadcast: this episode was originally released in October 2021.
Preventing the apocalypse may sound like an idiosyncratic activity, and it sometimes is justified on exotic grounds, such as the potential for humanity to become a galaxy-spanning civilisation.
But the policy of US government agencies is already to spend up to $4 million to save the life of a citizen, making the death of all Americans a $1,300,000,000,000,000 disaster.
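The arithmetic behind that figure can be sketched in a few lines. This is a minimal illustration, assuming a US population of roughly 330 million; the exact population figure is not stated in the text:

```python
# Back-of-the-envelope cost of a disaster that kills every American,
# using the ~$4 million-per-life figure cited for US agency policy.

VALUE_PER_LIFE = 4_000_000       # USD agencies spend to save one life (from the text)
US_POPULATION = 330_000_000      # assumed approximate US population

total_cost = VALUE_PER_LIFE * US_POPULATION
print(f"${total_cost:,}")        # on the order of $1.3 quadrillion
```

Rounding the product down gives the $1,300,000,000,000,000 figure quoted above.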
According to Carl Shulman, research associate at Oxford University’s Future of Humanity Institute, that means you don’t need any fancy philosophical arguments about the value or size of the future to justify working to reduce existential risk — it passes a mundane cost-benefit analysis whether or not you place any value on the long-term future.
Links to learn more, summary, and full transcript.
The key reason to make it a top priority is factual, not philosophical. That is, the risk of a disaster that kills billions of people alive today is alarmingly high, and it can be reduced at a reasonable cost. A back-of-the-envelope version of the argument runs:
This argument helped NASA get funding to scan the sky for any asteroids that might be on a collision course with Earth, and it was directly promoted by famous economists like Richard Posner, Larry Summers, and Cass Sunstein.
If the case is clear enough, why hasn’t it already motivated a lot more spending or regulations to limit existential risks — enough to drive down what any additional efforts would achieve?
Carl thinks that one key barrier is that infrequent disasters are rarely politically salient. Research indicates that extra money is spent on flood defences in the years immediately following a massive flood — but as memories fade, that spending quickly dries up. Of course the annual probability of a disaster was the same the whole time; all that changed is what voters had on their minds.
Carl suspects another reason is that it’s difficult for the average voter to estimate and understand how large these respective risks are, and what responses would be appropriate rather than self-serving. If the public doesn’t know what good performance looks like, politicians can’t be given incentives to do the right thing.
It’s reasonable to assume that if we found out a giant asteroid were going to crash into the Earth one year from now, most of our resources would be quickly diverted into figuring out how to avert catastrophe.
But even in the case of COVID-19, an event that massively disrupted the lives of everyone on Earth, we’ve still seen a substantial lack of investment in vaccine manufacturing capacity and other ways of controlling the spread of the virus, relative to what economists recommended.
Carl expects that all the reasons we didn’t adequately prepare for or respond to COVID-19 — with excess mortality over 15 million and costs well over $10 trillion — bite even harder when it comes to threats we’ve never faced before, such as engineered pandemics, risks from advanced artificial intelligence, and so on.
Today’s episode is in part our way of trying to improve this situation. In today’s wide-ranging conversation, Carl and Rob also cover:
Producer: Keiran Harris
Audio mastering: Ben Cordell
Transcriptions: Katy Moore