GCR Policy Newsletter (3 October 2022)
Differential technology development, space governance, nuclear risk, existential climate risk
This twice-monthly newsletter highlights the latest research and news on global catastrophic risk. It covers policy efforts around the world to reduce these risks and policy-relevant research from the field of existential and global catastrophic risk studies.
GCR in the media
“All these years later, the COVID-19 pandemic reminded all of us about the ramifications of our nation and the world still not being prepared to act decisively with a well-planned federal response to global catastrophic events. The human suffering and the staggering economic costs should be a clarion call for global catastrophic risk strategies to become an actionable, and properly funded, national priority.” 9/11 anniversary underscores unfinished business at DHS
“Humanity tends to lack a long-term perspective because there has been little in our evolutionary history that rewards such thinking. Long-term strategies can help avert existential threats we create ourselves, such as climate change, lab-engineered viruses and artificial intelligence.” Don’t fear nuclear war – a killer plague or rogue AI are more likely to end humanity
“With new evidence that catastrophic climate-change ‘tipping points’ are nearing - from surging sea levels as polar ice melts to spiking temperatures as methane escapes thawing permafrost - scientists are quietly planning for the unthinkable. ‘Extreme climate change risks are under-explored,’ Luke Kemp, a researcher with the Centre for the Study of Existential Risk at the University of Cambridge, warned at a pioneering conference on the theme at the University of Exeter this week.” As climate 'tipping points' near, scientists plan for unthinkable
“Written by an award-winning historian of science and technology, Planet in Peril describes the top four mega-dangers facing humankind – climate change, nukes, pandemics, and artificial intelligence. It outlines the solutions that have been tried, and analyzes why they have thus far fallen short. These four existential dangers present a special kind of challenge that urgently requires planet-level responses, yet today's international institutions have so far failed to meet this need. The book lays out a realistic pathway for gradually modifying the United Nations over the coming century so that it can become more effective at coordinating global solutions to humanity's problems.” Planet in Peril: Humanity's Four Greatest Challenges and How We Can Overcome Them
Latest policy-relevant research
Leveraging differential technology development
Technologies that reduce risks from other technologies, or that constitute low-risk substitutes, may be particularly promising ways to mitigate potential catastrophic risks from emerging technologies, according to researchers at the University of Oxford. This responsible innovation principle - “differential technology development” - calls for leveraging risk-reducing interactions between technologies by affecting their relative timing. The principle could inform government research funding priorities and technology regulation, as well as philanthropic research and development funding and corporate social responsibility measures. (8 September 2022)
Policy comment: Governments have various tools for either delaying risk-increasing technologies or promoting the development of risk-reducing ones. These tools - financial, regulatory, legal, normative - are generally well-established government mechanisms, and the paper provides a very useful overview of each and how it could be applied. The first steps for governments will be to develop a process for prioritising which risk-increasing technologies to focus on, determining when the differential technology development principle applies, and deciding which mechanisms are most applicable or impactful. A clear decision-making framework and process will also help clarify the timing, scale and direction of any intervention.
Slowing down the development of AI is likely to reduce the risk of existential catastrophe, even when accounting for knock-on effects on other existential risks, according to a new pre-publication article by a fourth-year computer science student. An aligned AGI would reduce the risks of disease, war and environmental catastrophe, so slowing down AI development increases these risks. However, since the risks of AI misalignment are considerably higher, slowing down AI development is net positive. To reduce total existential risk, humanity should take robustly positive actions such as working on existential risk analysis, AI governance and safety, and reducing all sources of existential risk by promoting differential technological development. (30 August 2022)
Policy comment: Governments should approach global catastrophic risk in a holistic manner and develop a better understanding of how these risks interact. This piece is a simple but useful start to assessing the trade-off between reducing risks from AI and reducing other GCRs, and it demonstrates the importance of viewing GCR holistically: reducing one risk can exacerbate others. When it comes to AI risks in particular, policymakers should consider the relative effectiveness of policies that slow down AI development and those that improve safety practices in the field. Slowing down AI development is only useful insofar as it translates into a total reduction in the risk of global catastrophe.
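To make the net-positive claim concrete, here is a minimal sketch of the underlying risk arithmetic. All probabilities are purely hypothetical assumptions chosen for illustration; neither the function nor the figures are drawn from the article.

```python
# Illustrative sketch of the trade-off described above.
# All probabilities are hypothetical assumptions, not figures from the article.

def total_risk(p_misalignment: float, p_other: float) -> float:
    """Probability of at least one existential catastrophe,
    treating the two risk sources as independent."""
    return 1 - (1 - p_misalignment) * (1 - p_other)

# Baseline: fast AI development (assumed numbers).
baseline = total_risk(p_misalignment=0.10, p_other=0.02)

# Slowed AI development: misalignment risk falls substantially, while
# other risks rise slightly because an aligned AGI that could help
# mitigate them arrives later.
slowed = total_risk(p_misalignment=0.05, p_other=0.03)

print(f"baseline total risk: {baseline:.3f}")  # about 0.12
print(f"slowed total risk:   {slowed:.3f}")    # about 0.08
```

Under these assumed numbers, the fall in misalignment risk outweighs the small rise in other risks, so slowing development lowers total existential risk - the article’s conclusion, conditional on misalignment risk being considerably larger than the risks an aligned AGI would mitigate.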
Overcoming biases around nuclear risk
Rather than focusing on the low probability of nuclear war in the short run, analysts, policy-makers and activists should emphasise its long-term inevitability, according to a new academic publication in Global Policy. Much of the expected cost of nuclear weapons is externalised to future generations, while present-day possessors capture the lion’s share of their benefits. Governments should explore forms of deterrence that would limit the risk and extent of nuclear winter. Proposals for international reform should emphasise strategies for achieving a peace durable enough to allow political or technological change that mitigates the threat of nuclear weapons. International relations research should prioritise the study of stable peace, and governments should prioritise its funding. (22 August 2022)
Policy comment: A key policy challenge for reducing nuclear risk is that both policy-makers and the public have little reference point for nuclear catastrophe. Nuclear war was a highly salient issue during the Cold War, but as that memory has faded, nuclear disarmament has become an extremely challenging policy goal. The recent conflict between Russia and Ukraine has once again made the risk more real - a rare opportunity to shape nuclear risk issues on the domestic and international stage. Full disarmament might still be unrealistic, so greater focus could go towards driving differential technological development in the nuclear domain, reducing geopolitical tensions and increasing resilience to nuclear winter.
Reforming space governance
The current international space governance framework has proven unsuitable for regulating emerging and future space activities, according to a detailed publication by Carson Ezell of the Space Futures Initiative. Rapid technological progress in the outer space domain has led to increasingly fragmented, less inclusive and less effective multilateral institutions. Unless space governance frameworks are improved, risks associated with space exploration will continue to increase. Four key reforms to existing space frameworks - shared infrastructure, horizon scanning, a conflict resolution mechanism, and a verification agency - would effectively shorten cycles between policy implementation and feedback, address regulatory gaps, and promote inclusivity in the governance process. (27 August 2022)
Policy comment: The nexus of space and global catastrophic risk makes space governance a particularly challenging policy issue. Space is an important ingredient in GCR: a potential source of risk (such as near-Earth objects), a factor that drives other risks (such as weapons of mass destruction), and a factor that drives resilience (such as interplanetary colonisation). Transformative reforms to space governance are needed, but nation states and private industry are increasingly competing over this domain, raising both the stakes and the urgency of the problem. Starting small will be key. For example, bolstering Dubai’s Courts of Space and supporting the courts’ efforts to develop a Space Dispute Guide could be a discrete way to kickstart innovative and adaptive governance arrangements.
Framing existential climate risk
Although climate change is widely recognised as a major risk to societies and natural ecosystems, existential climate risk is poorly framed, defined and analysed in the scientific literature, according to a recent paper by Huggel et al. To better frame existential risks in the context of climate change, the risks should be defined as threats to the existence of a subject - where the subject can be an individual person, a community, a nation state or humanity - and graded by two levels of severity: threats to physical life and threats to basic human needs. Explicitly including this framework in climate analysis would help in risk assessments and in risk management strategies and actions. (12 September 2022)
Policy comment: The new definition of existential risk in this paper - covering everything from risk to a single human life up to humanity-level extinction - mostly muddies the waters on existential and global catastrophic risk, an already knotty term for policy-makers. The tail risk of climate change is a badly understudied part of the field and falls outside the IPCC’s line of sight; expanding the definition of existential risk to cover lower-level risks does not solve this problem. Studying catastrophic harm to humanity from climate change deserves special attention, due both to the analytical challenge of assessing low-likelihood risks and to the communication challenge of raising alarm without sounding alarmist or driving apathy. National governments and the IPCC could more directly support research evaluating the existential risk of climate change. In particular, it will be important to identify where policy approaches to worst-case scenarios might differ from those addressing baseline scenarios.
Policy database updates
The GCR Policy team continues to collect policy ideas put forward by the field of existential and global catastrophic risk studies. The database currently contains over 800 ideas, of which over 300 are listed publicly here. We will continue to update the database with the latest research and policy ideas. Contact us for more information or access to the full database.
This newsletter is prepared by the team of GCRpolicy.com. Subscribe for twice-monthly updates on the latest policy news and research on global catastrophic risk.