GCR Policy Newsletter (31 October 2022)
The policy and politics of global catastrophic risk research
This twice-monthly newsletter highlights the latest research and news on global catastrophic risk. It covers policy efforts around the world to reduce the risk, as well as policy-relevant research from the field of global catastrophic risk studies.
Policy efforts on GCR
On 18 October, President Biden set out the following vision in the National Biodefense Strategy and Implementation Plan: “The United States actively and effectively assesses, prevents, prepares for, responds to, and recovers from naturally occurring, accidental, and deliberate biological threats impacting humans, animals, plants, and the environment and creates a world free from catastrophic biological incidents.” The Nuclear Threat Initiative stated that “the plan’s focus on preventing global biological catastrophe, including through efforts to strengthen biosecurity and biosafety, is crucial. In particular, improving governance of bioscience and biotechnology is essential to effectively guard against deliberate and accidental misuse with potentially catastrophic global consequences.”
GCR in the media
“Homo sapiens have existed on the planet for about 300,000 years, or more than 109 million days. The most dangerous of all those days — the day when our species likely came closer than any other to wiping itself off the face of the Earth — came 60 years ago today, on October 27, 1962. And the person who likely did more than anyone else to prevent that dangerous day from becoming an existential catastrophe was a quiet Soviet naval officer named Vasili Arkhipov.” 60 years ago today, this man stopped the Cuban missile crisis from going nuclear
“From the ICC’s Paris headquarters, Alim, a statistician, now games out the most catastrophic scenarios that might hit the world for the rest of the century—mega-disasters that could collapse the global economy, eviscerate entire industries, kill tens of millions of people, and make the COVID-19 pandemic look like a trivial blip on the radar.” A crisis worse than COVID? This 30-year-old statistician is responsible for spotting the next mega-threat to global business
“In July, Matheny became CEO of the Rand Corporation, the venerable California-based policy think tank that funds research on technology, infrastructure, health care, energy, climate, and many other areas. He’s especially focused on preventing “truth decay” — the decreasing trust in facts and data within the American political debate — and how, across the board, this decay could hold back efforts to improve policy. He still prioritizes preventing technological catastrophe while remaining hopeful that technology can, if used cautiously, solve rather than cause more problems.” The Future Perfect 50: Jason Matheny
“AI risk is the specific cause that Cotra has devoted most of her time to thinking about lately. In 2020, she put out a report that aimed to forecast when we’ll most likely see the emergence of transformative AI (think: powerful enough to spark a major shift like the Industrial Revolution). The question of AI timelines is crucial for figuring out how much funding we should spend on mitigating risks from AI versus other causes — the closer transformative AI is to happening, the more pressing the need to invest in safety measures becomes.” The Future Perfect 50: Ajeya Cotra
“Among his proposals to counter existential risk from biosecurity are the Nucleic Acid Observatory, which would monitor the emergence of dangerous pathogens through metagenomic screening of wastewater and waterways. Then there’s more advanced preparation for pandemics before they happen, through better personal protective equipment and next-generation vaccines. And most novel of all is SecureDNA, an encrypted tool that would screen all synthetic DNA sequence orders to major gene vendors to prevent anyone from obtaining the genes needed to make a threatening pathogen.” The Future Perfect 50: Kevin Esvelt
Latest policy-relevant research
The policy implications of global catastrophic risk studies
Catastrophic climate change scenarios should be studied, and society should prioritize avoiding catastrophic outcomes, but history also shows the risks of overemphasizing the likelihood of calamity, according to a response to the August Perspective by Kemp et al. The response argues that Kemp et al. understate the degree to which recent scientific and public discourse already prioritizes catastrophic climate scenarios, and that overemphasized apocalyptic futures can be used to support despotism and rashness. In their reply, Kemp et al. disagree that catastrophic scenarios are already adequately or excessively studied, arguing that a lack of attention to extreme risks, or entirely speculative doom-mongering, would more likely lead to maladaptive responses and mental health stresses than the informed deliberation over catastrophic risks that they propose. (10 October 2022)
Policy comment: The policy benefits of studying catastrophic scenarios almost certainly outweigh the risks. But the benefits are best realised when that research provides clear and practical recommendations for policy-makers. For example, the study of catastrophic climate risk could help identify which policy measures should be taken for worst-case scenarios beyond those needed for baseline scenarios. Researchers should also be conscious of how their research could lead to poor policy-making or could misinform policy-makers and the public. Achieving policy impact is an important aspect of research and should be pursued wisely. Meanwhile, policy-makers should be more demanding customers of catastrophic risk studies. Governments could also improve their own understanding of global catastrophic risk - such as by bolstering risk assessment, futures and horizon-scanning, intelligence and warning, and science capability.
Scientists have some degree of aversion or indifference towards considering policy issues, according to an academic publication by GCR governance researchers Christopher Nathan and Keith Hyams. When scientists do engage with policy issues, they consider themselves to be doing so against a professional backdrop of indifference. Promising directions for future GCR governance include: incorporating GCR into existing incentivised norm structures (such as those codified and endorsed by funding bodies); finding ways to further reinvigorate the role of the public-facing scientist; and addressing the incentives towards short-termism in technology development. (28 October 2022)
Policy comment: In the current geopolitical, societal and economic environment, researchers and scientists must recognise how their work sits in a broader political context. Very little technological development - particularly on critical and frontier technologies such as AI and biotechnology - is apolitical. Academic and research organizations might wish to consider employing or engaging with political scientists, policy institutes and policy-makers early in the development of their research agendas. This collaboration would help identify the implications of their work for policy, and of policy for their work. Academics and policy-makers could also work together to develop a policy-relevant research agenda, such as the 80 questions for UK biological security that the University of Cambridge developed in collaboration with five UK government agencies.