Global Shield Briefing (February 2026)
What the Doomsday Clock and annual global risk reports are telling us
The latest policy, research and news on global catastrophic risk (GCR).
The start of every year typically brings a swath of global risk lists and assessments. Scientists, think tanks, risk specialists, geopolitical consultants and insurance companies each have their own view. Regardless of the methodology, the message is clear: 2026 is a year of disruption, complexity and uncertainty. Policymakers will need to remain agile and adaptable to the times.
In the spirit of adaptability, we are tweaking our regular briefing. You’ll still receive our insights into the latest global catastrophic risk developments from around the world. We’ll also share a little more about what Global Shield has been up to and what we’re tracking in the coming months.
What we’ve been doing
Global Shield has had a busy start to the year. To support our global growth, we have been hiring for a Director of NATO Policy, a US Policy Manager and an Operations Associate. The Operations Associate role is still open! The role will help manage and grow the operational infrastructure of Global Shield’s current and future offices. It is a role for a natural builder: someone who wants to help a young organization grow and thrive, who enjoys helping others achieve their best, and who has grand ambitions and is ready to roll up their sleeves to achieve them. If this sounds like you, apply now.
Australia
Australia continues to provide opportunities to advance policies that reduce global catastrophic risk. Global Shield Australia’s recent focus has been supply chain resilience and AI risk in the context of Australia’s foreign and trade policy. Global Shield Australia made a submission to the Southeast Asia Free Trade Agreement Modernisation Review outlining how Australia can use trade agreements to strengthen supply chain resilience and build a trusted AI trade agenda with ASEAN partners. We are also engaging the government on the development of an Australian Government Strategy for International Engagement and Regional Leadership on Artificial Intelligence, as outlined in the National AI Plan released in December 2025.
United States
The US office is hard at work on the reauthorization and modernization of the Defense Production Act (DPA). The DPA was given a one-year extension under the 2025 National Defense Authorization Act (NDAA). We continue to advocate for a full five-year reauthorization, along with amendments that would strengthen Congressional oversight and Executive Branch implementation.
Policy
Our Policy function has been diving deeper into how governments can plan for and respond to major AI incidents, or “AI crises”. An AI incident reaches crisis level when it causes wide-ranging harm across multiple sectors or jurisdictions and severely disrupts a country’s economy, security or society. As malicious actors become increasingly able to use AI to conduct attacks, and as AI becomes increasingly embedded in critical infrastructure, such crises become more likely and more harmful. We are investigating a range of possible policy responses.
What we’ve been tracking
Keeping an eye on global risk indicators
The Bulletin of the Atomic Scientists’ Science and Security Board has set the Doomsday Clock to 85 seconds to midnight, the closest the Clock has ever been to midnight. Alexandra Bell, president and CEO of the Bulletin of the Atomic Scientists, said: “The Doomsday Clock’s message cannot be clearer. Catastrophic risks are on the rise, cooperation is on the decline, and we are running out of time. Change is both necessary and possible, but the global community must demand swift action from their leaders.”
The World Economic Forum released its Global Risks Report 2026, based on a survey of 1,300 global leaders and experts across academia, business, government, international organizations and civil society to gauge perceptions of global risk. The report warns that “global risks continue to spiral in scale, interconnectivity and velocity, 2026 marks an age of competition. As cooperative mechanisms crumble, with governments retreating from multilateral frameworks, stability is under siege”. According to the report, 50 percent of respondents viewed the risk of global catastrophe as “looming” or “elevated” over a two-year time frame, up from 36 percent in last year’s report. Over a ten-year time frame, that figure sits at 62 percent, up from 57 percent last year. Only one in 10 viewed the global outlook as likely to be calm or stable over both time frames.
Other business-focused annual risk reports were also recently released, including:
Eurasia Group’s Top Risks for 2026, which highlighted the risk that “some AI companies will adopt extractive business models that threaten social and political stability” and that “water is becoming a loaded weapon in several of the world’s most dangerous rivalries”.
EY-Parthenon’s 2026 Geostrategic Outlook, which notes that “The world is entering 2026 amid a period of heightened uncertainty. The disruptive forces of transformation are increasingly non-linear, accelerated, volatile and interconnected”. It ranks sovereign AI and cyber conflicts, and water scarcity, as two of its Top 10 geopolitical developments for 2026.
Allianz’s Risk Barometer, which is based on the ratings of 3,338 risk management experts from 97 countries and territories. Cyber was the greatest concern, and artificial intelligence climbed to second, after being tenth in the previous year. Both cyber and AI ranked as top five concerns for companies in almost every sector. The report states that “As AI adoption accelerates and becomes more deeply embedded in core business operations, respondents expect related risks to intensify.”
The Stimson Center’s Ten Top Risks for 2026, which lists climate decline, a third nuclear era, and AI disruption as three major concerns.
Control Risks’ RiskMap 2026, which notes that “Natural disasters, infrastructure failures and inter-state tensions may not reach headlines, but their implications for people, supply chains, and security are often no less profound. Each straw is a potential breaking point.”
Policy comment: Policymakers must recognize that catastrophic risk is not only a national security challenge but also front and centre for businesses and corporate executives. The private sector is looking to policymakers for guidance, support and regulations on how to navigate treacherous times. In the meantime, these risk perceptions could impair capital allocation and investment, especially when corporate executives cannot be confident that their business operations, supply chains and infrastructure will be resilient to complex and systemic risk. Governments should act now to respond to increasing corporate alarm. They could coordinate and communicate with corporate executives to understand their greatest concerns, where they are most vulnerable, and how they might need support. Policymakers should share risk assessments and threat intelligence with boards and corporate executives, a practice already common in critical infrastructure sectors but one that could be expanded further. They could also help develop market mechanisms that increase preparedness and resilience in the private sector. Guidance for preparedness would also help, such as the recently released “In case of crisis or war: Preparedness for businesses” pamphlet circulated by Sweden’s government.
What we’re tracking next
Below are some events and activities over the following month worth keeping an eye on.
Nuclear
The nuclear arms control treaty New START expired on 4 February, leaving the US and Russia without a nuclear arms control treaty in force for the first time in more than 50 years. We’ll be following the policy statements and news reporting on how the major powers view this new era of nuclear diplomacy.
Risk governance
The OECD is inviting comments on the draft revised Recommendation on the Governance of Critical Risks. The Recommendation sets a standard for OECD member states to meet in their national risk governance and management.
Artificial intelligence
India is hosting the AI Impact Summit 2026 on February 16-20, following the AI Action Summit 2025 in France and the AI Seoul Summit in 2024. AI risk and safety are playing a less prominent role in these summits than in earlier iterations.
Space
We’re excited by Artemis II, the NASA mission that will send a crew around the Moon, which was scheduled for mid-February but has been delayed until at least early March. The mission will test the Orion spacecraft and its life support systems with crew aboard before a Moon landing is attempted on Artemis III, which would occur no earlier than 2028.
This briefing is a product of Global Shield, the world’s first and only advocacy organization dedicated to reducing global catastrophic risk from all hazards. With each briefing, we aim to build the most knowledgeable audience in the world when it comes to reducing global catastrophic risk. We want to show that action is not only needed, it’s possible. Help us build this community of motivated individuals, researchers, advocates and policymakers by sharing this briefing with your networks.

