Today the Doomsday Clock has been moved to 90 seconds to midnight, the closest it has ever been and a full 25% closer to midnight than it was set at any time during the Cold War. This reflects a global situation that is spiralling out of humanity’s control, in relation to the risk of nuclear and biological weapons, climate change, and disruptive technology. As researchers at the Centre for the Study of Existential Risk, we regrettably can only concur with this terrifying assessment of humanity’s existential predicament. Below we set out our own reasons for this conclusion and some of the ways that we will be working over the next year to try to improve the situation.
The Doomsday Clock has a simple message - that humanity can and must dramatically improve its collective capacity to govern itself and its more dangerous and complex activities if we are to survive. The risk involves complex interactions between longer-term trends - including climate change, disruptive technologies and social instabilities - alongside a more immediate deterioration in strategic relationships and limitations to the effectiveness of global governance. The world’s capacity to manage complex existential risk was weakened in 2022.
The Centre for the Study of Existential Risk takes a holistic and pluralist approach to these challenges. Our researchers cover a variety of hazards, their interactions and the vulnerabilities within our systems, seeking achievable transitions that can positively impact upon this capacity.
Nuclear weapons and strategic relations
Russia’s invasion of Ukraine and its leadership’s explicit warnings about the use of nuclear weapons are the most concerning development of the year. The dangers of nuclear use have risen to heights last seen 40 years ago. The war has also damaged the appetite for cooperation and global diplomatic capacity across all sectors. In particular, arms control is in crisis: the Trump Administration shredded many agreements with Russia, and the last remaining instrument, New START, runs out in three years.
On the other hand, increasing awareness of nuclear risks could spur action. The Doomsday Clock moved away from midnight in 1963, soon after the Cuban Missile Crisis, and the Non-Proliferation Treaty (NPT) was negotiated four years later. Many practitioners today are aware of the need to double down on new, effective instruments that tackle emerging complexities. CSER is working with governments on approaches to nuclear diplomacy, and to improve our understanding of the likely global climatic consequences of nuclear detonations over many cities.
The inexorable rise of global emissions continued, and COP27 demonstrated the paucity of global governance. Fossil fuel prices rose with the Russian invasion, and Europe sought alternative hydrocarbon sources, sometimes turning back to coal, and permitting or investing in new extraction. Significant uncertainties also remain about the mid- and long-term effects of other forms of pollution and the consequences of new extractive industries.
Researchers at CSER have been examining some of the riskier realistic scenarios connected with climate change, and have recently published research warning of extreme consequences that could deeply damage or destroy society.
Our energy vulnerability has also underscored the need for, and motivated, a quicker pace towards net zero. We see the green shoots of increased government funding for the green transition. Most notably, the Inflation Reduction Act (IRA) and other US bills passed this year have committed over $600 billion to the transition.
The COVID-19 pandemic continues into 2023, and it remains unclear how well governments will integrate the lessons they had a chance to learn during its early years. Unequal access to vaccines threatens a two-tiered recovery from the pandemic, with far-reaching consequences not only for global health and wellbeing but also for international cooperation and trust. In the US, Operation Warp Speed came to a symbolic end with the resignation of the official who ran vaccine distribution under the early Biden administration. More fundamentally, calls for a $60 billion investment in pandemic preparedness have gone unanswered. China’s zero-Covid strategy came to a shambolic end with too few people vaccinated. Pandemic preparedness must also include a greater understanding of how and why diseases emerge, and of the impact of human activities such as habitat encroachment and deforestation - these aspects of emerging infectious disease remain misunderstood and underestimated.
The Biological Weapons Convention (BWC) still has no verification protocol, and progress at the recent Review Conference was hampered by political disagreement. Nevertheless, in the context of rock-bottom expectations, the 50th anniversary BWC meeting ended with limited agreement. This is complemented by work at the civil-society level that CSER researchers are doing on strengthening other kinds of governance, such as the establishment of norms and practice around responsible research and innovation: there are many levels at which governance can occur, and the fact that these can usefully co-exist should give us hope.
Some new technologies, like mRNA vaccines for COVID and other diseases such as malaria, also give us some grounds for hope. These technologies need to be developed and deployed for the benefit of all, with responsible, ethical and equitable principles at their heart.
Though present, cyber-conflict has had less impact on the course of the Russia-Ukraine war than expected. There have been attempts at cyber-disruption, from fake news spread by Russian media and bots to attempts at using deep-fakes. The most impactful use of cyber capabilities may have been Western espionage on Russian communications, fed to the Ukrainians.
Ten years ago, AlexNet broke all records at image recognition and showed what deep neural networks could do. The past decade has seen continual technological breakthroughs. This became tangible to the public in 2022, with the ability to produce your own AI art through Midjourney or talk to a remarkable chatbot through ChatGPT. The US and its allies moved to increase export controls and take other actions to effectively cut off China’s ability to buy or build the latest, high-end chips. This is a monumental shift in supply chains, and it is hard to predict all the impacts it might have. It may slow overall AI progress, and could offer breathing space to take sensible steps, such as establishing AI regulation.
However, positive moves have come from both sides of the Atlantic getting serious about the challenges of ensuring that AI systems used in high-risk situations are safe and secure. The EU is finalising its AI Act, and the US is forging forward with NIST standards.
Looking ahead to 2023
In general, we need international cooperation to reduce these global risks, which threaten every single country. Such cooperation is harder in times of war, conflict and great-power competition. And yet previous generations faced tensions just as difficult, and were still able to act with some degree of wisdom and prudence and to cooperate in their shared interests. We must learn to do the same.
We also need to act collectively towards justice as well as safety; existential risk affects every human, yet responsibility for this risk is not so equally spread. Indeed many of those who are currently most negatively impacted by war, climate change, and technological harms are among the least responsible for these things and stand to benefit the least from the economic and political institutions that are driving them. Standing in solidarity with all who suffer as a result of existential hazards and holding those with outsized responsibility to account while also shining a light on the systems of oppression, extraction, and privilege that support this status quo is a vital aspect of moving towards a safer world.
The Centre for the Study of Existential Risk works to understand risks with the potential to bring about human extinction - and to develop approaches to managing and mitigating them. While we start 2023 with significant concern about many of the issues raised in the Bulletin’s announcement about the Doomsday Clock, we also look forward to opportunities to move humanity towards a safer situation. The UN Secretary-General recently highlighted a path forward with his ‘Our Common Agenda’ report, calling for sustainable development to be risk-informed and emphasising the UN’s opportunity to address global risks.
CSER has an increasingly broad network of contacts across global decision makers, and we look forward to working with them to make use of this opportunity. We are also increasingly developing our own approaches to risk governance at the global level, from the Stepping Stones approach to supporting nuclear disarmament, to proposals for a global treaty on Artificial Intelligence. And we continue to take the long view of our own work, ensuring that lessons are learned from the recent string of crises affecting all of us to build towards a safer world. At a time when the world is closer to midnight than it has ever been before, it is worth reflecting that the darkest hour may come before the dawn.
The very periods when the Bulletin of the Atomic Scientists judged humanity to be at its safest often came shortly after global crises like the ‘second Cold War’ (whose resolution saw the clock move to 17 minutes to midnight in 1991) or the Cuban Missile Crisis (which spurred leaders into drastic action that moved the clock back to 12 minutes to midnight in 1963). Let us hope that the same will be true for the 2020s as well.
For media enquiries, please contact email@example.com
- "Does the Doomsday Clock Actually Mean Anything?" in Science Inverse
- "Daily Cuts - Doomsday Clock" in CNA
- "Doomsday clock may tick closer to midnight, as experts warn of nuclear war" in WION News
- "What is the 'Doomsday Clock'?" Reuters