[Image: The Doomsday Clock. Photo credit: Bulletin of the Atomic Scientists’ Science and Security Board]

The Doomsday Clock has been set to 85 seconds to midnight — the closest it has ever been to global catastrophe — as scientists warn that a “failure of leadership” and the rise of autocracy have left humanity dangerously vulnerable to nuclear war, unchecked artificial intelligence, and biological collapse.

The Bulletin of the Atomic Scientists’ Science and Security Board (SASB) announced the historic move today, advancing the hands from the 89 seconds to midnight at which they stood in 2025. The Board issued a blistering critique of the geopolitical landscape, citing the expiration of the New START nuclear treaty, a “war on renewable energy”, and the emergence of “mirror life” as critical threats.

“The Doomsday Clock’s message cannot be clearer. Catastrophic risks are on the rise, cooperation is on the decline, and we are running out of time,” says Alexandra Bell, president and CEO of the Bulletin of the Atomic Scientists. “Change is both necessary and possible, but the global community must demand swift action from their leaders.”

The collapse of shared reality

In a stark assessment of the global information ecosystem, the Bulletin identifies a crisis that underpins all others: the inability of nations to agree on basic facts. Maria Ressa, Nobel Peace Prize Laureate and Bulletin Board member, describes the current era as an “information Armageddon”.

“Without facts, there is no truth. Without truth, there is no trust. And without these, the radical collaboration this moment demands is impossible,” Ressa says. “We are living through an information Armageddon — the crisis beneath all crises — driven by extractive and predatory technology that spreads lies faster than facts and profits from our division.”

Ressa argues that this fracture prevents any meaningful resolution to physical threats. “We cannot solve problems we cannot agree exist… Nuclear threats, climate collapse, AI risks: none can be addressed without first rebuilding our shared reality. The clock is ticking.”

Nuclear escalation and the ‘Golden Dome’

The nuclear landscape has deteriorated into what the Bulletin describes as a “full-blown arms race”. The statement highlights the expiration of the New START treaty — ending nearly 60 years of nuclear restraint between the US and Russia — and plans for a new US missile defence system dubbed “Golden Dome”, which includes space-based interceptors.

Jon B. Wolfsthal, director of global risk at the Federation of American Scientists, warns that nuclear states are “reducing their own security” by pursuing coercion over deterrence.

“In 2025, it was almost impossible to identify a nuclear issue that got better,” Wolfsthal says. “More states are relying more intently on nuclear weapons… Hundreds of billions are being spent to modernise and expand nuclear arsenals all over the world.”

Wolfsthal urges leaders to relearn Cold War lessons: “No one wins a nuclear arms race, and the only way to reduce nuclear dangers is through binding agreement to limit the size and shape of their nuclear arsenals.”

Autocracy versus cooperation

Daniel Holz, PhD, chair of the SASB, points to the rise of nationalistic autocracies as a threat multiplier that is dismantling international trust.

“The dangerous trends in nuclear risk, climate change, disruptive technologies like AI, and biosecurity are accompanied by another frightening development: the rise of nationalistic autocracies in countries around the world,” Holz says. “Our greatest challenges require international trust and cooperation, and a world splintering into ‘us versus them’ will leave all of humanity more vulnerable.”

The Bulletin cites the rapid integration of AI into military command systems and the deregulation of safety standards as major accelerators of risk. Steve Fetter, professor of public policy at the University of Maryland, criticised the US administration’s revocation of AI safety initiatives.

“As uses of AI expand and concerns grow about potential risks, Trump revoked Biden’s AI safety initiative and banned states from crafting their own AI regulation, reflecting a ‘damn the torpedoes’ approach to AI development,” Fetter says. “The emphasis on technological competition is making it increasingly difficult to foster the cooperation that will be needed.”

Climate and biological tipping points

On the environmental front, the Bulletin reports that carbon dioxide levels have reached 150 per cent of pre-industrial levels, with global temperatures breaking records in 2024 and 2025. Inez Fung, ScD, professor emerita at UC Berkeley, criticises current policies as a “war on renewable energy”.

“Reducing the threat of climate catastrophe requires actions both to address the cause and to deal with the damage of climate change,” Fung says. “Many technologies for renewable energy are now mature and cost-effective, and governments should ramp up the wide deployment of these clean energy technologies.”

Meanwhile, the biological threat landscape has expanded to include “mirror life” — synthetic biological forms that could devastate ecosystems. Asha M. George, executive director of the Bipartisan Commission on Biodefense, warns of degraded response capacities.

“This year featured degraded capacity to respond to biological events, further development and pursuit of biological weapons, poorly restrained synthetic biology activities, increasingly convergent AI and biology, and the spectre of life-ending mirror biology,” George says. “Partnerships… will be key to managing these risks. With the right tools and determination, we need not fall prey to the diseases that threaten us.”
