Anna Fleck for Statista

“Doomsday Clock” ticks closer to midnight - Statista

The Doomsday Clock measures how close we are to a man-made catastrophe, and the experts have just moved the hands forward from 89 seconds to midnight to 85 seconds. With a key Cold War-era missile agreement due to expire in February, there will be nothing to prevent a new arms race from kicking off.
January 28, 2026

The hands of the symbolic Doomsday Clock now stand at 85 seconds to midnight - closer to global catastrophe than ever before, Statista reports.

The Doomsday Clock, also known as the Nuclear War Clock, represents how close humanity is veering towards a global disaster of its own making. According to the Bulletin of the Atomic Scientists, the decision to move the clock forward from 89 to 85 seconds reflects escalating threats from nuclear weapons, accelerating climate change and the potential misuse of emerging technologies.

As Steve Fetter, a member of the Bulletin's Science and Security Board, explains: “In every area, we have failed to take the steps that are necessary to reduce risks and there are new developments in every area that make the risks greater.”

At the announcement of the new setting, Bulletin representatives highlighted how the last remaining treaty governing nuclear weapons stockpiles between the United States and Russia is due to expire next week. This means that for the first time in over half a century, there will be nothing preventing a “runaway” nuclear arms race. The scientists also voiced concern about the global rise of autocracies and how AI is “supercharging” mis- and disinformation.

Artificial intelligence is believed to be capable of undermining nuclear deterrence in several ways. According to the Future of Life Institute, an international think tank focused on existential risks from transformative technologies, these include an increased risk of error stemming from disinformation, as well as faster, AI-driven judgments that may leave little time to verify information before irreversible actions are taken.

Despite these warnings, the Bulletin stresses that the public has power beyond the ballot box. Citizens, it argues, can also take the opportunity to become more informed about these risks and to drive policy by showing political interest in their local districts.

Looking further back, the Doomsday Clock was first set below the two-minute mark (at "100 seconds" to midnight) in 2020. That shift cited the Covid-19 pandemic, advancing climate change, the spread of fake news and worsening geopolitical instability. Russia’s invasion of Ukraine later intensified these concerns, prompting researchers to advance the Doomsday Clock from 100 to 90 seconds (1.5 minutes) before midnight in early 2023.

As this chart shows, 1953 was also a year of heightened tension, when both the U.S. and the USSR had tested hydrogen bombs. Notably, however, the threat of catastrophe from the Cuban Missile Crisis (1962) is not reflected on the clock. This is likely because the issue of the Bulletin of the Atomic Scientists published nearest to the crisis came out in November/December 1962, after the immediate danger had largely subsided.

[Infographic: “Doomsday Clock” Ticks Closer to Midnight | Statista]

The US and UK are among 12 countries opposing a global ban on autonomous weapons, joined by Australia, Belarus, Estonia, India, Israel, Japan, North Korea, Poland, Russia and South Korea. 

Data compiled by Automated Decision Research, the monitoring and research team of Stop Killer Robots, finds that another 53 nations have yet to take a clear stance, while 127 countries, including most of Africa and Latin America, support the ban. These positions were recorded following discussions at UN General Assembly and Convention on Certain Conventional Weapons (CCW) meetings.

At present, there is no single law or legally binding treaty that bans the use of lethal autonomous weapons systems (LAWS), which have been used in conflict zones such as Ukraine and Libya. The International Committee of the Red Cross (ICRC) is calling for new international rules, citing humanitarian, legal and ethical concerns over the loss of human control in warfare. LAWS pose risks to both civilians and combatants and could escalate conflicts.

Though LAWS may use AI, it is not a requirement. The broader debate around military AI, however, also remains unresolved. Over the past few years, several initiatives have emerged to address military AI, but none are yet legally binding. In 2024, UN General Assembly resolution A/79/408 saw 166 countries support restrictions on LAWS, while Belarus, North Korea and Russia opposed it and 15 countries, including Ukraine, abstained. Meanwhile, two landmark intergovernmental frameworks are worth mentioning: the Political Declaration on Responsible Military Use of AI and Autonomy, an initiative launched by the U.S. and supported by over 60 nations, and the Responsible AI in the Military Domain (REAIM) Call to Action, endorsed by more than 50 countries. Both focus on ethical guidelines but are non-binding.

The UN Office for Disarmament Affairs has condemned LAWS as "politically unacceptable and morally repugnant," and UN Secretary-General António Guterres has called for their prohibition under international law.

[Infographic: Twelve Countries Say No to Banning Autonomous Weapons | Statista]
