Bipartisan US Bill Aims To Prevent AI From Launching Nuclear Weapons

"As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons – not robots," said co-sponsor Sen. Ed Markey.


In the name of "protecting future generations from potentially devastating consequences," a bipartisan group of U.S. lawmakers on Wednesday introduced legislation meant to prevent artificial intelligence from launching nuclear weapons without meaningful human control.

The Block Nuclear Launch by Autonomous Artificial Intelligence Act – introduced by Sen. Ed Markey (D-Mass.) and Reps. Ted Lieu (D-Calif.), Don Beyer (D-Va.), and Ken Buck (R-Colo.) – asserts that "any decision to launch a nuclear weapon should not be made" by AI.

The proposed legislation acknowledges that the Pentagon's 2022 Nuclear Posture Review states that current US policy is to "maintain a human 'in the loop' for all actions critical to informing and executing decisions by the president to initiate and terminate nuclear weapon employment."

The bill would codify that policy so that no federal funds could be used "to launch a nuclear weapon [or] select or engage targets for the purposes of launching" nukes.

"As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons – not robots," Markey asserted in a statement. "We need to keep humans in the loop on making life-or-death decisions to use deadly force, especially for our most dangerous weapons."

Buck argued that "while US military use of AI can be appropriate for enhancing national security purposes, use of AI for deploying nuclear weapons without a human chain of command and control is reckless, dangerous, and should be prohibited."

According to the 2023 AI Index Report – an annual assessment published earlier this month by the Stanford Institute for Human-Centered Artificial Intelligence – 36% of surveyed AI experts worry about the possibility that automated systems "could cause nuclear-level catastrophe."

The report followed a February assessment by the Arms Control Association, an advocacy group, that AI and other emerging technologies including lethal autonomous weapons systems and hypersonic missiles pose a potentially existential threat that underscores the need for measures to slow the pace of weaponization.

"While we all try to grapple with the pace at which AI is accelerating, the future of AI and its role in society remains unclear," Lieu said in a statement introducing the new bill.

"It is our job as members of Congress to have responsible foresight when it comes to protecting future generations from potentially devastating consequences," he continued. "That's why I'm pleased to introduce the bipartisan, bicameral Block Nuclear Launch by Autonomous AI Act, which will ensure that no matter what happens in the future, a human being has control over the employment of a nuclear weapon – not a robot."

"AI can never be a substitute for human judgment when it comes to launching nuclear weapons," Lieu added.

While dozens of countries support the Treaty on the Prohibition of Nuclear Weapons, none of the world's nine nuclear powers, including the United States, have signed on, and Russia's invasion of Ukraine has reawakened fears of nuclear conflict that were largely dormant since the Cold War.

Brett Wilkins is a staff writer for Common Dreams. Based in San Francisco, he covers issues of social justice, human rights, and war and peace. This originally appeared at Common Dreams and is reprinted with the author's permission.

27 thoughts on “Bipartisan US Bill Aims To Prevent AI From Launching Nuclear Weapons”

  1. Will other countries like China and Russia follow our lead or put the world on a razor's edge? I think that matters. Once ONE uses AI, ALL will.

  2. The fact that people in Congress are concerned about this means it’s already a big problem. Nuclear weapons shouldn’t exist in the first place, but letting AI anywhere near them takes this insanity to a new level.

  3. Russia's invasion of Ukraine has reawakened fears of nuclear conflict that were largely dormant since the Cold War.

    Horseshit. You and all the other idiots that sit around watching the US lol rekt everything they can on their way to Russia’s borders with nukes. Yeah it was Russia, dude. Totally…*passes joint*

  4. It was a human in Russia who prevented a nuclear exchange. Concerning AI, programming is paramount. Any garbage in will lead us to doom.

  5. A nice sentiment: AI can’t decide to launch nuclear weapons.
    In practice:
    “sir, in 99/100 simulations, our AI strategybot has advised that a nuclear strike is the optimal move here.”
    human decision maker: “prepare launch sequence.”

    And as to the statement: "Human Rights Watch and the International Human Rights Clinic of Harvard Law School argue that '[r]obots lack the compassion, empathy, mercy, and judgment necessary to treat humans humanely, and they cannot understand the inherent worth of human life.'"

    Can’t the same be said of any human willing to launch a nuclear strike?
