AI in military: Humans, not AI, should control nuclear weapons, agree 100 nations; sign agreement

AI and nuclear decisions

At the two-day global summit on artificial intelligence (AI) in the military domain, nearly 100 countries, including the United States, China, and Ukraine, agreed that humans—not AI—should make critical decisions regarding the use of nuclear weapons. The nations have signed a non-binding agreement to that effect.

The REAIM summit

The two-day 'Responsible AI in the Military Domain (REAIM)' summit held in Seoul wrapped up with a non-binding declaration called the "Blueprint for Action." It emphasises the necessity of maintaining human control in decisions concerning nuclear weapons deployment.

The non-binding agreement says it is essential to "maintain human control and involvement for all actions... concerning nuclear weapons employment".

It adds that AI applications in the military "must be applied in accordance with applicable national and international law".

"AI applications should be ethical and human-centric," it adds.

The summit also noted that there was a need for "further discussions... for clear policies and procedures".

However, the declaration stopped short of outlining sanctions or consequences for any violations of these principles.

Even though the declaration is not legally binding, China did not sign it.

Russia, a Chinese ally, was not invited due to its invasion of Ukraine.

Artificial intelligence in the military

AI is already being used in military operations for tasks like reconnaissance, surveillance, and analysis.

It also has the potential to autonomously select targets in the future, as made evident by an AI-based tool called "Lavender," which is reportedly being used by Israel in the war in Gaza against the Hamas militant group.

The Lavender system is said to mark suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ) as potential bombing targets, including low-ranking individuals.

The software analyses data collected through mass surveillance on most of Gaza's 2.3 million residents, assessing and ranking the likelihood of each person's involvement in the military wing of Hamas or PIJ.

Individuals are given a rating of 1 to 100, indicating their likelihood of being a militant.

As per reports, even though the AI system has an error rate of 10 per cent, its outputs were treated "as if it were a human decision".

(With inputs from agencies)

About the Author

Moohita Kaur Garg

Moohita Kaur Garg is a senior sub-editor at WION with over four years of experience covering the volatile intersections of geopolitics and global security. From reporting on global...Read More