Who or what pulls the trigger?

A landmine is autonomous. However indiscriminately it selects its target, it requires no human intervention to detonate. Deadly weapons that operate without human intervention date back to spike-lined pits, but something new is on the horizon: weapons that are selective in their targets. Concern that artificial intelligence will facilitate a new arms race has mobilized prominent scientists to publish an open letter warning of the dangers.

MQ-9 Reaper (Photo by Ethan Miller, Getty Images)

What are autonomous weapons? Technology that selects its own targets to destroy. This goes beyond the proximity fuse of World War II, a secret the Allies guarded as closely as the atomic bomb. The fuse caused an anti-aircraft shell to detonate when it neared an airplane, but the shell was still aimed and fired by humans. In 2015, the U.S. military developed guided bullets; these so-called “smart bullets” could follow the target at which they were fired, but they, too, were aimed and launched by humans. Weaponized drones that can independently identify targets and attack them are, perhaps, the first true manifestation of autonomous weapons. The category, however, could grow to stretch our imaginations.

Why would we use autonomous weapons? For many of the same reasons that we automate anything: accuracy, speed, economy, scale. While humans are still better, and often faster, at recognizing patterns such as an appropriate target, technology is rapidly catching up. And technology has long been cheaper and easier to replicate than humans: train a robot to do something, then make copies.

One driver of autonomous technology is space exploration. Sending humans to other planets is both expensive and dangerous, so we send robotic probes instead. But the distances mean that even speed-of-light communication is slow: a signal takes minutes to reach Mars and over an hour to reach Saturn. So some decisions have to be made by the technology itself. Imagine driving a car and feeling the lane-separating bumps only an hour after your tires touch them.
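
To make that delay concrete, here is a minimal sketch, not part of the original essay, that estimates one-way light-travel time from Earth. The distances are rough assumptions for illustration; actual figures vary with orbital position.

```python
# Minimal sketch: estimate one-way signal delay from Earth at light speed.
# Distance values are approximate assumptions, for illustration only.

LIGHT_SPEED_KM_S = 299_792    # speed of light, km per second
KM_PER_AU = 149_597_871       # kilometres in one astronomical unit

# Approximate Earth-to-planet distances in astronomical units (AU).
APPROX_DISTANCE_AU = {
    "Mars (closest approach)": 0.38,
    "Mars (farthest)": 2.67,
    "Saturn (typical)": 9.5,
}

def one_way_delay_minutes(distance_au: float) -> float:
    """Return the one-way light-travel time in minutes for a distance in AU."""
    seconds = distance_au * KM_PER_AU / LIGHT_SPEED_KM_S
    return seconds / 60

if __name__ == "__main__":
    for body, au in APPROX_DISTANCE_AU.items():
        print(f"{body}: ~{one_way_delay_minutes(au):.0f} minutes one way")
```

Run as written, the sketch prints roughly 3 and 22 minutes for Mars at its nearest and farthest, and about 80 minutes for Saturn, which is why a rover or probe cannot wait for Earth before reacting.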

How could autonomous weapons change us? As imagined in the Open Letter, they could make it easier for dictators to control their populace, for warlords to murder by ethnicity, or for terrorists to target their attacks. Terrorists already deploy a poor man's autonomous weapon: the suicide bomber. Someone clad in explosives can enter an area and choose the time and place of detonation. Autonomous weapons could bring economy and scale to this threat.

How can we change autonomous weapons? By thinking critically, we can understand and evaluate them; this essay has applied some of KnowledgeContext's ICE-9 questions. Once we understand and evaluate, one possible act is the Open Letter. Another would be a treaty modeled on the Ottawa Treaty, formally the Anti-Personnel Mine Ban Convention, which bans landmines. Civilization can choose its direction.
