The Case Against Lethal Drones and Autonomous Weapons Systems
They were the new rage that soared in popularity with the military: a weapon that could pinpoint its target, reduce unnecessary “collateral damage” such as civilian casualties, fly to hard-to-reach places instead of requiring ground forces, and be piloted remotely, thus protecting the U.S. military personnel involved in this form of combat. The concept was a simple one: take the drone devised for surveillance and equip it with lethal weapons. Israel first used drones for surveillance and intelligence gathering over Palestinian territory, but after 9/11 the U.S. decided to make them lethal, adding weaponry to its Predator drones. Since that time, their use has escalated from war zones like Afghanistan and Iraq to areas outside U.S. war zones, including Somalia, Yemen, Libya, Gaza, Mali, and Niger. At present there is no control on their use by anyone: no international law, no set of standards, no limits or restrictions. Yet there is growing criticism, especially about the significant number of civilian deaths and the destruction of civilian sites such as houses, schools, and hospitals. And the recent attempt on the life of Venezuelan President Nicolás Maduro by two drones shows how individuals, not just state actors, can use this weapon as well.
One reason for the frightening escalation has been the introduction of “signature strikes” by U.S. forces: the target is not necessarily a named combatant or identified terrorist, but someone whose behavior arouses suspicion and who is thus designated a combatant eligible to be targeted for a strike. In Afghanistan and Pakistan, any male between the ages of 20 and 40 can be labeled such a combatant, regardless of personal identification or proof of wartime or terrorist activity. The consequence has been an enormous number of civilian casualties, mostly Muslim, and a boost to recruitment by non-state organizations and terrorist groups bent on revenge for these strikes.
There is also a growing psychological toll on the soldiers recruited to push the buttons that fire from UAVs (Unmanned Aerial Vehicles), often halfway around the world from their targets. Though they operate in a totally safe environment, they can also view the aftereffects of their actions: watching victims being torn apart and suffering horrendous deaths. In time this responsibility creates anxiety, guilt, and stress, and it is partly responsible for the increasing number of suicides among recent veterans.
Groups have formed to raise consciousness about the use and consequences of these UAVs. Noteworthy is the work of Unitarian Universalist minister Rev. Chris Antal within the Interfaith Network on Drone Warfare. This group, representing 21 different religions, has prepared educational material for any group interested in the topic, including a series of five free videos available on YouTube or through its website. The videos (Moral and Safe, The Religious Community and Drone Warfare, Unmanned, National Bird, and Drone) contain highly informative, excellent commentaries from a wide range of academics, military personnel, whistleblowers, religious spokespersons, and civilian survivors. They are well worth seeing in full, and even worth hosting as a screening series at your congregation.
This effort is part of a larger movement to stop killer robots. With the expansion of artificial intelligence technology, some weapons systems have been designed in which the technology itself is programmed to make the decision to target and kill. Within the UN, continual efforts have been made to commit states to retaining meaningful human control over weapons systems and over individual attacks. At the end of 2013, states parties to the Convention on Conventional Weapons (CCW) agreed that the CCW should begin considering questions relating to lethal autonomous weapons systems, and the study continues with another meeting of the Group of Governmental Experts on this topic at the end of August this year.
Armed drones and other autonomous weapons systems with decreasing levels of human control are currently in use and under development by high-tech militaries, including those of the US, China, Israel, South Korea, Russia, and the UK. Several states, the Campaign to Stop Killer Robots, artificial intelligence experts, faith leaders, and Nobel Prize laureates, among others, fundamentally object to permitting machines to determine whom or what to target on the battlefield or in policing, border control, and other circumstances. Such a far-reaching development raises an array of profound ethical, human rights, legal, operational, proliferation, technical, and other concerns.
More than 16 years after the first U.S. drone strike, the armed unmanned aerial vehicle remains key to American counterterrorism operations, and the Trump administration is expanding its use. The administration is dialing up the frequency and global scope of drone strikes, has given military operators greater strike-decision authority (the Obama administration, responding to criticism over mounting civilian casualties, had required White House approval of sensitive strikes), and has expanded the CIA’s role and responsibilities in lethal strike operations (an authority President Obama had also curtailed, along with the rendition and torture of prisoners).
A very helpful overview of recent actions by the Trump Administration is available in a piece by the Stimson Center entitled An Action Plan on Drone Policy: Recommendations for the Trump Administration. Words from this piece are well worth repeating:
Given these and related concerns, such as the rapid spread of drone technology for military and national security purposes around the world, it is important that the United States develop a drone policy that is both practical and comprehensive, and that sets a constructive international precedent for future drone use worldwide.
The same can be said for autonomous weapons systems. There is much work to be done when it comes to trying to Disarm Our Planet.