The term “killer robot” may sound like science fiction to many people, but such machines are closer to reality than we imagine. Killer robots – a nickname for Lethal Autonomous Weapons Systems, or LAWS – are systems able to select and fire on targets without human intervention. Yes, they could also decide when to take a person’s life. Armed drones are, unfortunately, familiar to most of us; but while drones are remotely controlled by a person, LAWS will not be.
If killing a human being is severely disturbing and problematic when it is done by another human being, how much more so when the decision comes from a machine? Is it morally acceptable to delegate decisions about the use of lethal force to fully autonomous weapons? Are we really willing to leave decisions about life and death to computerized systems?
Many question the capacity of robotic systems to acquire moral reasoning and, as a consequence, do not believe in their ability to respond to moral dilemmas.
Even as technologies continue to advance in several fields, it is highly unlikely that machines could be programmed to match human judgment. Human rights law, international law and international humanitarian law are all built around the capacity for moral judgment. In fact, autonomous weapons systems could never be subject to the rules and moral principles central to those juridical instruments. Nor can we dismiss sound and clear mechanisms of accountability; these would be significantly challenged by LAWS.
Human dignity is also at stake here. Killer robots are another step in what many see as a dehumanizing trend in warfare. Killing from a distance and engaging targets as mere objects are being presented by some as a normal evolution in the advancing technology of war, which sometimes resembles – and is becoming familiar through – the “gamification” of violence and war. Are we, as a society, willing to subordinate the value of human life and our ethical standards to the search for economic profits or to the pursuit of ever more advanced technological developments?
I am convinced that this issue needs to be addressed urgently. It is true that we already have more than enough bloody conflicts to worry about, but we risk soon reaching – in fact, we may already be at – a point of no return. Technology advances rapidly, far more so than legal and regulatory systems, which are usually based on social consensus and are often unable to fill in the gaps quickly. We need to act now to prevent killer robots from taking advantage of such a gap.
Ideally, peoples and states should focus on protecting the right to life by finding increasingly effective ways to prevent conflicts and resolve them by peaceful means. The development of killer robots is disturbing because it suggests that we are comfortable with the idea of creating more sophisticated forms of killing each other. If the ethical standards of societies are measured by the way they threaten or enhance the life and dignity of the human person, we are certainly failing.
The International Secretariat of Pax Christi, on behalf of our global network and in collaboration with our Dutch colleagues from PAX – one of the co-founders of the Campaign to Stop Killer Robots – has been promoting an Interfaith Declaration in support of a ban on fully autonomous weapons. The Declaration is a statement from religious leaders, faith groups and faith-based organizations, in which they raise their collective voice to call on all governments to participate in the international debate and to work towards a ban on the development, production and use of such weapons. At the same time, the Declaration is a commitment from endorsing leaders to educate their own constituencies around this sensitive issue.
The Declaration has been signed so far by more than 70 faith leaders of various religious denominations from over 30 countries. From the Christian community, signatories include Archbishop Desmond Tutu, Nobel Peace laureate; the Latin Patriarch of Jerusalem, His Beatitude Fouad Twal; and the Chaldean Patriarch of Iraq, His Beatitude Louis Raphael Sako. Several archbishops and bishops, including former and present presidents of Pax Christi national organizations, and women and men Religious Superiors representing large numbers of active communities around the world, have also signed the Declaration, as have faith leaders from Muslim, Buddhist and Jewish communities. This is an ongoing initiative and the number of signatories keeps growing. Following similar statements from Nobel laureates and scientists, the responses to this initiative show how widely ethical concerns over killer robots are shared within society.
For religious leaders, the ethical concerns arise from core values and principles of their faith traditions regarding human life. For Christians, human life is sacred and the dignity of the human person is the foundation of a moral vision for society. In Buddhist religious life, harmlessness is a core value and the principle of non-harming is absolute. Islam considers all life forms sacred; however, the sanctity of human life is accorded a special place.
The discussion about LAWS is by no means a simple issue, and no simple answers are expected. But the message of religious leaders underlines the need for urgent action within ethical frameworks shared by millions of people on our planet. We can promote further dialogue to approach the more complex aspects of this issue, and we can also promote policies that will pave the way for a pre-emptive ban on killer robots.
It is important to acknowledge that the ethical aspects of this process are not limited to questions regarding the autonomous weapons themselves. Taking action, or failing to take action, around the development, production and use of lethal autonomous weapons systems also has ethical implications. A number of religious leaders, civil society organizations and states are already taking action, but we hope many more will do the same.