Monday, June 10, 2013
UN debates use of killer robots
by Michael Cook
The ethics of killer robots is adding a new subspeciality to bioethics. This week the United Nations Human Rights Council debated the use of lethal autonomous robots (LARs) in Geneva. UN special rapporteur Christof Heyns, a South African legal expert, called for a moratorium while legal and ethical issues are ironed out.
Mr Heyns says that LARs could be as big a step in warfare as gunpowder or nuclear weapons. But the change would be qualitatively different, since machines, not humans, would be deciding whether or not to kill. He also worries that LARs could make war more likely, as nations would not feel inhibited by the fear of wasting the lives of their own soldiers. In a report to the Human Rights Council, he writes: “There is a qualitative difference between reducing the risk that armed conflict poses to those who participate in it, and the situation where one side is no longer a ‘participant’ in armed conflict inasmuch as its combatants are not exposed to any danger. LARs seem to take problems that are present with drones and high-altitude airstrikes to their factual and legal extreme.”

One important issue posed by LARs is the lack of a clear chain of responsibility. Drones are also mechanical killers, but at the moment the decision to kill is taken by a human being. Mr Heyns says: “Their deployment may be unacceptable because no adequate system of legal accountability can be devised.” LARs could also be used by repressive governments to suppress domestic opponents. “Do we want a world in which we can be killed either as combatants or as collateral damage by robots with an algorithm which takes the decision? It’s this issue of diminishing human responsibility that concerns me.”

Human Rights Watch has been campaigning to ban killer robots. Last November it issued a report calling for a pre-emptive ban.