International Law
02 November 2018

Paola Gaeta: Robots and Criminal Responsibility

Robots are part of our daily lives, for instance when we use the self-checkout lane at the grocery store. However, they are rapidly becoming more than the routine mechanical devices programmed to perform repetitive functions to which we are accustomed today. Vehicle manufacturers have begun testing self-driving cars that operate at the push of a button, taking their passengers wherever they want to go. The arms industry is developing similar technology to produce so-called lethal autonomous weapons systems (LAWS) that can find, track and fire on targets without human supervision.

The development of these new weapons raises a host of complex questions. Among the most pressing legal ones is the attribution of criminal responsibility in the case of malfunction. Because LAWS operate autonomously, they could target people and objects in violation of the rules of international humanitarian law. Who should bear criminal responsibility for any resulting war crimes?

Holding the autonomous weapons themselves criminally responsible is problematic: it would require LAWS to be treated like human beings, thereby contesting the anthropocentric foundations of modern criminal law. Machines are not suitable recipients of criminal punishment, mainly because they are not morally responsible agents and cannot ‘understand’ the concept of retributive punishment. Ascribing criminal responsibility to the ‘user’, usually the military commander responsible for engaging the LAWS, is also problematic. In most cases, the commander does not intend to use the autonomous weapon system to commit a war crime; there is only an ‘acceptance of risk’ that the machine may make the wrong targeting decision. The question is then whether this acceptance is sufficient in itself to consider the military commander a war criminal.

These legal issues are not limited to the arms industry but apply equally to self-driving cars. Fatal crashes involving pedestrians have already been reported in the news and raise similar questions about the attribution of criminal responsibility. There is a pressing need to develop an appropriate legal framework, including the law on war crimes, to fill the gaps opened by these new technologies. Yet the crux of the matter is clearly not a legal one. Increasing automation brings many benefits to society. But does robotism, the mindless automation of our lives, risk leading us to ‘insane societies’, as Erich Fromm predicted in The Sane Society (1955)? ‘The danger of the past was that men became slaves. The danger of the future is that men may become robots.’