
Report of COMEST on robotics ethics

A report aiming to raise awareness and promote public consideration and inclusive dialogue on ethical issues concerning the different uses of contemporary robotic technologies in society was finalized by the World Commission on the Ethics of Scientific Knowledge and Technology of UNESCO (COMEST) in September 2017, and publicly released in October 2017.

Robotic technologies blur the boundary between human subjects and technological objects. In doing so, they not only have societal implications that can be ethically evaluated, but they also affect the central categories of ethics: our concepts of agency and responsibility, and our value frameworks. Given the increasing autonomy of robots, the question arises as to who exactly should bear ethical and/or legal responsibility for robot behaviour. This is one of many issues that COMEST addresses in its report.

Robotic technologies have been used for industrial and military purposes since the mid-20th century. Since then, these technologies have been increasingly applied in other areas, such as transportation, health care, education, agriculture, and the home environment. Robotic technologies come in many forms, from rapidly developing autonomous vehicles to robots used in surgery, therapeutic care and elderly care, and as educational tools for children.


Robotics today is increasingly based on artificial intelligence (AI) technology, giving robots human-like abilities in sensing, language, interaction, problem solving, learning, and even creativity. The main feature of such ‘cognitive machines’ is that their decisions are unpredictable: their actions depend on stochastic (random) situations and on experience. This is very different from robots whose behaviour is determined by the programs that control their actions (deterministic robots). The question of accountability for the actions of cognitive robots is therefore crucial.
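To make the distinction concrete, the minimal sketch below (not taken from the COMEST report; all names, thresholds, and parameters are hypothetical) contrasts a deterministic controller, whose output is fully fixed by its program, with a simple learning controller whose behaviour also depends on random exploration and accumulated experience.

```python
import random

def deterministic_controller(obstacle_distance_m: float) -> str:
    """Deterministic robot: the same input always yields the same action,
    so behaviour can be traced directly back to the program."""
    return "stop" if obstacle_distance_m < 0.5 else "advance"

class CognitiveController:
    """Illustrative 'cognitive' robot: chooses actions stochastically and
    updates its preferences from experience, so identical inputs may yield
    different actions at different times."""

    def __init__(self, epsilon: float = 0.1):
        self.epsilon = epsilon                       # exploration rate (hypothetical)
        self.value = {"stop": 0.0, "advance": 0.0}   # learned action values

    def act(self, obstacle_distance_m: float) -> str:
        if random.random() < self.epsilon:           # occasional random exploration
            return random.choice(list(self.value))
        return max(self.value, key=self.value.get)   # otherwise exploit experience

    def learn(self, action: str, reward: float, lr: float = 0.5) -> None:
        # Update the value estimate for the chosen action from observed feedback.
        self.value[action] += lr * (reward - self.value[action])

print(deterministic_controller(0.3))   # always "stop"
agent = CognitiveController()
agent.learn("stop", reward=1.0)
print(agent.act(0.3))                  # usually "stop", but not guaranteed
```

The deterministic controller's decision is fully explained by its code, whereas the learning controller's decision also depends on its random draws and training history, which is precisely what complicates the attribution of responsibility discussed above.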

The rapidly increasing presence of cognitive robots in society poses growing challenges. They affect human behaviour and induce social and cultural change, while also raising issues related to safety, privacy, and human dignity.

In its report, COMEST proposes a technology-based ethical framework to consider recommendations on robotics ethics based on the distinction between deterministic and cognitive robots. COMEST further identifies ethical values and principles that can help set regulations at every level and in a coherent manner, from engineers’ codes of conduct to national laws and international conventions. These relevant ethical principles and values include: (i) human dignity; (ii) the value of autonomy; (iii) the value of privacy; (iv) the “do not harm” principle; (v) the principle of responsibility; (vi) the value of beneficence; and (vii) the value of justice. The principle of human responsibility is the common thread that joins the different values enunciated in the report.

In this regard, COMEST also makes a number of specific recommendations concerning the application of robotic technologies. These recommendations cover a wide variety of areas, from the further development of codes of ethics for roboticists and the need for retraining and retooling of the workforce, to advice against the development and use of autonomous weapons.

