To Prevent Android Abuse and to Protect Humans
The government of South Korea is drawing up a code of ethics to prevent human abuse of robots—and vice versa.
The so-called Robot Ethics Charter will cover standards for robotics users and manufacturers, as well as guidelines on ethical standards to be programmed into robots.
"The move anticipates the day when robots, particularly intelligent service robots, could become a part of daily life as greater technological advancements are made," the ministry said in a statement. A five-member task force that includes futurists and a science-fiction writer began work on the charter last November.
"However, as in every field of science and technology, sensitive areas open up, and it is the specific responsibility of the scientists who work in this field to face this new array of social and ethical problems," said Gianmarco Veruggio, an Italian robotics scientist.
The new charter is part of an effort to establish ground rules for human interaction with robots in the future. "Imagine if some people treat androids as if the machines were their wives," Park Hye-Young of the ministry's robot team told the AFP news agency.
The main focus of the charter appears to be on dealing with social problems, such as human control over robots and humans becoming addicted to robot interaction.
The document will also deal with legal issues, such as the protection of data acquired by robots and establishing clear identification and traceability of the machines.
Technological advances have introduced new models of human-machine interaction that may bring different ethical challenges, Veruggio said. "Think of bio-robotics, of military applications of robotics, of robots in children's rooms," he said.
Laws of Robotics
Familiar to many science-fiction fans, the Three Laws of Robotics were first put forward by the late sci-fi author Isaac Asimov in his 1942 short story "Runaround."
The laws state that a robot may not injure a human or, through inaction, allow a human to come to harm; a robot must obey human orders unless they conflict with the first law; and a robot must protect its own existence as long as doing so does not conflict with the first two laws.