We need robots to have morals. Could Shakespeare and Austen help?

Opinion, by John Mullan. Using great literature to teach ethics to machines is a dangerous game. The classics are a moral minefield. John Mullan is professor of English literature at University College London. Illustration by Thomas Pullin.

When he wrote the stories in I, Robot in the 1940s, Isaac Asimov imagined a world in which robots do all humanity's tedious or unpleasant jobs, but where their powers must be restrained. They are programmed to obey three laws: a robot may not injure a human being or, through inaction, allow a human being to come to harm; a robot must obey a human being, except where this would conflict with the first law; and a robot must protect itself, unless this conflicts with either of the first two laws. Unfortunately, scientists soon create a robot (Herbie) that understands the concept of "mental injury". Like a character in a Thomas Hardy novel or an Ibsen play, the robot soon finds itself in a situation where truthfully answering…

