Artificial Intelligence Has a Morality Problem

If you ever get hit by a self-driving car, it may be my fault. Not only my fault, but I may have contributed to your demise. I’m sorry about that. My contribution to your unfortunate death arrived through playing a round or two at MIT’s Moral Machine, which “crowdsources” opinions on how self-driving cars should respond to possible moral dilemmas. The Moral Machine is an interactive version of the Trolley Problem, a thought experiment first introduced in 1967 and designed to tease out moral intuitions. The Trolley Problem begins with a runaway trolley racing toward you. Five people lie tied to the tracks. You have access to a lever that will route the trolley to another track, but, just before…

Link to Full Article: Artificial Intelligence Has a Morality Problem
