Ethics in a self-driving car: whom should it opt to save in an accident?


Imagine an autonomous vehicle driving down a two-lane highway. Suddenly, its brakes fail. If it keeps moving forward, it will hit two men crossing the road. If it swerves out of the lane, it will hit a few dogs. Whom should the vehicle choose to save?

Imagine another self-driving vehicle, this one carrying a man, a woman, a child and a dog. Ahead, a pregnant woman, an elderly woman, a robber, a girl and a poor person are crossing the road. Its brakes, too, have failed; if the vehicle swerves, it will crash into a barricade, likely killing its passengers. What choice should it make?

These two are among 13 scenarios presented as part of The Moral Machine Experiment, an online “game” that engaged 2.3 million people with difficult binary choices — an adaptation of the well-known “trolley problem” that has long been discussed among ethicists and psychologists.