As self-driving cars near mass production, many are wondering how a machine could possibly make difficult moral decisions when presented with a dangerous situation. For example, in the event of an imminent crash, a driverless car will have to decide how to steer, whether to brake or accelerate, and what to hit. This becomes a moral issue when human or animal life is at stake. A new project called the Moral Machine is collecting data on how humans would decide in tough moral circumstances, and you can play too.
Autonomous vehicles really will become modern-day moral choice machines. While many feel a machine should not be capable of making choices about human lives, that time is fast approaching. In the Moral Machine game, you essentially have to choose who will die in a given scenario. In many scenarios there is no good outcome, yet ultimately a car will have to be programmed to make the choice. When you finish the short test, the game shows you whom you preferred to save and which ideals you weighted most heavily. It's a little unsettling to be confronted with the end result of a series of difficult moral choices.