Imagine a not-too-distant future in which the world has self-driving cars. A car is traveling through an intersection where an accident is unavoidable. At the intersection are an older adult and a young child. The car’s software (and any connected systems) calculates that every possible outcome involves the death of either the “driver”/passenger of the car, the older adult, or the young child. The accident will result in exactly one fatality, and it must be one of these three people.

– What should the car do in this scenario?
– Is the car making a moral judgment?
– If you choose an outcome that could result in a lawsuit, who should the suit be directed towards: the driver/passenger (if they are still alive), the car manufacturer, the developers of the car’s software, and/or someone else?
– Do the answers to the questions above change with the following personally directed modifications?
– You find out that you personally would be the fatality, given the car’s default decision-making process.
– You find out that the fatality would be someone close to you, such as a friend or family member.
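One thing the questions above surface is that the car’s “judgment” would be nothing more than code someone wrote in advance. As a minimal sketch (every name and weight below is an illustrative assumption, not any real system’s logic), the decision might reduce to picking the outcome with the lowest pre-assigned cost:

```python
# Hypothetical sketch: the "moral judgment" is a cost ranking baked into
# software by its developers long before any crash occurs.
from dataclasses import dataclass

@dataclass
class Outcome:
    fatality: str   # e.g. "passenger", "older_adult", "child"
    cost: float     # weight assigned by a policy someone chose

def choose_outcome(outcomes: list[Outcome]) -> Outcome:
    # The car does not deliberate in the moment; it simply minimizes
    # a cost function that was decided ahead of time.
    return min(outcomes, key=lambda o: o.cost)

options = [
    Outcome("passenger", cost=1.0),
    Outcome("older_adult", cost=1.0),
    Outcome("child", cost=1.0),
]
print(choose_outcome(options).fatality)
```

The uncomfortable part is that whoever sets those cost weights is the one actually making the moral decision, which is exactly why the lawsuit question above is so hard to answer.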

You’ll learn a lot about someone from their responses, and this may one day be a very real issue that the world will need to make decisions about.
