I read elsewhere about another interesting conundrum for the designers of driverless cars. As humans, we make not only judgments but moral judgments on a daily basis, sometimes within a split second.
Imagine a man driving a car with his daughter in the passenger seat. Suppose he suddenly sees a pedestrian crossing the street, and he is driving too fast to stop in time. He has a choice: run over the pedestrian and kill him, or swerve off the road, collide head-on with a truck, and die along with his daughter.
It wouldn't be surprising if the driver decides to run over the pedestrian to save his daughter's life.
But what if he were actually alone in the car and the pedestrian was his daughter? Would he run her over to save his own life? Probably not.
That is a judgment humans sometimes have to make, with very serious consequences. Now the onus falls on the designers of the driverless car. What would the rule be? Are the occupants always the priority? In the same situation with no driver, whom would the car save: the occupant or the pedestrian?