Totally off topic, but brought to mind by your example: I understand this is a quandary in programming autonomous cars. As drivers, we make split-second safety decisions based on what advantages and protects us. Should the automation be programmed likewise, or should it take us off a cliff to avoid hitting a pedestrian, etc.?
Tough question for sure. As far as I see it, if the car is programmed for self-preservation first and to protect life second (just like we do), then an injured passenger or a struck pedestrian is truly an accident, which limits the company's liability. If not, then the company, programmer, or vehicle has "made a choice" and seems liable. But what do I know?!
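Just to make that ordering concrete, here's a minimal sketch of what "self-preservation first, protect life second" might look like as code. Everything in it (the maneuver names, the risk numbers, the function) is a made-up illustration, not anything an actual manufacturer uses:

```python
# Hypothetical sketch of a lexicographic safety policy:
# 1) minimize risk to occupants (self-preservation first),
# 2) then minimize risk to others (protect life second).
# All names and numbers are invented for illustration.

def choose_maneuver(options):
    """Pick the option with the lowest occupant risk,
    breaking ties by lowest bystander risk."""
    return min(options, key=lambda o: (o["occupant_risk"], o["bystander_risk"]))

if __name__ == "__main__":
    options = [
        {"name": "swerve_off_cliff", "occupant_risk": 0.9, "bystander_risk": 0.0},
        {"name": "hard_brake",       "occupant_risk": 0.1, "bystander_risk": 0.3},
    ]
    print(choose_maneuver(options)["name"])  # -> "hard_brake"
```

The liability point falls out of that one line: swap the order of the two risks in the tuple and the car "chooses" the cliff instead, which is exactly the deliberate choice that would seem to put the decision on the company rather than on chance.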