I had this discussion a while ago with someone. Self-driving cars bring up an interesting dilemma.
Should a car favor the life of the owner or the number of lives saved? If the car can take an action that puts the owner at extreme risk but would save two or more other people, should it take that risk? And who would buy a car that would make that decision for them?
Issues like these will keep coming up as such systems become more commonplace.
Would a car manufacturer even be allowed to program a car to favor saving the owner without running into legal trouble?