Do you allow that situations exist (in the billions of miles driven by the world's population every day) in which a driver, whether human or computer, faces a choice between different possible actions: one sacrifices the passenger's life but prevents harm to people in nearby cars, another harms the nearby cars but protects the passenger, and one of these actions could be 'slam on the brakes'? And that your car's software is therefore making a trade-off between your life and someone else's?
Even if that trade-off emerges from 100 different possible actions, do you allow that situations exist in which someone inevitably gets harmed, and the different decisions distribute that harm either to you or to the passengers of other cars?
Whatever that situation may be, do you allow that it can exist, even if it's extremely rare, and that someone has to make a decision on how to set the computer's priorities in such an edge case?
That's the point I'm making. The specific example of such a situation isn't very important (unless you don't acknowledge that ANY situation with such a trade-off decision could ever occur in the over 4,000 billion km driven every year, which I'd say is pretty unlikely).
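To make that concrete: "setting the computer's priorities" could, in the simplest possible terms, look something like the sketch below. This is purely hypothetical pseudocode for illustration; the harm estimates, the candidate actions, and the `passenger_weight` parameter are all invented and don't reflect any real vehicle's software.

```python
# Hypothetical sketch of what "setting the computer's priorities" could mean.
# All names, actions, and numbers here are invented for illustration.

def choose_action(candidate_actions, passenger_weight=1.0):
    """Pick the action that minimizes weighted expected harm.

    candidate_actions: list of (name, harm_to_passenger, harm_to_others)
    passenger_weight:  how heavily the passenger's harm counts relative to
                       harm to people in other cars. This single number IS
                       the trade-off someone has to decide on in advance.
    """
    def weighted_harm(action):
        _, harm_self, harm_others = action
        return passenger_weight * harm_self + harm_others

    return min(candidate_actions, key=weighted_harm)

# An edge case where every option harms someone; the chosen action
# depends entirely on how passenger_weight was configured.
actions = [
    ("slam on the brakes", 0.9, 0.1),
    ("swerve left",        0.2, 0.7),
    ("maintain course",    0.5, 0.5),
]

print(choose_action(actions, passenger_weight=1.0))  # all harm counted equally
print(choose_action(actions, passenger_weight=0.2))  # passenger's harm discounted
```

Even with 100 candidate actions instead of three, the structure is the same: some weighting, set by someone in advance, decides whose harm counts for more.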
I would say that in such a hypothetical situation, any decision taken by the computer would probably be miles better than one made by a human driver. Firstly, the computer will have a much better overview of what's going on than any human would. Secondly, a human is most likely to just panic and slam on the brakes as hard as possible.