
Yes, but we can also say what a responsible self-driving car should be detecting/recording/deciding at that point, paraphrasing:

- unknown object detected ahead, collision possible to likely

- dangerous object (car) detected approaching from the rear, with a likely trajectory intercepting this vehicle (people easily forget that multitasking/sensing like this is something an autonomous car should be able to do better than a human, who can only make relatively intermittent serial scans of the environment)

- initiate a partial slowdown and possibly a path change: make some decision weighing the two detected likely collision obstacles (a toy sketch of what I mean follows below).
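To make that concrete, here's a toy sketch of the kind of weighing I mean. Everything in it (Threat, choose_deceleration, the urgency heuristic) is invented for illustration, not what any real AV stack does:

    from dataclasses import dataclass

    @dataclass
    class Threat:
        """A detected obstacle: estimated time-to-collision (s)
        and a rough impact-severity estimate (0..1)."""
        name: str
        time_to_collision_s: float
        severity: float

    def choose_deceleration(ahead: Threat, behind: Threat,
                            max_brake: float = 1.0) -> float:
        """Pick a brake fraction (0 = coast, 1 = full brake).

        Heuristic: urgency of each threat ~ severity / time-to-collision.
        Braking harder helps against the forward threat but worsens the
        rear one, so brake in proportion to the forward threat's share
        of total urgency.
        """
        urgency_ahead = ahead.severity / max(ahead.time_to_collision_s, 0.1)
        urgency_behind = behind.severity / max(behind.time_to_collision_s, 0.1)
        share = urgency_ahead / (urgency_ahead + urgency_behind)
        return max_brake * share

    # Example: obstacle ahead in 2 s, car closing from behind in 3 s.
    ahead = Threat("unknown object ahead", 2.0, 0.9)
    behind = Threat("car approaching from rear", 3.0, 0.5)
    print(f"brake fraction: {choose_deceleration(ahead, behind):.2f}")  # ~0.73

The point isn't the numbers, it's that the decision is a continuous trade-off, not a binary "brake or don't".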

You do not have to slam on the brakes and get rear-ended, but speed is a major factor in fatal crashes, so even if all you can do is shed 30% of your momentum by the time of impact while still avoiding the rear-end collision, that's a responsible decision. (Kinetic energy scales with the square of speed, so a 30% speed reduction roughly halves the impact energy.)
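Rough arithmetic on that claim (same mass, so shedding 30% of momentum means shedding 30% of speed):

    v0 = 1.0                # initial speed, normalized
    v1 = 0.7 * v0           # after shedding 30% of momentum at fixed mass
    # kinetic energy ~ v^2
    print(f"remaining impact energy: {v1**2 / v0**2:.0%}")  # -> 49%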

And we can accept that sometimes cars are put in potential no-win situations (collision with two incoming objects unavoidable).

What's a negligent/borderline insane decision? Putting a one-second hard-coded delay in there, because otherwise we'd have to admit we don't have self-driving cars, since we can't get the software to move the vehicle reliably if it's trying to avoid its own predicted collisions.

(Another issue is an inability to maintain object identity/history, and its interaction with trajectory prediction... IMO it is negligent to put an autonomous car on a public road that displays that behaviour, but that's just me. A toy sketch of why identity matters is below.)
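A minimal sketch of why losing identity breaks trajectory prediction; Track and everything in it is invented for illustration, not any real perception stack:

    from collections import deque

    class Track:
        """Minimal persistent track: keeps a short position history per
        object ID so velocity (and hence trajectory) can be estimated."""
        def __init__(self, obj_id: str, maxlen: int = 10):
            self.obj_id = obj_id
            self.history = deque(maxlen=maxlen)  # (t, x) samples

        def observe(self, t: float, x: float) -> None:
            self.history.append((t, x))

        def velocity(self) -> float | None:
            # Needs at least two observations of the *same* object.
            # If identity is dropped between frames, history resets and
            # the prediction degenerates to "stationary unknown object".
            if len(self.history) < 2:
                return None
            (t0, x0), (t1, x1) = self.history[0], self.history[-1]
            return (x1 - x0) / (t1 - t0)

    track = Track("car_42")
    track.observe(0.0, 0.0)
    track.observe(0.5, 6.0)   # moved 6 m in 0.5 s
    print(track.velocity())   # 12.0 m/s; without a stable ID, unknowable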


