The interesting part is that the car stopped instead of moving out of the way.

A human has an instinct to avoid collisions that is evidently absent from their model.

The question is how long the long tail is, and how hard it is to instill common-sense behaviors like crash avoidance in unusual scenarios.

Humans benefit from a few hundred million years of training on “run away when thing is fast approaching”, even if the “thing” is totally outside our training set.

Gradient descent doesn’t really have fear or self-preservation, and adding that in runs into Asimov’s laws immediately. Interesting stuff.

Also - there are better and worse ways to get seriously injured. Something about the total lack of autonomy in current self-driving cars is both horrifying and infantilizing. Sitting in the middle of an intersection with oncoming traffic and having absolutely no control over your own imminent death or injury would certainly make me re-evaluate categories of risk.

With an Uber driver I can yell “GO!!”
