Hacker News

Personally, I am constantly scanning for people near the edge of the road and anticipating their movements. I bet most of us have had the experience where someone stepped out unexpectedly but we had already reacted because we could tell they were not paying attention. Good luck programming that.



Yeah, there are a lot of subtleties that are hard to program; that's why it must be extremely effective on the things that _can_ be programmed: quick braking, quick steering, preferring lanes farther away from nearby pedestrians, beeping when entering a non-illuminated zone, and hundreds of other details.
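The rule-based behaviors listed above can be sketched in a few lines. This is a toy illustration, not any vendor's actual stack; the thresholds, sensor fields, and function names are all invented for the example. The braking rule uses the standard constant-deceleration stopping distance, v² / (2a).

```python
# Minimal sketch of two programmable safety rules; all fields, names,
# and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Lane:
    id: int
    min_pedestrian_distance_m: float  # distance to the nearest pedestrian

def pick_lane(lanes):
    """Prefer the lane whose nearest pedestrian is farthest away."""
    return max(lanes, key=lambda lane: lane.min_pedestrian_distance_m)

def should_emergency_brake(obstacle_distance_m, speed_mps, max_decel_mps2=6.0):
    """Brake hard when we can no longer stop within the available distance.

    Stopping distance at constant deceleration a is v^2 / (2a).
    """
    stopping_distance_m = speed_mps ** 2 / (2 * max_decel_mps2)
    return stopping_distance_m >= obstacle_distance_m

lanes = [Lane(1, 2.5), Lane(2, 8.0)]
assert pick_lane(lanes).id == 2
# At 20 m/s with 6 m/s^2 deceleration, stopping takes ~33 m:
assert should_emergency_brake(obstacle_distance_m=20, speed_mps=20)
assert not should_emergency_brake(obstacle_distance_m=60, speed_mps=20)
```

Of course, the hard part being debated in this thread is not encoding such rules but deciding when they should fire, given noisy perception.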


Quick braking is the one thing you would have thought they could get right for sure, and they didn't manage even that on this occasion.


Quick braking can also be dangerous. Perhaps even LiDAR has to deal with false positives every now and then.


Then it shouldn't be deployed. I have had several occasions where I overlooked somebody (luckily without consequences), but never one where I falsely registered someone or something in a way that would warrant braking.


I don't think it will work. Instead, the failing effort will serve as cover to add more and more rules in an attempt to hide the obvious while blaming the victims for breaking those rules.

https://www.youtube.com/watch?v=UEIn8GJIg0E


Beeps are annoying. Better to artificially increase the engine sound.


I actually think this is a fairly trainable attribute. Given that Tesla already does this kind of object detection and classification, and records many miles of human driving, it wouldn't be surprising if this turned out to be a learnable trait.
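To make the "learnable trait" idea concrete, here is a toy sketch of learning "pedestrian is paying attention" from pose-derived features with plain logistic regression. The features, labels, and data are entirely invented for illustration; a real system would train on camera-derived pose estimates across many recorded miles, not four hand-written samples.

```python
# Toy sketch: learn "pedestrian is attentive" from invented pose features.
# Plain logistic regression via stochastic gradient descent, stdlib only.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=500):
    """Fit logistic-regression weights and bias by SGD."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Invented features: [head_facing_traffic, looking_at_phone]
X = [[1, 0], [1, 1], [0, 1], [0, 0]]
y = [1, 0, 0, 0]  # attentive only when facing traffic and not on the phone

w, b = train(X, y)

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5
```

The point is only that "is this person paying attention?" reduces to a supervised classification problem once you have labeled examples; whether the available labels and features are good enough is exactly the corner-case objection raised below.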


It's the corner cases... like Halloween, or body posture, or crossing behind a semi-transparent asymmetric object.


Fair point. This is far less likely.





