Hacker News

"always assume that buses are out to get them and avoid them at all cost"

This is a case where perhaps the computers have too much imagination. We actually tell human drivers to drive that way, but as humans we all know we don't really mean it: we don't need to worry about the driver in front of us suddenly slamming on their brakes, drifting 180 degrees, and driving at us at full speed. Tell a computer to assume too much malice and the car will refuse to move at all, because it's easy to specify a search algorithm that will find that worst-case outcome.

We have to specify the exact level of malice the computer can reasonably expect, which is much harder. And by necessity it will still, at times, be an underestimate.
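A toy sketch of the point (all action names, payoffs, and probabilities here are invented for illustration, not any real planner): treat "malice" as a dial that interpolates between scoring the ego car's actions against an assumed benign behavior distribution and scoring them against the pure worst case. At full malice, waiting dominates everything.

```python
# payoff[ego_action][bus_action] -> utility for the ego car (made-up numbers)
payoff = {
    "go":    {"yields": 10, "holds": -2, "rams": -100},
    "creep": {"yields": 4,  "holds": 1,  "rams": -40},
    "wait":  {"yields": 0,  "holds": 0,  "rams": 0},
}

# assumed prior over the bus's behavior when it is NOT out to get us
benign = {"yields": 0.6, "holds": 0.39, "rams": 0.01}

def score(ego_action: str, malice: float) -> float:
    """Blend expected utility under the benign prior (malice=0)
    with the minimax worst case (malice=1)."""
    utils = payoff[ego_action]
    expected = sum(p * utils[b] for b, p in benign.items())
    worst = min(utils.values())
    return (1 - malice) * expected + malice * worst

def best_action(malice: float) -> str:
    return max(payoff, key=lambda a: score(a, malice))

print(best_action(0.0))   # benign assumption: just go
print(best_action(1.0))   # full malice: the car refuses to move
```

Even a small amount of assumed malice changes behavior (around `malice=0.05`, creeping already beats going in this toy), which is the calibration problem the comment describes: pick the dial too high and the car never moves, too low and it is reckless.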




You make an excellent point. However, as I've recently been up to my hips in RNNs, I wonder: if we can figure out how to score encounters, can the car learn the level of malice to expect? Can it learn it down to, perhaps, the level of individual drivers' shifts? My daughter took the 54 to De Anza community college and learned which drivers were OK, which were mean, and which were indifferent. Would regular exposure of the car to the bus at different times of day let it figure that out? Could we start with an expected level of malice and tune it? Fun question to think about.
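The "start with an expected level of malice and tune it" idea can be sketched very simply (all names and constants here are hypothetical): keep a per-context estimate, keyed by something like route and hour, initialized to a cautious prior and nudged toward each new encounter's score, exponential-moving-average style.

```python
from collections import defaultdict

PRIOR = 0.5   # start cautious: assume moderate malice until we have data
ALPHA = 0.2   # learning rate: how fast observations override the prior

# expected-malice estimate per (route, hour) context
estimates = defaultdict(lambda: PRIOR)

def observe(context: tuple, malice_score: float) -> float:
    """Blend a new encounter's malice score (0..1) into this
    context's running estimate."""
    est = estimates[context]
    estimates[context] = (1 - ALPHA) * est + ALPHA * malice_score
    return estimates[context]

# Repeated mild encounters with the morning route-54 bus pull that
# context's estimate down from the cautious prior, while unseen
# contexts keep the prior.
for _ in range(20):
    observe(("route-54", 8), 0.1)
```

This is the crude, non-learned version of what the comment imagines an RNN doing; the interesting part is exactly what the comment flags as hard, namely how to score an encounter in the first place.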



