I'm not convinced that the distinction between "situations requiring thinking" and "boring situations" is that strong. You seem to be implying that self-driving cars will probably be worse than humans at things like negotiating city traffic (where this accident apparently occurred), but is there any evidence of that?

It's true that most deaths and serious accidents occur on highways, where human factors like boredom and sleepiness are probably significant. But humans also have a heck of a lot of fender benders like this incident, and many of the ones I've seen weren't situations where "a lot of thinking" was required. People text and rear-end the car in front of them, or misjudge how much room they have to switch lanes between two cars, or make similar mistakes. I don't see why these situations would be uniquely difficult for AI; in fact, they seem like exactly the kind of problem computers are well suited to.



