
Humans can only look in one direction, and only from inside the car with their view obstructed, and they're only paying attention sometimes.

Check out this article; it's easy to never see a bike you're on a collision course with: https://singletrackworld.com/2018/01/collision-course-why-th...




We're not talking about complicated scenarios with multiple moving actors. Tesla's autopilot cannot even do something as basic as detect stationary obstacles that are directly in front of the car. It will crash into barriers even if the highway is completely devoid of other cars.

You may consider humans bad drivers, but Tesla's autopilot is even worse than that:

It can't even look in one direction!


I'm talking about the pitfalls of human perception, and the low-hanging fruit of ways that self-driving systems can potentially outperform humans.

I'm not claiming Tesla's system is currently better than a human, just that there is plenty of potential for a machine to outperform humans perceptually. As it is, Tesla's system isn't exactly the gold standard.


>Humans can only look in one direction...

Last time I checked, I could move my eyes up and down and side to side. I could also rotate my whole head, likewise up and down and side to side.

And I am a human being.





