
Certification is not a 100% solution. We can never be 100% sure that a neural network will produce the right result, just as we can never be 100% sure that hand-written software is free of bugs; certification and auditing can't guarantee that either.

Even with the most stringent software development and testing procedures in the world, NASA is unable to produce bug-free software for their spacecraft (which is why they have procedures to recover from bugs and patch software).

All certification really says is: no bugs were found during testing; based on the amount of testing and the quality of the code and the testing procedures, the probability of a fatal bug is x% (where x is a really low number with lots of zeros); and we deem x% to be an acceptable risk. In the case of autopilots, we only really need x% to be below the failure rate of regular human driving.
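As a back-of-the-envelope sketch of that comparison (the numbers are the rough figures cited around the time of the crash, used purely for illustration, not authoritative data):

    # Illustrative only: rough figures cited around the 2016 crash.
    human_fatal_rate = 1 / 94_000_000   # ~1 US fatality per 94M vehicle miles
    autopilot_miles = 130_000_000       # Autopilot miles Tesla claimed at the time
    autopilot_fatalities = 1            # the single known fatality

    autopilot_rate = autopilot_fatalities / autopilot_miles
    print(f"human driving: {human_fatal_rate:.2e} fatalities per mile")
    print(f"autopilot:     {autopilot_rate:.2e} fatalities per mile")
    # One data point gives no statistical confidence either way;
    # certification is about bounding x% with evidence, not proving zero.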

No software change could have saved that driver. Tesla's Autopilot is a Level 2 system (adaptive cruise control with lane following). It's not within a Level 2 system's job description to avoid every possible crash; it's the driver's job to stay alert and be ready to take control at all times. Tesla's Autopilot doesn't even use neural networks outside of the camera sensor.

To be a Level 3 autopilot that could be expected to handle this sort of incident, Tesla would need far more sensors and much smarter software, so the car would actually have a hope of detecting that one sensor's data was faulty and still choosing the correct action. (The truck was only within view of the camera; the radar is calibrated to detect objects only at bumper height.)
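A toy sketch of what cross-checking overlapping sensors looks like (a 2-of-3 vote; purely illustrative, not Tesla's actual architecture, and the sensor names are hypothetical placeholders):

    from dataclasses import dataclass

    @dataclass
    class Detection:
        obstacle_ahead: bool

    def fuse(camera: Detection, radar: Detection, lidar: Detection) -> bool:
        """Toy 2-of-3 vote across sensors with overlapping coverage,
        so one faulty or blind sensor can be outvoted."""
        votes = [s.obstacle_ahead for s in (camera, radar, lidar)]
        return sum(votes) >= 2  # act only when a majority agrees

    # In the 2016 scenario only the camera had the truck in view, so a
    # majority vote would still have missed it:
    print(fuse(Detection(True), Detection(False), Detection(False)))  # False

The point the vote makes concrete: redundancy only helps when the sensors' fields of view actually overlap, which is why a Level 3 system needs more sensors, not just smarter software.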



