
I agree with the better automation line.

If a human is pulling back on the control that says "plane needs to go UP", believe what they're requesting.




The AF flight that was lost over the Atlantic years back (Air France 447, in 2009) did just that: the pilots believed the plane's automation was in a state that would not allow any input to result in unrecoverable flight, so they held the stick back and stalled it all the way into the ocean.

The problem here is nearly identical to the problem with self-driving cars. The automation is capable of handling 95% of situations, but the boundary of the 5% it can't handle is not clear to either the operators or the software itself.

I firmly believe that both car and air travel need to be either 100% automated, or 100% under the control of the operators, with well-defined areas where the automation will ASSIST but always yield to the well-trained (pilots) or poorly-trained (car drivers) human when requested. A human operator can think outside the box, and is not limited to the situations someone conceived of and wrote code for.
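As a rough illustration of the "assist but always yield" idea, here's a minimal sketch in Python (all names and thresholds are hypothetical, not any real avionics or vehicle API) of a control arbiter that lets the automation shape the command but hands full authority back the moment the operator makes a deliberate input:

    # Minimal sketch of an "assist but always yield" control arbiter.
    # All names and thresholds are hypothetical illustrations, not any
    # real avionics or vehicle API.
    from dataclasses import dataclass

    OVERRIDE_THRESHOLD = 0.05  # deadband for "deliberate" input (normalized units)

    @dataclass
    class ControlInputs:
        human_command: float       # operator's stick/wheel input, -1.0 .. 1.0
        automation_command: float  # what the automation wants, -1.0 .. 1.0

    def arbitrate(inputs: ControlInputs) -> float:
        """Return the command actually sent to the actuators.

        Rule: the automation may assist, but any deliberate human
        input wins outright -- the human is never fought or filtered.
        """
        if abs(inputs.human_command) > OVERRIDE_THRESHOLD:
            # The operator is actively commanding: yield completely.
            return inputs.human_command
        # No deliberate human input: let the automation assist.
        return inputs.automation_command

    # Example: the pilot commands hard nose-up while the automation
    # wants nose-down; the pilot's request is honored.
    print(arbitrate(ControlInputs(human_command=0.8, automation_command=-0.3)))  # -> 0.8

Of course, the hard design question is then deciding what counts as "deliberate" input, which is exactly where the 95%/5% ambiguity creeps back in.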


That sounds more like an issue of not correctly communicating the detected problem to the humans.

In any event, a situation where the humans misread the data and do illogical things as a result will always lead to failures, until the point where we no longer allow humans to be pilots/operators at all.


The cockpit voice recording had a very loud stall alarm going in the background. The human brain is absolutely amazing in its ability to tune out signals.





