
> putting pilots into a situation where they can't command the airplane even if they want to is a bit extreme.

I agree; this is the main issue. In both cases, if the pilots had had a clear option to simply switch off MCAS, we would not have lost over 500 lives.




Unlikely. In the case of the first crash, they didn't even get as far as switching off the stab trim (they were flipping through the flight manual to find it as the plane crashed). A second switch would have done nothing if they couldn't remember how to use the first one. Yet another switch in a cockpit (which already has hundreds of them) would also have been yet another way to create an emergency by having a switch accidentally flipped the wrong way.

In an emergency, it has repeatedly been shown that pilots often forget complicated or rarely practiced recovery procedures. We need to stop relying on humans to perform at a superhuman level in these situations; better automation is the only way out.


On the other hand, there are many who believe that automation is what got us into this. The more we automate, the more we depend on it, which leads to problems when exceptional situations occur. [1]

Maybe the 737 MAX is just too complex and relies too much on automation to fly correctly? I'm reminded of Qantas flight QF32, where the pilots were nearly overwhelmed by the endless stream of errors and checklists they had to work through. [2]

[1] https://www.youtube.com/watch?v=lIusD6Z-3cU

[2] https://www.news.com.au/travel/travel-advice/flights/inside-...


Very good points.

And in the case of MCAS, the pilots were physically unable to exert the effort needed to overcome the automated trim settings. (Posted above):

https://youtube.com/watch?v=aoNOVlxJmow


Better automation like MCAS?


I agree with the better automation line.

If a human is pulling back on the thing that says "plane needs to go UP", believe what they're requesting.


The AF flight that was lost over the Atlantic years back (10 years now, maybe?) did just that: the pilots believed the plane's automation was in a state that would not allow any of their inputs to result in unrecoverable flight... so they stalled it all the way into the ocean.

The problem here is nearly identical to the problem with self-driving cars. The automation is capable of handling 95% of situations, but the 5% it can't handle is not clearly delineated, either to the operators or to the software itself.

I firmly believe that both car and air travel need to be either 100% automated, or 100% under the control of the operators, with well-defined areas where the automation systems will ASSIST but always yield to the well-trained (pilots) or poorly trained (car drivers) when requested. A human operator can think outside the box and is not limited to what code has been conceived and written.
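The "assist but always yield" idea above can be sketched in a few lines. This is a hypothetical toy model, not real avionics code; the function name, the [-1, 1] normalization, and the override threshold are all illustrative assumptions:

```python
# Toy sketch of "automation assists, but yields to the operator".
# All names and thresholds are hypothetical, for illustration only.

def blended_command(operator_input: float, automation_input: float,
                    override_threshold: float = 0.2) -> float:
    """Return the command sent to the actuator.

    Inputs are normalized to [-1, 1]. If the operator clearly pushes
    against the automation, the automation yields entirely; otherwise
    the automation may assist (here: a simple clamped sum).
    """
    opposing = operator_input * automation_input < 0
    if opposing and abs(operator_input) > override_threshold:
        return operator_input  # human wins on clear disagreement
    return max(-1.0, min(1.0, operator_input + automation_input))
```

The design point is that disagreement is resolved in the human's favor by construction, rather than requiring the human to remember a separate cutout procedure.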


That sounds more like an issue of not correctly communicating the detected problem to the humans.

In any event, a situation where the humans aren't processing the data correctly and are doing illogical things as a result will always lead to failure outcomes, until the point where we no longer allow humans to be pilots/operators at all.


The cockpit recording had a very loud stall alarm going in the background. The human brain is absolutely amazing in its ability to tune out signals.


That first crash is a good example of multiple minor problems joining together to create a disaster.

1. "they didn't even get as far as switching off the stab trim" means they were incompetent.

2. MCAS, obviously, was a bad design.

3. There was a physical issue with the sensor, possibly due to a bird strike.

Fix any one of those, and the flight wouldn't have crashed.


>means they were incompetent //

I thought a central part of this was that, according to Boeing, there was no need to train on this particular aircraft because it was operationally so close to the 737NG, but that in fact the correct response in this particular situation is vastly different. So yes, kinda "incompetence", but more "ignorance due to Boeing's assertion that training wasn't needed"; i.e., the blame should not rest primarily on the pilots(?)

I'm ignorant on the details, but this sort of article seems convincing:

> It’s probably this counterintuitive characteristic, which goes against what has been trained many times in the simulator for unwanted autopilot trim or manual trim runaway, which has confused the pilots of JT610. They learned that holding against the trim stopped the nose down, and then they could take action, like counter-trimming or outright CUTOUT the trim servo. But it didn’t. After a 10 second trim to a 2.5° nose down stabilizer position, the trimming started again despite the Pilots pulling against it. The faulty high AOA signal was still present.

> How should they know that pulling on the Yoke didn’t stop the trim? It was described nowhere; neither in the aircraft’s manual, the AFM, nor in the Pilot’s manual, the FCOM. This has created strong reactions from airlines with the 737 MAX on the flight line and their Pilots. They have learned the NG and the MAX flies the same. They fly them interchangeably during the week.

> (https://leehamnews.com/2018/11/14/boeings-automatic-trim-for...) //
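The counterintuitive behavior the article describes, pulling the yoke no longer stopping the automatic trim, can be sketched as a toy model. This is a hypothetical simplification for illustration, not Boeing's actual logic; the function and parameter names are invented:

```python
# Hypothetical, simplified model of the behavior described in the quote.
# Names are illustrative; this is not the actual MCAS implementation.

def auto_trim_runs(aoa_signal_high: bool, pilot_pulling_back: bool,
                   mcas_active: bool) -> bool:
    """Whether automatic nose-down trim runs in this simplified model."""
    if not aoa_signal_high:
        return False
    if not mcas_active:
        # The behavior pilots trained on: pulling the yoke against the
        # trim trips the column cutout switch and stops the trimming.
        return not pilot_pulling_back
    # MCAS bypasses the column cutout: trimming repeats while the
    # (faulty) AOA signal stays high, even with the pilots pulling back.
    # Only the stab trim CUTOUT switches stop it.
    return True
```

In this model, the trained response (holding against the trim) works only when `mcas_active` is false, which matches the quote's point: nothing in the manuals told the pilots that the rule they had trained on no longer applied.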



