Guys, are you really saying that you think you can certify every response of a NN for every possible input?
Especially for NNs as complicated as autopilot systems?
Are you aware that if that were true, you would be the richest people in the world? Everyone would come to you to write a perfect algorithm to drive a car.
But even in safety-critical tasks I have seen NNs and MDPs used to derive a good-enough solution; training is then turned off and the frozen system is certified (e.g. by running it against a large gold-standard collection of test cases).
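To make that "freeze, then certify" workflow concrete, here is a minimal sketch. The model, the suite, and the pass-rate threshold are all hypothetical stand-ins I made up for illustration; a real certification run would use a trained network with frozen weights and a large curated test collection.

```python
def frozen_model(x):
    """Stand-in for a trained, frozen model: classify by the sign of a
    fixed linear score. No weight updates happen after this point."""
    w, b = [0.8, -0.3], 0.1  # weights frozen at certification time
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else 0

# Gold-standard test collection: (input, expected_label) pairs.
# In a real certification run this would be a large curated suite.
GOLD_SUITE = [
    ([1.0, 0.0], 1),
    ([0.0, 1.0], 0),
    ([0.5, 0.5], 1),
    ([-1.0, 0.2], 0),
]

def certify(model, suite, required_pass_rate=0.99):
    """Run the frozen model over the whole suite and report whether the
    observed pass rate meets the certification threshold."""
    passed = sum(1 for x, expected in suite if model(x) == expected)
    rate = passed / len(suite)
    return rate, rate >= required_pass_rate

rate, ok = certify(frozen_model, GOLD_SUITE, required_pass_rate=0.75)
print(f"pass rate: {rate:.2f}, certified: {ok}")
```

The point is that what gets certified is the frozen artifact's behaviour on the agreed suite, not a proof about every possible input.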