No, the other way around. I am saying it is not a smart take to claim "a safe airplane cannot be built if LLMs were used in the process in any way, because reasons". The safety of the airplane (or more generally, the outcome of any venture) can be measured in other ways than leaning on a blanket rule that an LLM cannot be used for help at any stage just because LLMs are not always correct.