No, I don't need ChatGPT's help for the basics of air defense.
Military technologies are validated before they're deployed, so nobody is going to die from a hallucination.
But if I want to understand, say, how a particular Russian drone works, ChatGPT can help me piece together information from English, Russian, and Ukrainian-language sources.
Sometimes, though, ChatGPT's safety filter assumes I want to operate the Russian drone rather than stop it, and then it refuses to help.
This kind of misreading happens in real life too. I'll never forget an LT walking in and asking a random question (relevant, but not something he should have been putting to people on duty) and causing all kinds of shit to go sideways. An AI is probably better than any lieutenant.