The frequency of hallucinations makes it hard for me to believe anyone will use LLMs to produce code outside of shops that really don't care about errors. Until that's fixed, code remains a use case where even slight errors make the output useless.