Machines lie very effectively. They plainly have more resources for fabrication, while people give off all kinds of metadata revealing that they're lying. It used to be that someone with a lot of details ready at hand was probably telling the truth, since details are tiresome to fabricate. But ChatGPT can talk math-into-code with me for an hour, occasionally asking for clarification (which forces me to clarify my own thinking), and still lead me down a totally nonsensical path, complete with realistic code that imports libraries I know to be relevant and then relies on classes and functions that don't exist. Fool me once, shame on me.
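When a model hands me code like that, the quickest reality check I've found is to ask Python itself whether the names exist before reading any further. Here's a minimal sketch of that check, using only the standard library; `api_exists` and the made-up `loads_streaming` name are my own hypothetical examples, not anything from the original exchange.

```python
import importlib


def api_exists(module_name: str, attr_path: str) -> bool:
    """Return True if a dotted attribute path really exists in the named module."""
    try:
        obj = importlib.import_module(module_name)
    except ImportError:
        return False
    # Walk the dotted path attribute by attribute.
    # (Submodules that aren't imported as attributes of the parent
    # may need their own import_module call.)
    for part in attr_path.split("."):
        if not hasattr(obj, part):
            return False
        obj = getattr(obj, part)
    return True


# A real stdlib API: exists.
print(api_exists("json", "dumps"))            # True
# A plausible-sounding but invented name: doesn't exist.
print(api_exists("json", "loads_streaming"))  # False
```

It's a crude check, and it only catches names that are wholesale invented rather than subtly misused, but that alone would have saved me a good chunk of that hour.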