I just wish we’d stop using words like intelligence or reasoning when talking about LLMs, since they exhibit neither. Reasoning requires you to be able to reconsider every step of the way and continuously take in new information; an LLM is dead set in its tracks. It might branch or loop around a bit, but it’s still the same track.
As for intelligence, well, there’s clearly none, even if the magic trick might fool you at first.