It is a proxy for questions that pop up as part of other sessions in my daily use. The LLM chooses its query itself, and unless it is fine-tuned to avoid the “listicles”, or its underlying search engine is tuned to favor more factual responses, I don’t think the answer quality will be very high. It would be weird and redundant if I had to talk to the LLM as if it were an old-school search engine, wouldn’t it?