Except those farms don't do any original research; they just copy off each other. They're littered with mistakes, and you'll see the same mistake pop up across all of them.
These days, for obscure terms, you don't even get the luxury of reading garbage written by people who barely understand the topic at hand; instead, you get meaningless fluff generated en masse by LLMs.
Honestly, I'd rather spend time parsing whatever doxygen spits out than try to figure out what the needlessly verbose yet inaccurate LLM output is trying to get at.