
Originally, the term "agency" was used to indicate the inability of computers to act autonomously, intentionally, or with purpose.

Then a few developers published joke projects: automated browsers that pointlessly doomscroll random webpages, presented as "here are your agents".

Then a few companies turned this joke into "RAG" agents. But there the automated browsers don't act autonomously; they plagiarize web content according to the prompt.

Then many websites started to block RAG agents, and to hide this, companies fell back to returning data from LLMs, occasionally updating responses in the background (aka "online models").

The idea of plagiarizing content is also mixed with another one: if an LLM rewrites plagiarized content multiple times, the plagiarism becomes harder to prove.

Obviously, none of the involved companies will admit to plagiarizing. Instead, they cover themselves with the idea that it leads to superintelligence. For example, if multiple neural networks repeat that 9.11 > 9.9, it will be considered more accurate [1].

[1] https://www.reddit.com/r/singularity/comments/1e4fcxm/none_o...
