
Perhaps the fact that you think this field is only 5 years old means you're not enough of an authority to comment on it confidently?




Claiming that AI in anything resembling its current form is older than 5 years is like claiming the history of the combustion engine started when an ape picked up a burning stick.

Your analogy fails because picking up a burning stick isn’t a combustion engine, whereas decades of neural-net and sequence-model work directly enabled modern LLMs. LLMs aren’t “five years old”; the scaling-transformer regime is. The components are old; the emergent-capability configuration is new.

Treating the age of the lineage as evidence of future growth is equivocation across paradigms. Technologies plateau when their governing paradigm saturates, not when the calendar says they should continue. Supersonic passenger flight stalled almost as soon as it arrived, fusion has been stalled for seventy years, and neither cared about “time invested.”

Early exponential curves routinely flatten: solar cells, battery density, CPU clocks, hard-disk areal density. The only question that matters is whether this paradigm shows signs of saturation, not how long it has existed.
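To make that concrete (a toy sketch of my own, with an invented growth rate and ceiling, not a claim about any real dataset): an exponential and a logistic S-curve are nearly indistinguishable early on, which is exactly why “it has grown fast so far” tells you little about where, or whether, it saturates.

    # Toy comparison: exponential growth vs. a logistic (S-curve) that starts
    # at the same value and grows at roughly the same initial rate, but
    # saturates at a ceiling. Parameters are invented for illustration.
    import math

    def exponential(t, r=0.5):
        return math.exp(r * t)  # starts at 1, grows without bound

    def logistic(t, r=0.5, ceiling=100.0):
        # Starts at 1 (ceiling / (1 + (ceiling - 1)) = 1) and flattens near `ceiling`.
        return ceiling / (1.0 + (ceiling - 1.0) * math.exp(-r * t))

    for t in range(0, 21, 4):
        print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")
    # The curves start together and diverge as the logistic nears its ceiling:
    # by t=20 the exponential is ~22000 while the logistic has flattened just
    # below 100. Early data points alone fit both models about equally well.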


I think this is the first time I have ever posted one of these, but thank you for making the argument so well.


