The prediction that a new technology under heavy active research will plateau after just 5 years of development is certainly a daring one. I can’t think of an example from history where that has happened.
Claiming that AI in anything resembling its current form is older than 5 years is like claiming the history of the combustion engine started when an ape picked up a burning stick.
Your analogy fails because picking up a burning stick isn’t a combustion engine, whereas decades of neural-net and sequence-model work directly enabled modern LLMs. LLMs aren’t “five years old”; the scaling-transformer regime is. The components are old; the emergent-capability configuration is new.
Treating the age of the lineage as evidence of future growth is an equivocation across paradigms. Technologies plateau when their governing paradigm saturates, not when the calendar says they should keep growing. Supersonic flight stalled immediately, fusion has been stalled for seventy years, and neither cared about “time invested.”
Early exponential curves routinely flatten: solar-cell efficiency, battery energy density, CPU clock speeds, hard-disk areal density. The only question that matters is whether this paradigm shows signs of saturation, not how long it has existed.
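A minimal sketch of why elapsed time alone is uninformative: the snippet below (hypothetical parameters K, r, t0; purely illustrative, no real data) fits a pure exponential to the first few “years” of a logistic curve and shows the two are nearly indistinguishable inside that window, so a coming plateau is only visible from signs of saturation, never from the calendar.

```python
import numpy as np

# Purely illustrative: hypothetical logistic parameters, not fitted to any real trend.
K, r, t0 = 100.0, 1.0, 10.0           # capacity, growth rate, midpoint of saturation
t_early = np.linspace(0.0, 5.0, 50)   # the first five "years" of the curve

# A saturating (logistic) trajectory, observed only in its early window.
logistic = K / (1.0 + np.exp(-r * (t_early - t0)))

# Best pure-exponential fit over the same window: log(y) ~ a*t + b.
a, b = np.polyfit(t_early, np.log(logistic), 1)
exponential = np.exp(a * t_early + b)

max_rel_err = np.max(np.abs(exponential - logistic) / logistic)
print(f"max relative error of the exponential fit: {max_rel_err:.3%}")
# Prints a fraction of a percent: from inside the early window, exponential and
# logistic growth look the same; only curvature near saturation reveals the plateau.
```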