This is a valid point, but we are still in the early days of AI/LLMs, so one would expect speed and efficiency (and perhaps accuracy too) to improve drastically over the coming years.
At least AI & LLMs have large scale practical applications as opposed to crypto (IMO).
It's also interesting to think that IBM released an 8-trillion parameter model back in the 1980s [0]. Granted it was an n-gram model so it's not exactly an apples-to-apples comparison with today's models, but still, quite crazy to think about.
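For context on why an n-gram model's parameter count balloons like that: each "parameter" is an entry in a table of conditional probabilities, so the table scales as vocab_size ** n. Here's a back-of-the-envelope sketch in Python; the 20,000-word vocabulary is an illustrative assumption on my part (it happens to make the trigram arithmetic land on 8 trillion), not a figure from the paper:

    # Rough illustration: an n-gram table has one entry per possible n-gram,
    # so its size is vocab_size ** n. The vocabulary size is a made-up example.
    vocab_size = 20_000

    for n in (1, 2, 3):
        params = vocab_size ** n
        print(f"{n}-gram table: {params:,} entries")

    # 3-gram table: 8,000,000,000,000 entries -> 8 trillion, which is how a
    # 1980s trigram model could plausibly be described as "8-trillion parameter".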
Interesting to see that Robert Mercer, the former co-CEO of Renaissance Technologies, is one of the authors on that paper. He is a former IBMer. If his name is unfamiliar: he is a reclusive character who was a major funder of Breitbart, Cambridge Analytica, and the Republican candidate in the 2016 presidential election.
I wouldn't call the early McCulloch & Pitts work quite "full-fledged". Also, backpropagation, essential for multilayer perceptrons, was not a thing until the 1980s.
It was thought of as early as the 1960s by Rosenblatt, but he did not come up with a practical implementation at the time. Lotsa things look obvious in hindsight.
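Since backpropagation keeps coming up: it's just the chain rule applied layer by layer to get gradients for every weight. A minimal illustrative sketch in numpy, training a two-layer perceptron on XOR (the classic task a single-layer perceptron can't solve); this is my own toy example, not anyone's historical implementation:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy two-layer perceptron: 2 inputs -> 4 hidden (sigmoid) -> 1 output.
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR inputs and targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    lr = 1.0
    for step in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)       # hidden activations
        out = sigmoid(h @ W2 + b2)     # network output

        # Backward pass: chain rule, layer by layer (MSE loss).
        d_out = (out - y) * out * (1 - out)    # error at output pre-activation
        d_h = (d_out @ W2.T) * h * (1 - h)     # error propagated to hidden layer

        # Gradient-descent updates.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    print(out.round(3))  # should approach [[0], [1], [1], [0]]

The backward pass is the whole trick: without it there was no practical way to assign credit to hidden-layer weights, which is why multilayer networks stalled until the 1980s.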