They took 1970s dead tech and deployed it on machines 1 million times more powerful. I'm not sure I'd qualify this as progress. I'd also need an explanation of what systemic improvements in models and computation, ones that would give exponential growth in performance, are planned.
> They took 1970s dead tech and deployed it on machines 1 million times more powerful. I’m not sure I’d qualify this as progress
If this isn’t meant to be sarcasm or irony, you’ve got some really exciting research and learning ahead of you! At the moment it reads very “computers are just addition and multiplication and we’ve had that for thousands of years!”
> you’ve got some really exciting research and learning ahead of you
I've done the research. Which is why I made the point I did. You're being dismissive and rude instead of putting forth any sort of argument. It's the paper hat of fake intellect. Yawn.
> At the moment it reads very “computers are just addition and multiplication and we’ve had that for thousands of years!”
Let's be specific then. The problem with the models is they require exponential cost growth for model generation while yielding only linear increases in output performance. That cost curve is currently roughly a factor of two steeper than the curve of increasing hardware performance. Absent any actual fundamental algorithmic improvements, which do /not/ seem forthcoming despite billions in speculative funding, that puts the technology into a strict coffin corner. In short: AI winter 2.0.
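To put rough numbers on that claim, here's a back-of-envelope sketch assuming a Chinchilla-style power law, where loss falls as a power of training compute, loss(C) = a * C^(-alpha). The constants a and alpha below are illustrative placeholders, not fitted to any real training run:

```python
# Back-of-envelope for "exponential cost, linear gain", assuming a
# Chinchilla-style power law: loss(C) = a * C**(-alpha).
# a and alpha are hypothetical placeholders, not fitted values.
a, alpha = 10.0, 0.05

def compute_for_loss(target_loss):
    """Invert loss(C) = a * C**(-alpha) to get the compute C required."""
    return (a / target_loss) ** (1 / alpha)

prev = None
for l in (2.0, 1.9, 1.8, 1.7):  # equal, *linear* steps of loss improvement
    c = compute_for_loss(l)
    note = "" if prev is None else f"  (~{c / prev:.1f}x the previous step)"
    print(f"loss {l:.1f} -> ~{c:.2e} units of compute{note}")
    prev = c
```

Under these placeholder constants, each equal decrement in loss costs roughly 3x the compute of the one before it: linear gains on a geometric cost treadmill.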
Got any plans for that? Any specific research that deals with that? Any thoughts of your own on this matter?
Great. What's the 1970s equivalent of word2vec or embeddings, that we've simply scaled up? Where are the papers about the transformer architecture or attention from the 1970s? Sure feels like you think LLMs are just big perceptrons.
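Since the perceptron comparison keeps coming up, here's a minimal sketch of the scaled dot-product attention at the core of the transformer (Vaswani et al., 2017); the shapes and random inputs are purely illustrative:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q @ K.T / sqrt(d_k)) @ V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise query/key similarity
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)   # row-wise softmax
    return w @ V                         # each output mixes all value rows

# Toy example: 4 tokens with 8-dimensional projections.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8)
```

The input-dependent mixing weights are exactly what a fixed-weight perceptron stack doesn't have.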
> The problem with the models is they require exponential cost growth
Let's stick to the assertion I was disputing instead.
A linear increase in technology can easily lead to a greater than linear increase in economic gain. Sometimes even small performance gains can overturn whole industries.
Is progress measured in Nobel Prizes? My understanding is those are put to a vote by an institutional committee.
Putting that aside: the shared prize in 2024 was given for work done in the 1970s and 1980s. Was this meant to confirm my point? You've done so beautifully.
In 2022 they saw fit to award Ben Bernanke. Yep. That one. For, I kid you not, work on the impacts of financial crises. Ironically also work originally done in the 1970s and 80s.
AlphaFold uses transformers. That is definitely not from the 70s and 80s.
Progress, for me, includes both small iterative refinements and big leaps. It also includes trying old techniques in new domains with new technology. So I think we just have differing definitions of progress.
> I'd also need an explanation of what systemic improvements in models and computation, ones that would give exponential growth in performance, are planned
I don't see anything.