> I crave a computing era where we can return to the fundamentals.
The core of this comes down to -- will it scale?
This is what I want to see from the OS/OW (open source / open weights) AI model universe (kudos to Hugging Face for facilitating this).
I suspect the OpenAI approach, where all the infra is hidden and the models sit behind services, can't scale to meet the demand for what people want out of AI. Directions like Mamba over compute-hungry Transformers, SLMs over expensive LLMs, and so on, will commodify AI services as operational frameworks mature. OpenAI gets that, which is why they are listening to consumer demand, but their moat assumption is faulty.
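To make the OS/OW point concrete, here is a minimal sketch of what commoditized inference already looks like, assuming the Hugging Face transformers library is installed; the model id is just one example of a small open-weights model and could be swapped for any other. The point is that the weights and the serving stack both sit on your own hardware, with nothing hidden behind a service:

```python
from transformers import pipeline

# Download the open weights once (cached locally), then run inference
# entirely on your own machine, with no hosted API in front of the model.
generate = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example open-weights SLM
)

result = generate("Open weights mean anyone can", max_new_tokens=40)
print(result[0]["generated_text"])
```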
That said, I am optimistic about the future of this technology and others that are currently being mined for breakthroughs.
I can see better now that this is a distinct take, but the fundamentals, to me, were about man-machine symbiosis. About making systems that we understand & that make us freer in using them.
I do think scale is important too. But it feels like a refinement, one we've already covered pretty well, even if we have to keep re-learning it and it isn't evenly distributed yet. But making computing that is open & explorable? Go watch an Alan Kay talk, any of them; he'll say what I'm saying: we have barely begun to figure out the actual essence of computing, where it opens itself up to us & makes us more potent people. We have been grinding on the same target, without pausing to think about what general computing systems could be like, and it's the wider view, the one this is all actually in service of, that will help us.
AI to me is mostly creating our own Dark Forest scenario on our own planet that we have to somehow coexist with. More distraction from fundamentals.