The article mentions Support Vector Machines being the hot topic in 2008. Is anyone still using/researching these?
I often wonder how many useful technologies could exist if trends went a different way. Where would we be if neural nets hadn't caught on and SVMs and expert systems had.
I've been looking at SVMs for use with a binary classification experiment. Training and operating these models is quite cheap. The tooling is ubiquitous and will run on a toaster. A binary decision made well can be extremely powerful. Multiple binary decisions underlie... gestures broadly.
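To illustrate how cheap that kind of binary classifier is in practice, here's a minimal sketch using scikit-learn's linear SVM on its bundled breast-cancer dataset (the dataset and pipeline choices are just illustrative, not from the original experiment):

```python
# Minimal binary classification with a linear SVM (scikit-learn).
# Scaling features first helps the linear solver converge quickly.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=10_000))
clf.fit(X_tr, y_tr)  # trains in well under a second on a laptop
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.3f}")
```

This trains in a fraction of a second on commodity hardware, which is the "runs on a toaster" point.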
Obvious contextual/attention caveats aside, a few thousand binary classifiers operating bitwise over the training & inference sets would get you enough bandwidth for a half-assed language model. 2^N becomes a very big place very quickly.
Expert systems did catch on and do see widespread use - they're just not called AI anymore. It's 'business logic' or 'rules engine' now.
The issue with SVMs is that they get intractably expensive for large datasets. The cost of training a neural network scales roughly linearly with dataset size, while a kernel SVM needs the n-by-n kernel matrix over all training pairs, so it scales at least with dataset size squared. You could never train an LLM-sized SVM.
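The quadratic blow-up is visible from the Gram matrix alone, before any optimization even starts. A back-of-the-envelope sketch (the sample sizes are illustrative):

```python
# An exact kernel SVM materializes (or repeatedly touches) the n x n
# Gram matrix K[i, j] = k(x_i, x_j), so memory alone grows as n^2.
def gram_matrix_gib(n, bytes_per_entry=8):  # float64 entries
    return n * n * bytes_per_entry / 2**30

for n in (10_000, 1_000_000, 100_000_000):
    print(f"n = {n:>11,}: Gram matrix ~ {gram_matrix_gib(n):,.1f} GiB")
```

Ten thousand samples fit comfortably in RAM; a million samples already need terabytes for the kernel matrix, which is why exact kernel SVMs never reached web-scale corpora.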
In insurance we use older statistical methods that are easily interpretable, because we are required to have rates approved by departments of insurance.