> AI winters are a result of a massive disparity between the expectations of the general public and the reality of where the technology currently sits.
I think they also happen when the best ideas in the field run into the brick wall of insufficiently developed computer technology. I remember writing code for a perceptron in the '90s on an 8-bit system with 64 KB of RAM - it's laughable.
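For perspective, here is a minimal perceptron sketch (Scala, with a hypothetical AND-gate dataset; not the original '90s code) showing how little the classic algorithm actually needs:

```scala
// Minimal single-unit perceptron with a step activation,
// trained with the classic perceptron learning rule.
object PerceptronSketch {
  def step(x: Double): Int = if (x >= 0) 1 else 0

  def predict(weights: Array[Double], bias: Double, input: Array[Double]): Int =
    step(weights.zip(input).map { case (w, x) => w * x }.sum + bias)

  def main(args: Array[String]): Unit = {
    // Toy AND-gate dataset: (inputs, target label).
    val data = Seq(
      (Array(0.0, 0.0), 0),
      (Array(0.0, 1.0), 0),
      (Array(1.0, 0.0), 0),
      (Array(1.0, 1.0), 1)
    )
    var weights = Array(0.0, 0.0)
    var bias = 0.0
    val lr = 0.1

    // Perceptron rule: nudge weights and bias by (target - prediction).
    for (_ <- 1 to 20; (input, target) <- data) {
      val error = target - predict(weights, bias, input)
      weights = weights.zip(input).map { case (w, x) => w + lr * error * x }
      bias += lr * error
    }

    data.foreach { case (input, target) =>
      println(s"in=${input.mkString(",")} target=$target pred=${predict(weights, bias, input)}")
    }
  }
}
```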
But right now compute power and data storage seem plentiful, so rumors of the current wave's demise appear exaggerated.
Most of the software world will have to move to something like Haskell or another functional language. As of now the bulk (almost all) of our people are trained to program in C-based languages.
It won't be easy. There will be renewed high demand for software jobs.
I don't think Haskell/FP is a solution either... Even if it allows some beautiful, straightforward parallelization in Spark for typical cases, more advanced cases become convoluted, require explicit caching, and decrease performance significantly, unless some nasty hacks are involved (resembling the cut operator in Prolog). I guess the bleeding edge will always be difficult, and one should not restrict their choices to a single paradigm.
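On the explicit-caching point, here is a minimal sketch of what that looks like in Spark (Scala, local mode, toy RDD; the workload is hypothetical, not the advanced cases described above):

```scala
import org.apache.spark.sql.SparkSession

object CachingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("caching-sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    val nums = sc.parallelize(1 to 1000000).map(_.toDouble)

    // The straightforward case: one functional pipeline, one pass, no caching needed.
    val sumOfSquares = nums.map(x => x * x).reduce(_ + _)

    // When the same intermediate RDD feeds several actions, Spark recomputes it
    // from scratch for each action unless it is cached explicitly.
    val expensive = nums.map(x => math.sqrt(x) * math.log(x + 1)).cache()
    val total = expensive.reduce(_ + _)
    val bigOnes = expensive.filter(_ > 10.0).count()

    println(s"sumOfSquares=$sumOfSquares total=$total bigOnes=$bigOnes")
    spark.stop()
  }
}
```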
That's more a matter of budget than anything else. If your problem is valuable enough, spending the money to get an answer in a short time-frame rather than waiting for weeks can be well worth the investment.