Hacker News

There was no "AI winter"; that's a myth. Or at least a big exaggeration when used to explain why certain things are the way they are.

The main force which explains everything is the succession whereby mainframes were replaced by minis, minis by workstations, and workstations by microcomputers.

At each stage, the new wave of hardware started small, bringing in its own approaches, tools and languages. As each wave matured, certain software technologies made "the jump". Some didn't. Some made the jump, but their popularity was destroyed. Those approaches which started each wave had a certain edge, even if they were inferior. For instance, the BASIC language was very widely available on the first 8-bit microcomputers that burst onto the scene in the late 1970s, putting computing into the hands of non-institutional users for the first time. In the computing ivory towers of that era, nobody considered BASIC viable any more. Yet, because of this boost that BASIC received, riding the microcomputer wave, it still persists with us in the form of Microsoft's VB.

Lisp was very successful in the 1980s. However, it didn't run very well on microcomputers. If Lisp hackers wanted to make an application to sell on the mass market, they faced the prospect of their customers having to buy expensive workstations and minicomputers. So they did the obvious thing and rewrote the logic in C or what have you, making it workable on an 8 MHz PC with a meg of RAM. Once someone has "made it" by doing such a thing, they stop learning. Twenty-five years later they are still telling the same story to new recruits about how they rewrote some Lisp thing in C to make it actually run, and made a business out of it; hence, forget Lisp.

By the time the next wave of hardware exceeds the previous generation in capacity and power, it's too late to try to revive most of the stuff that didn't make the jump. People have moved on to something else (or have even become irrationally permanent naysayers), and other things have obviously changed in the world.

Lisp is doing very well, all things considered, because of its great expressivity and abstraction, machine independence, and overall enduring value. Also its adaptability: the ability to be reshaped into new dialects. Nothing that old has anywhere near the clout.

As far as the AI winter goes, basically the spiel is that certain funding money dried up for certain types of AI. Even if that is true, what does it tell us? That certain people were dependent on that type of money. They were dependent on it because their stuff only ran on institutional hardware; they were not able to wean themselves off the institutional teat and do something in the mass market. At least, not without changing toolchains.




Imagine an alternative universe where AT&T was allowed to sell UNIX and charged the same price as other mainframe OSes, instead of a symbolic license price for universities.

I have a feeling that would have turned out quite differently.




