Well, the utterly exposed stack of FORTH makes it nearly impossible to write clear and maintainable code.
I've been going through some old papers and ran across the version of FORTH that I wrote in 1979. I had thought it was pretty whizzy; it compiled to inline machine code, had a screen editor a little like Emacs and some other neat stuff. And the FORTH code was utterly unreadable compared to the kernel of Z-80 assembly that provided all the primitives.
I've had other experiences with FORTH and other exposed stack languages (usually little domain-specialized engines used as extension systems), and the result has been uniformly horrible. As in, "I wish I had done something else." Or worse, "I wish my cow-orker had chosen some other language, because I want to kill him now." Argh.
There are concatenative languages that have real expressions and real memory management, but I've never tried them; I imagine they're a real improvement.
I've played around with this paradigm a bit for the last few years.
To put it briefly: for many programs the style is extremely elegant and compositional; for others it's just plain awkward. In those cases the code isn't merely hard for beginners to read; it's hard to come up with in the first place, and hard to maintain afterwards.
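To make that contrast concrete, here is a toy sketch of my own (in Python; the word names mimic Forth, but this is neither real Forth nor Factor): a linear pipeline composes beautifully by concatenation, while non-linear dataflow degenerates into stack shuffling.

```python
# Toy concatenative evaluator, purely illustrative: a "program" is a
# list of words, each a function that mutates a shared stack.

def run(program, *args):
    stack = list(args)
    for word in program:
        word(stack)
    return stack

# A handful of primitive words (stack-effect comments on the right).
def dup(s):  s.append(s[-1])                     # ( a -- a a )
def over(s): s.append(s[-2])                     # ( a b -- a b a )
def swap(s): s[-1], s[-2] = s[-2], s[-1]         # ( a b -- b a )
def rot(s):  s.append(s.pop(-3))                 # ( a b c -- b c a )
def add(s):  s.append(s.pop() + s.pop())
def mul(s):  s.append(s.pop() * s.pop())
def sub(s):  b, a = s.pop(), s.pop(); s.append(a - b)
def div(s):  b, a = s.pop(), s.pop(); s.append(a / b)

# The elegant case: pipelines compose by simple concatenation.
square = [dup, mul]                        # ( n -- n*n )
sum_sq = square + [swap] + square + [add]  # ( x y -- x*x + y*y )

# The awkward case: (a - b) / (a + b) needs both inputs twice, in
# different orders, so most of the program is pure stack shuffling.
ratio = [over, over, sub, rot, rot, add, div]
```

`run(sum_sq, 3, 4)` yields `[25]` and `run(ratio, 6, 2)` yields `[0.5]`; notice that five of the seven words in `ratio` exist only to move data around, which is exactly the readability problem being described.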
Factor has lots of great ideas, but takes them way too far! The object system is supremely awkward IMO; it's basically as if you grafted an object system onto bash. Sure, you can imagine ways it could work, but both the syntax and the semantics clash with the original paradigm. If you want the OO paradigm, you should just use a different language.
The concatenative style works better as a little language than for general purpose programming.
> It's just that most programmers don't want to code in it.
Most programmers don't want to code in APL, Prolog, Haskell or Scheme either. Most programmers won't even consider writing Smalltalk or Clojure or F#; and they won't ever touch Idris, even if they somehow knew it existed.
I get the feeling that most programmers plainly don't want to program at all; they'd be happier doing something better-specified and more solid, like painting walls.
I don't believe that getting "most programmers" to use some language or other should be the goal. Creating better tools for people who want to use them should take priority over simplifying the tools to make them accessible to people who don't. If a tool is really good, someone will make it "user friendly" sooner or later, and "later" isn't really a problem here; in most cases it means about 20 years.
I'm not involved with the Factor community, but I was under the impression that its users are people who understand that. It would be a shame if they got discouraged by the lack of "mainstream adoption" and stopped working on it after a mere "few years". That's not even enough time to tell whether the idea is really good or bad.
I'm sure they keep using it, and that's ok, of course. But deep down you want your projects to be widely adopted. That provides motivation and a renewable workforce, given that at some point the elders move on to greener pastures.
Take Go, for example. Its creators helped a lot of people write network software more easily. Maybe they themselves would have preferred another syntax, but they got over that and used one that most programmers would find familiar and easy to learn. There's a lesson in that, too.
If the tool is really good then someone will make it "user friendly" sooner or later
I disagree. User-friendliness is like security: it's not something you can paint onto the system afterwards, it's a more fundamental property. There's certainly value in building experimental systems and tools to explore ideas, but at some point you have to consider the question of getting the ideas out to the wider community. And the key to that is pragmatism.
People clearly aren't as stuck as you think; tools like Ruby or node.js or the cambrian explosion of javascript frameworks show that. But those have traction because you can show people your own rapid success with them.
If someone's rapidly delivering high quality publicly usable applications with a novel technique, people will notice.
> tools like Ruby or node.js or the cambrian explosion of javascript frameworks show that
Ruby appeared in 1995, and so did JS. Saying they got "widely" (a relative term) adopted 10-15 years later is not going to convince me that "people aren't as stuck as I think".
Meanwhile, we're still waiting for ideas from the '60s and '70s to reach the mainstream. No, I really do think people are a bit conservative about the tools they use.
Just because someone doesn't want to code in APL (!) does not mean that they dislike programming. Whether you understand them or not, there are actual reasons why most programming goes on in languages that aren't the ones you mentioned.
But the creator went to Google, and he was the main force in really advancing Factor in the early years, IIUC. Of course, after a few years you expect a software product to stop changing that much anyway.
Every time this article crops up I stop and read the whole thing, word for word. Not only does he nail the beauty of and fundamental trouble with Forth, but it's a deep ramble through a world in the way that the best New Yorker pieces are.
I do the same, but only today did I read the whole discussion below the article. I don't know how I could have missed it before; it's as good as the article, even if a little less structured.
I personally wonder why there is no modern popular language from this family.