The _only_ factor in programming language popularity is how accessible it is.
Beauty and elegance don't matter. Performance doesn't matter. In fact, no measure of "fitness for purpose" matters.
If Malbolge were built into every operating system, Stack Overflow would plug the gaps, and major production systems would be built in it. People would give conference talks about "scaling Malbolge to millions of transactions per second". It would have a package manager, a deep learning framework, and several UI toolkits.
C++ was very much more accessible than the alternatives (mostly Turbo Pascal) in the late 90s. As devs started moving to Unix, it became the most accessible option in the landscape (not counting C, which I would argue is in a different market) with djgpp, g++, etc.
Yeah, I think a better approach would be to say that a language's popularity is determined by how much it makes you think you can do useful things with it (even if that's a mistaken perception). Accessibility is closely related, but this kinda explains C++ better: it was "the language" used for all "serious" and "high-performance" software, and therefore seen as the way to go by a lot of people. It was also used in a lot of other software you had to interact with, which in a sense made it more accessible too: if you had to use it anyway, why not stick with it all the way?
The popularity of C++ took off at a time when its competition was C. Not modern C, but C89 and older. Oceans of ink have been spilled describing the quality-of-life improvements that early C++ made over contemporary C. It is only with hindsight that we see how flawed these improvements were, and how other languages might achieve them with more elegance (and fewer corner cases that constantly explode in your face).
C++ is relatively easy to write in a plain style that doesn't blow up in your face. It's just that Java made much more business sense, with GC, stack traces, safe objects, and a built-in security model.
> It is only with hindsight that we see how flawed these improvements were
I think some platforms existed that had already solved those issues, but it was a different time. Disseminating those ideas was harder, and open source wasn't as much of a thing yet, so development was siloed; we didn't have blogs or YouTube to aggregate and broadcast the danger of those footguns. C++ was the evolutionary solution that provided the most QOL improvements.
I agree. C++ offers a fairly good compromise between performance, low-level control, and convenience. C++ strings, vectors, and classes make it much easier to implement a number of things in C++ than in C. The popularity of C++ is very much a matter of fitness for purpose.
You could write code in D or Nim, and your code might be marginally more readable, but you wouldn't have as many easily available libraries, or as much support (e.g. Stack Overflow). It's hard to compete with the fairly good compromise that C++ offers.
Yeah, it's not just presence but centrality to experience.
PowerShell is still peripheral to most Windows users' experiences. If they script anything on Windows, it's probably via VBA and MS Office. And even many sysadmins (IME; I know this is changing) seem to default to BAT files out of habit.
Contrast with shell on *nix, where it is more commonly used and known (even if not to writing fluency, most *nixers could probably read a shell script thrown at them, barring poorly formatted/structured code).

BASIC's popularity was largely due to its centrality to the user experience on early PCs, with many PCs starting in a prompt where BASIC could be entered directly. If it had been hidden away behind several menus and options, it would have been less popular.
JavaScript's popularity is similarly due to its centrality to the experience of web browsers, and the ability to run it on anyone's computer without needing to compile or publish anything beyond the code itself (and an HTML page to show it off on).
Windows 1.0 was released in 1985. PowerShell didn't exist until 2006, after Microsoft had promoted a succession of other tech for 21 years, including building their own development tools designed to make building GUIs easy.
It would also be hard for anyone to mistake PowerShell or a replacement for existing tools.
That was a different time, though, and by 1990 Ashton-Tate (makers of dBase, and soon to be bought by Borland) was pretty much done. Microsoft bought dBase-clone Fox Software in '92, which is pretty much the only thing that kept anything dBase alive until FoxPro got put out to pasture sometime around '05-'06.
IOW, were it not for the sponsorship of Microsoft, dBase would have been done 30 years ago.
There were lawsuits flying left and right over its origin, which scared a lot of resources away from it. (It turned out Ashton-Tate stole it from NASA's JPL, who modified some of the keywords, but stole the concept from a defunct company called something like Tymshare.)
I thought dBASE was a great language for small-project RAD and ad-hoc data fiddling. I could crank out smallish apps in hours.
I propose a corollary to the author's thesis: a language's popularity is a function of runtime advantages and time to implement an MVP for your domain. This corollary explains the scripting-language conundrum: Python is popular because it's easy to implement an MVP in a number of different domains. Rails helped Ruby achieve widespread use because it made it easy to implement an MVP for a web application. I think it's fair to say that to gain adoption, a language needs either a killer runtime or a killer app.
Can someone familiar with the languages elaborate on "generics in Swift are the opposite of the generics in Go"? It's an interesting offhand comment but I have no idea what it means.
Unix shell also has a natural learning ramp. You start out just using the shell to do standard computer tasks: install software, manage files, etc.
Then, one day, you're trying to do some sort of repetitive task, and you use a for loop. Or you realize it would be nice if cron could run a command for you at a set time.
Then, some time later, you realize it would be useful if cron could execute five commands in sequence, or decide that your for loop is complicated enough that you'd like to write it out on multiple lines. And just like that, you've started developing shell scripts.
I'm going to have to think more about the author's thesis about runtimes. It reminds me that I would love to see more language implementations with region-based memory reclamation instead of garbage collection, if only to have more predictable overhead. A pure functional language would seem to be amenable to performing little collections over new objects that aren't reachable from function results, for example.
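For what it's worth, a toy sketch of the region idea in C++ (everything here is made up for illustration; real region systems infer or annotate lifetimes, this just shows the cost model):

    #include <cstddef>
    #include <cstdlib>
    #include <new>
    #include <type_traits>
    #include <utility>

    // Toy region: a bump allocator whose storage is released all at once.
    // Only trivially destructible types are allowed in this sketch,
    // because individual destructors are never run.
    class Region {
        char* buf_;
        std::size_t cap_;
        std::size_t used_ = 0;
    public:
        explicit Region(std::size_t cap)
            : buf_(static_cast<char*>(std::malloc(cap))), cap_(cap) {}
        ~Region() { std::free(buf_); }  // one predictable, O(1) release

        Region(const Region&) = delete;
        Region& operator=(const Region&) = delete;

        template <typename T, typename... Args>
        T* make(Args&&... args) {
            static_assert(std::is_trivially_destructible_v<T>,
                          "this toy region never runs destructors");
            std::size_t start = (used_ + alignof(T) - 1) & ~(alignof(T) - 1);
            if (start + sizeof(T) > cap_) throw std::bad_alloc{};
            used_ = start + sizeof(T);
            return new (buf_ + start) T(std::forward<Args>(args)...);
        }
    };

Allocation is a pointer bump, and reclamation is a single free when the region dies; that's the predictable overhead I'd like runtimes to offer.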
You're basically describing the Erlang runtime, which is more popular than "totally niche" but not as explosively popular as, say, the JVM, .NET, JS, or even Dart. In fact, despite being a practitioner of the Erlang runtime, I would say that its lack of popularity is a point against the author.
I agree that the quality of language runtimes is neglected all too often, as opposed to language features. I had similar reasoning when I wrote about it earlier this year - https://blog.commure.com/whycommureusesrustforhealthcare/
> yeah, we’ll just JIT highly dynamic language to suuuper fast numeric code at runtime
This is the impression one gets sometimes, but the recommendations for fast Julia actually seem to revolve around ensuring type stability (another way to say "static strong typing") and avoiding heap allocations. In that light, Julia seems like just another, nicer syntax for C++.
Another point in favor of the author's thesis: pure functional languages continue, two decades into the 21st century, to not be widely used in spite of their syntactic popularity. Software still continues to not be written at significant scale in Lisp, or Haskell, or even Scala.
The majority of Twitter is Scala, Stripe uses Scala extensively, and Verizon uses Scala for at least a large portion of their TV-related code, as does Comcast; I could go on. Did it 'take over' the world like some of us hoped? Nope, it is definitely still a niche, but it's quite a large one at this point.
Pure functional programming is (IMO) more of a style than a language phylum. One can (and I would argue should) write C++ in a pure functional style, esp. now that C++17 has sum types.
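A minimal sketch of that style, assuming C++17 for std::variant (the names are illustrative, not from any real codebase):

    #include <string>
    #include <type_traits>
    #include <variant>

    // A sum type: parsing either yields an int or an error message.
    struct ParseError { std::string message; };
    using ParseResult = std::variant<int, ParseError>;

    // A pure function: no mutation of external state, no I/O;
    // the result depends only on the argument.
    ParseResult parse_digits(const std::string& s) {
        if (s.empty()) return ParseError{"empty input"};
        int value = 0;
        for (char c : s) {
            if (c < '0' || c > '9') return ParseError{"non-digit character"};
            value = value * 10 + (c - '0');  // note: no overflow check in this sketch
        }
        return value;
    }

    // Consume the result by "pattern matching" with std::visit,
    // rather than branching on an out-of-band error flag.
    std::string describe(const ParseResult& r) {
        return std::visit([](const auto& v) -> std::string {
            if constexpr (std::is_same_v<std::decay_t<decltype(v)>, ParseError>)
                return "error: " + v.message;
            else
                return "parsed " + std::to_string(v);
        }, r);
    }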
I agree with this. Functional programming concepts (iterators, stream processing, sum types, first-class functions, etc.) have made significant inroads into almost every popular programming language: lambdas and std::variant in C++, streams in Java, first-class functions and comprehensions in Python, Option/Result-style sum types in Rust and Swift.
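As one concrete illustration, the iterator/stream-processing side of this in modern C++ (a small made-up example):

    #include <algorithm>
    #include <iterator>
    #include <numeric>
    #include <vector>

    int sum_of_even_squares(const std::vector<int>& xs) {
        // First-class functions: lambdas stored in variables.
        auto is_even = [](int x) { return x % 2 == 0; };
        auto square  = [](int x) { return x * x; };

        // Stream-style processing over iterators: filter, then map...
        std::vector<int> evens;
        std::copy_if(xs.begin(), xs.end(), std::back_inserter(evens), is_even);
        std::transform(evens.begin(), evens.end(), evens.begin(), square);

        // ...then fold.
        return std::accumulate(evens.begin(), evens.end(), 0);
    }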
I think the difference in brain cycles spent by a master of OOP vs a master of FP is far smaller than the difference between a beginner in either and the masters of their style.
At least in my experience, OOP has more concepts to learn, but each learning step is smaller. FP has fewer concepts in total, but the learning curve has some rather steep parts in it.
PHP is too hard for me. I can't keep track of all the things I need to keep track of not to write unmaintainable, error-prone, insecure garbage. With typed functional languages I do a little better.
This is exactly why I switched from C++ to Java, and from Python to Go, and why I’m using JavaScript in the browser.
But it doesn’t fully explain why I switched from Java to Python: that was partly for the runtime, but in that case the language itself played a big part.
Interesting hypothesis. There certainly is at least some truth to it.
On the topic of Zig, he seems to ignore that Zig provides several unique "runtime" features:
* unlimited comptime code execution
* error sets and error return traces
* choice between blocking or async runtime, using the same code
* runtime checks for undefined behavior
* debug allocator
And probably more that I'm not thinking of or am not aware of.
I wonder where Kotlin would fall in this case. It offers a lot of compile targets, and it has some authority over Android (though Dart is getting more popular by the minute, and Java is shrinking only slowly). Will it be the Android language, or the next cross-platform language?
If Kotlin can set up adequate tooling and performance in its native and JavaScript backends, its approach of building libraries for multiple platforms could be a new wrinkle, and fulfill the "write once, run anywhere" promise better than Java ever did.
As it stands, it's _not_ easy to use the multi-platform setup. You really need to be an expert in Kotlin, in Gradle (and their plugins are _barely_ documented), and in the target platform.
But if the build process becomes easy, and they can build a rich ecosystem that creates abstraction layers on top of the target platforms, I see it going far. It's not there yet, but it's moving quickly.
I'm still in a "wait and see" place with Kotlin. I've enjoyed working with it on the JVM, where the tooling is fantastic. But the JS and native toolchains are still not up to my standards. Just being another JVM language isn't good enough for me.
This is too bold a claim. Not an inaccurate one, as with some languages the runtime does play a huge factor in widespread adoption, but as with many things, it's just more complicated than that. Obviously scripting languages like Python do not depend on the runtime for popularity, except when they do. How can you even separate Go's syntax and paradigms from its runtime when its defining attributes (parallelism) are so tightly coupled to it?
I don't really disagree with any of the overarching points here, other than the supposition that "runtime" is anywhere close to the one factor to rule them all. Adoption is determined by... mostly adoption. You generally choose the most popular language that fits your needs, however stringent they might be, and initial adoption has many factors that go into it. If syntax and paradigms were irrelevant, why would C++ have superseded C, only to be replaced itself by a variety of languages addressing a variety of needs?
I wouldn't call it "objectively bad", but two major warts:
- Lack of modules/packages
- Dynamic binding as default
The former is an annoyance, especially in a "living system". Contrast with C, where the lack of modules is mitigated by prefix-based naming and by putting things into libraries. So your use of OpenGL functions prefixed with gl is contained in one area, and I never need to see it if my portion deals with network code or anything else. But with Elisp, once you load a library, all of its functions are thrown into the same scope as everything else. If something isn't properly prefixed, or if prefixes collide, there will be problems.
Dynamic binding (lexical binding can be enabled as an option) means that, coming from other Lisps, some things do not do what you'd expect: a variable that is free in a function picks up whatever binding happens to be live at call time, not the binding visible where the function was written. A rough illustration of the difference follows.
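Since not everyone reads Lisp, here's a crude C++ emulation of the lookup difference (all names invented; this only illustrates the semantics, not how Elisp implements them):

    #include <iostream>

    int depth = 0;  // plays the role of a "dynamically bound" variable

    // RAII helper: rebinds `depth` for the dynamic extent of a scope,
    // restoring the old value on exit, like a dynamic-scope `let`.
    struct Rebind {
        int saved;
        explicit Rebind(int v) : saved(depth) { depth = v; }
        ~Rebind() { depth = saved; }
    };

    void report() { std::cout << depth << '\n'; }  // reads the *current* binding

    int main() {
        auto lexical = [d = depth] { std::cout << d << '\n'; };  // value fixed at definition
        {
            Rebind r(42);
            report();   // prints 42: dynamic lookup sees the caller's rebinding
            lexical();  // prints 0: the lexical capture ignores the rebinding
        }
        report();       // prints 0: the rebinding was undone when the scope ended
    }

Under Elisp's default dynamic binding, every function behaves like report() here, which is exactly the "doesn't do what you'd expect" moment for people coming from Scheme or Common Lisp.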
I assume that the author, like many people, has an allergy to parentheses or something like that.
As far as I'm aware, there is no way to measure whether a language is "objectively bad" except if that language does not allow you to do what you want to do (or, maybe, if the language makes doing what you want to do unreasonably difficult). Emacs Lisp has been successful for decades and by all appearances is going to continue being successful, which in my mind makes it objectively good by any empirical measure.
(By this measure, we could consider languages like the assembly languages, ALGOL, etc as "bad", so long as the context for analysis is "writing modern high-level software." They are mostly no longer used except when necessary, which seems to me the only measure worth considering until we develop a theory of language usability analysis.)
I wish the author had not been so vague in this dismissal. While their article is certainly allowed/encouraged to be full of opinionated conjecture (as they defend in the opening), I think such a strong position is deserving of at least a little more justification.
I think the quibble here is about "objectively bad" vs. "objectively worse".
I believe the author somewhat elaborates on what he means by "objectively bad" with his example of JavaScript having two "null"-like values, which under any reasonable definition is objectively worse than a theoretical JavaScript with a single null.
So if we read the author charitably, taking that example into account, we can interpret his statement as "Emacs Lisp is objectively worse than other Lisps, which have essentially identical functionality and fewer sharp edges".
But the author does not elaborate on the "objectively bad" statement explicitly, leaving it to the reader to discern what makes Emacs Lisp bad. And I think "worse" makes less sense, because it is not being directly contrasted against any other language. The author simply takes it for granted that readers will agree Emacs Lisp is "bad" and leaves it at that, which I think is unfortunate.
Your example wrt. JS's two null types doesn't exactly generalize. What are Emacs Lisp's "sharp edges", as you say? What does it prevent people from doing? In what way could it be made more ergonomic? I would argue that its long-term proven stability is an indicator that the language does not have sharp edges; at least, not ones that discourage its use significantly. In discussions about programming languages, you can't throw a stone without hearing somebody bemoan JavaScript or PHP, but I never hear Emacs Lisp brought up (and people who do bring up Lisp are often just upset about the abundance of parentheses, it seems).
I dunno, it just strikes me as the author acting as though this is a given assumption, and I don't quite see it. I'm not saying they're wrong (because, of course, the entire article is an opinion piece and the author is entitled to their opinion), but rather that I just wish this claim had gotten a bit more justification.
Yeah, I agree that not elaborating on this was unfortunate (but elaborating would be tangential, as relatively few people are familiar with Emacs Lisp).
I think Emacs Lisp is in the "easy to improve upon" category because of:
* lack of namespaces
* dynamic binding by default
* spartan standard library
> Beauty and elegance don't matter. Performance doesn't matter. In fact, no measure of "fitness for purpose" matters.
> If Malbolge were built into every operating system, Stack Overflow would plug the gaps, and major production systems would be built in it.
In some sense, this has already happened.