"Eventually, Mr. Graham himself seemed to spurn the language"
Imagine how ridiculous this was to read while taking a break from working on HN, surrounded by windows full of Arc source I was in the middle of editing.
A lot of people seem to feel that a language isn't real unless the designer is talking to them every day. But that's not the only way languages happen. Nor possibly the best way. I feel like you get better ideas if you think in units of occasional essays rather than a stream of tweets. It seems likely the same will be true with language design.
I'm definitely catching a lot of flak for the spurn comment, and justifiably so considering this window into your workday. I would say that the essay in general was an attempt at levity, as I hoped smart-alecky comments like "birds sang", "voices of children", "Christopher Hitchens", etc. would convey. Although a word like spurn may imply the opposite, having enjoyed your essays, interviews (when I can find them), and especially Arc itself, I am the last person in the world who would tell you how to go about your creative process.
The second footnote, in which you qualify that comment, should be inlined instead. That clarification, or the lack of it, has too much of an impact on the tone of the rest of the article.
The point of footnotes is that their content is not worth interrupting the flow of the main text. It should be safe for the reader to leave them for later, or skip them altogether.
However, I think that a language such as Clojure (or more likely one of its inevitable derivatives) will instead be the language of 2109.
That's laughable. The JVM will play as big a role in 2109 as horse-drawn carriages do today. Clojure doesn't even try to be a 100 year language.
More and more, I'm getting the feeling that wondering what kind of language we'll use in 100 years is kind of like people a century ago wondering what breed of horse people will ride today.
I'm tempted to agree, but then realizing that FORTRAN and Lisp are both over 50 now and both still actively used, and C is in its 30s, it wouldn't blow my mind if things weren't radically different in another 50 years.
FORTRAN and Lisp are both over 50 now and both still actively used
These two languages are "still actively used" in very different senses. The mathematical idea at the heart of Lisp is still around, but those don't time out. And Fortran is alive in the sense that there is something called Fortran, but it probably has less in common with Fortran I than with other more recent languages.
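The "mathematical idea at the heart of Lisp" is small enough to sketch in a few lines, which is part of why it doesn't time out. As a purely illustrative toy (not any real Lisp dialect), here is a minimal evaluator, using nested Python tuples in place of s-expressions:

```python
# A toy evaluator for a tiny Lisp-like language, illustrating how small
# the core idea is. Expressions are nested Python tuples; an environment
# maps symbol names to values.

def evaluate(expr, env):
    if isinstance(expr, (int, float)):      # self-evaluating atoms
        return expr
    if isinstance(expr, str):               # symbols: look up in the environment
        return env[expr]
    op, *args = expr
    if op == "quote":                       # (quote x) -> x, unevaluated
        return args[0]
    if op == "if":                          # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                      # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                  # application: evaluate operator, then operands
    return fn(*(evaluate(a, env) for a in args))

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
# ((lambda (x) (* x x)) (+ 2 3))  =>  25
print(evaluate((("lambda", ("x",), ("*", "x", "x")), ("+", 2, 3)), env))  # 25
```

Everything else in a practical Lisp is layered on top of a kernel like this, which is why the idea survives even as particular dialects come and go.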
If Java survives till 2109, it would probably be in the latter sense. Curiously enough, the fact that it is so popular in big, hidebound industries tends to make that likelier. It's tomorrow's legacy software.
Fortran 77 is still pretty widely used in physics, with some relaxed compiler rules that make it a little less ugly. Fortran 90, however, I find a lot more like Pascal.
I suspect you're right on Java surviving as a legacy language.
While I'm not sure (and really don't care) if Java or the JVM will survive for an extended period of time, I do find it interesting to note that programming languages seem to be comparatively resilient to change relative to everything else in computing. While the FORTRAN in that photo looks odd, it's still a comparatively small gap between it and FORTRAN from 40 years later than if we were to compare the operating systems, applications or user interfaces from similar time spans. I can still figure out what it's supposed to do. I'd be completely lost if you put me on a batch terminal or a VAX/VMS prompt.
The last really transformational idea in programming languages hit the world the same year I was born -- in 1980 -- with Smalltalk. There are a lot of variations on the themes of functional, imperative, and object oriented languages, and I expect new metaphors will evolve, but it's interesting seeing that really, now for at least the last almost 30 years, we've basically been remixing ideas.
And that's where I'd revisit Java for a moment -- perhaps the only transformational change that it brought to the table was the use of a formalized VM in a widely popular language. Even if Java's VM doesn't survive, I'm pretty sure that's a notion that we'll still have in programming environments for many decades out.
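The "formalized VM" notion is easy to illustrate with a toy stack machine. This is a hypothetical sketch of the concept only, nothing like the real JVM instruction set: compile once to portable instructions, then interpret those instructions anywhere:

```python
# A toy stack-based virtual machine: a program is a list of
# (opcode, operand) instructions executed against an operand stack.
# Real JVM bytecode is far richer; this only shows the shape of the idea.

def run(program):
    stack = []
    for op, arg in program:
        if op == "PUSH":                 # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":                # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":                # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()                   # the result is whatever is left on top

# (2 + 3) * 4, "compiled" to stack instructions
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

The portability comes from the fact that only `run` needs porting to a new machine; the instruction lists stay the same. That is the property Clojure, Scala, JRuby and the rest inherit from the JVM regardless of whether Java the language survives.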
I don't mean to be pedantic, but Smalltalk is a lot older than that (think 1971). Which raises the interesting question of when the last truly fundamental innovation was.
As far as VMs go, I have a horrible suspicion that the x86 will still be with us in 2109. For a while it looked like we might be able to abstract it away, but EC2 and its clones seems to have entrenched it again.
The x86 may live for a good while, but within a couple years there will be far more "computers" running ARM processors than x86.
I expect the CPU architecture landscape to change most radically as the lines become indistinguishably blurry between telephones, computers and entertainment consoles.
Java's more likely to survive in the same way COBOL survives today, just based on who their target audiences were and what kind of programs were written with it. There's lots of COBOL code out there, but nobody wants to maintain it.
2019's only 10 years away though, and 10 years is relatively short in programming language terms. 10 years ago, the languages commonly used in industry were C++, Java, VB, Perl, and PHP. Today, substitute various ECMAScript variants for VB and Python/Ruby for PHP & Perl, but it's basically the same.
In my view it's already legacy today, or very very close... There is a massive body of Java code that is firmly in maintenance only mode (in corporates I should add - and mostly fin services).
It's a mixture, of which one is certainly still Java.
Suffice to say, over the last 2-3 years I've seen a lot less of the "let's build a large J2EE project". There are also a lot of large Java platforms out there (e.g. Internet Banking platforms) that are no longer under active development - maintenance and feature enhancements only.
Organisations that once had large Java engineering rooms seem to be breaking them up.
The biggest influence is outsourcing and offshoring. To an extent, big corporates don't really care about the underlying tech - they outsource to (say) Infosys and it's up to them how they actually implement it... and as a result, this varies greatly. The big outsourcers are still dominated by Java and C# (with C# gaining ground). The smaller vendors are much more varied, with Ruby and Python starting to feature a lot.
In terms of new technology, there is a focus on things like Process Automation, Business Rules Engines, etc. I worked on a large imaging/workflow project recently that was almost entirely Ajax + Process Engine + Business Rules Engine.... (Admittedly, Java did underpin most of these components).
I am with the school that the JVM will become primarily a host for other languages and that Java (the language) will move towards legacy status in the next few years.
Clojure, Scala, Groovy, JRuby, JPython etc all are moving in interesting directions and Java is too mired in backwards compatibility and politics to do anything truly interesting.
Java the language is very much closer to legacy than it is to innovation for sure.
Laughable++. Programming as a profession won't exist in 100 years, maybe not even as an archaic hobby. Our machines/AIs/robots will do it for us in much the same way that Photoshop has replaced the darkroom gymnastics of 1909. Look how far away web design is from newspaper layout of a century ago. Maybe computers will be modeled on the brain with billions, maybe trillions, of cores (neurons) and trillions of interconnects. We won't be using Arc, Java, or even Python to program them. We won't be programming. The "brains" will be so much better at it.
It's funny how choice of analogies affects your argument. If you'd said "as the qwerty keyboard does today", it would've been just as historically accurate and yet far more amusing.
How hard would it be to implement a Clojure++ that is not dependent on the JVM and the Java class libraries, but emulates a JVM when necessary to run legacy code?
Huh? It's a very tongue in cheek essay. It's well written and in good humor. You guys need to lighten up a bit.
The central point, even if not popular here, seems valid:
Arc seemed like it might be the Lisp family's answer to developing web apps, but didn't hit the ground with quite the maturity that some were hoping, and now it's looking like Clojure is emerging as an interesting leader in that world.
Paul of course might have some more Arc-iness coming down the pipe, but having played with and read some of the source of both Arc and Clojure, as of the currently available releases, Clojure feels like it's less rough around the edges.
I'd be more interested in why. This article doesn't touch on that - this seems to boil down to, "this isn't fit for purpose, this is", but doesn't say what that purpose was, or how it didn't fit.
Those were quoted as part of the initial public response to Arc; it didn't really go into why they were bad things, or the impact they had. The fact that they were part of blog chatter when Arc was released is, to be blunt, really at best only marginally interesting. e.g. Have these things changed? Are they still relevant?
Being built on top of the JVM gives you access to libraries - is this the meat? If so, I think most people already knew that. I think the "ipso facto, therefore Clojure will be the 100 year language, not Arc" is a bit of a leap given all that.
I don't think the Author needs to justify anything - I don't need to agree with him - I'm just saying I got no real content out of this article.
You're right, probably taking it too seriously, but I'm of the view that even something lighthearted has to have some content (in fact, that's where humour is at its peak).
I believe PG said he is taking more of a long-run view on this. He's not trying to make Arc what's hot right now, but rather thinking more about a "hundred-year language"
True enough, but I do think there might be a lot of validity in the comparison with Hurd. After all, Hurd is still under development, but there is little question that it has been eclipsed by Linux. While many have certainly become aware of the advantages of Lisp, they also want something that just works as much as possible, and the large collection of Java libraries will be a great help in this regard. By the time that Arc is "ready" to be the 100 year language, it may be too late to gain acceptance.
Still, I'll admit that only time will tell. Both languages are still quite young, and it is hardly uncommon for languages to develop slowly in obscurity before rising to prominence. Maybe it will gain traction, maybe it won't, but either way I think we will benefit from the increase in competition and the ideas that are generated in its creation.
So arc has finally got beyond the stage of initial hype (remarkably fast), and simply continues happily evolving. No need to feed the novelty crows. Please keep up the nice work!
[Edit: and nobody forbids me from using Clojure, which is a great way to deal with tons of both nice libraries and legacy applications.]