An opinionated history of programming languages (artagnon.com)
104 points by zdw on Oct 3, 2020 | 133 comments



The graph is missing the complete hierarchy of the Wirth languages, a very important part of programming language history. They influenced many languages and of course are a direct predecessor to Go. While Go has mostly a C-style syntax, in my opinion it is otherwise closely based on Oberon.


Also missing is Ada from that same lineage, the entire category of concatenative languages (with Forth the best known example), COBOL, SNOBOL/Icon, BASIC, APL, PHP, Racket, etc. It's laughably incomplete, not a history at all, and "opinionated" is a severe understatement. Was not at all surprised to see that the author's still a student specializing in Coq and similar things. Experience building actual complex systems whose users aren't computer scientists might have led to a broader perspective and a very different set of opinions.


Those other languages are interesting today only insofar as they have affected currently important languages.

The exception is COBOL, which is still important in the way FORTRAN and Java are, but similarly will not affect future languages.


> only insofar as they have affected currently important languages

If this is meant to be any kind of history, that would seem to be grounds for inclusion. After all, Bourne Shell gets a spot, even though it's not actually a language and hasn't influenced any other language to any significant degree (unless you consider Tcl and even that's a big stretch). It seems like many things were included merely as excuses to express like or dislike, without any regard for historical importance or relevance. The fact that it favors C++ over alternatives might please some, but doesn't make it a good article.


Bourne Shell is a clear influence on Perl and PHP. And by the way, Bourne Shell was itself influenced by C.


It's also marked "Unlikely to influence anything in the future, due to remarkably poor design", yet there's still a line drawn from it... to Erlang?? But not Perl.

I can't extract any value at all from this chart.


You will find an increasing incidence of pipe forms ("X | Y | Z" meaning something akin to "Z(Y(X))", but different) in current languages. C++ and Rust both use it, and Haskell has a variation.
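For instance, C++20 ranges adopted exactly this shell-style pipe; a minimal sketch, assuming a C++20 compiler:

    #include <iostream>
    #include <ranges>
    #include <vector>

    int main() {
        std::vector<int> xs{1, 2, 3, 4, 5, 6};
        // Reads left to right like a shell pipeline, instead of
        // inside-out like transform(filter(xs, ...), ...).
        for (int n : xs | std::views::filter([](int v) { return v % 2 == 0; })
                        | std::views::transform([](int v) { return v * v; }))
            std::cout << n << ' ';  // prints: 4 16 36
    }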


So it contributed one character? The same concept has been in many functional and concatenative languages since even before sh, and those are more likely to have inspired contemporary languages. The influence of absent Simula or Algol is multiple orders of magnitude greater than sh, so the point about the arbitrary standard for inclusion still stands.


Bourne shell is most practicing programmers' first exposure to language features supporting safe concurrency without locks or shared storage. Those features are still finding their way into C++.

And no, it is not the same concept.

When was the last time you had any contact with a running Algol or Simula program? I use sh -- bash, really -- every day.


> Those features are still finding their way into C++.

Not the only area in which C++ is 30 years (and counting) behind state of the art.

> When was the last time

The OP is supposedly about history, thus current usage is irrelevant. There's no longer a Mongol Empire either, but its existence is still historically important and only the worst kind of dilettante would try to write a history of the world without mentioning it.


Yet other languages are also only just getting similar features.

The author seems to prefer to mention languages that are still taught and used, and to neglect those that are not. It being an "opinionated" history, I can only find fault where he breaks his own rules. Maybe he should mention Korn shell, for example, which became POSIX shell and bash. But only Bourne shell features affected modern languages.


Did you notice that the author added Algol and Simula since we had this exchange? Kinda blows apart your theory of what the criteria were.


Or they changed.


PHP is far from insignificant in 2020.


What interesting opinions.

Rust not being a candidate for designing a future language?

But C++ being a candidate for designing a future language?

Go having remarkably poor design?

Ruby dying, but Python and Perl not?

I would really like to have a rationale for those decisions. It would be interesting to see how one could come to these conclusions.


I'd have thought only about half of those opinions were unexpected:

Go's design is at the very least controversial and considered poor by a vocal subset of the Hacker News and programming community.

Python is decidedly not dying... it's still the dominant game in town for ML and data science. I'm surprised anybody would posit that Python is dying. Perl definitely feels like it should be in Ruby's camp.


> Go's design is at the very least controversial and considered poor by a vocal subset of the Hacker News and programming community.

I'm in the camp that finds the rationales behind Go's design controversial. But the actual design is not bad IMO. For what Go wants to be (a modern C), it is not a bad design. But who wants a modern C? I don't. But if Go had been available in the 90s, I would have jumped at the opportunity to write programs in it.

> Python is decidedly not dying... it's still the dominant game in town for ML and data science. I'm surprised anybody would posit that Python is dying. Perl definitely feels like it should be in Ruby's camp.

I didn't say that I would consider Python a dying language. But considering Ruby dying and Perl not is very weird. OTOH Ruby and Python aren't gaining any new use cases (but also aren't losing anything rapidly AFAIK), whereas Perl... does it have any left besides very legacy code?

Putting Perl and Python into the same group and Ruby in another is certainly an interesting choice.


> For what Go wants to be (a modern C), it is not a bad design.

Sure, and there's this weird thing where Go felt marketed as a C replacement, and then Rust existed and it was all "just kidding, we meant services", and boy do I think Go is poorly designed for services. It has a lot of characteristics that make it appealing - fast compiler and execution speed, low resource usage... but I don't see how pointers, a half-baked type system and a totally undesigned error system are good design for where Go is currently winning.

> But considering Ruby dying and Perl not is very weird.

Agreed for sure, I think Ruby and Perl probably have similar trajectories based on current trends in languages. Python's staying power IMO is really only due to the data science/ML trends and I frankly think it's a pretty poor language and ecosystem if you can get away with using anything else. I think Python WOULD be losing ground if not for its popularity in these very large niches.


I want a modern C. I love/hate C. I mostly love Go. It has warts, but the philosophy works _for my brain_.

Python will gain use cases as a meta-language as the ability to automatically wrap lower-abstraction libraries improves. I see Julia eating much of Python's lunch, then Python abstracting its way over Julia, just because it can.


> Python will gain use cases as a meta-language as the ability to automatically wrap lower-abstraction libraries improves. I see Julia eating much of Python's lunch, then Python abstracting its way over Julia, just because it can.

A big problem for Python as a glue language is interprocedural optimization. In many important high-performance applications, you really need the compiler to be able to make optimizations that cross function barriers, often barriers between user-written and developer-written functions.

e.g. in differential equation solving, you can have a very fast integrator, and your user can have a very fast integrand, but if you don't have a compiler that's able to see and reason about both simultaneously, you're in trouble.

This is why slapping Numba on a differential equation and sticking it in SciPy integrators isn't even close to Julia's DifferentialEquations.jl solvers. This is a very hard problem to solve in an ecosystem where there are so many different siloed compilers used by different Python packages. You'll always end up paying a performance price for the context switch.

In Julia, most things are written in pure Julia, so the same compiler is seeing the entire program top to bottom. It's very nice.

Furthermore, Julia has much more powerful metaprogramming facilities than Python does. It'll eat Python's lunch on that front too.


I can second this. I once implemented my own sparse matrix multiplication because I needed it in Numba. Further, the type of automatic differentiation that is possible now is game-changing.

On the other hand, Julia is a terribly designed language in some respects. Compare Python's error messages to Julia's. It has all the problems of C++ templates baked in from the start. And latency is a real issue for now. I am always shocked when I drop back to Python for one task or another by how responsive it feels.

I think the latter might see a technical solution some time, e.g. by running code in the interpreter and swapping in jitted functions as they become available.

The former, it seems, might have to wait for Julia 2.0 to start addressing...


There's actually a lot of work going into making better error messages right now. In 1.6 we're getting better highlighting of them in the REPL to emphasize the parts of the stacktrace that we think are actually relevant to the problem, and there's some discussion about mechanisms to mark functions as 'internal' so that they don't appear in stacktraces unless toggled to do so.


That's great to hear! I am sure there is a lot of room for improvement and that would go a long way! But it also seems to me that with the language as it is now, the compiler simply doesn't have enough information to provide good error messages. E.g. there is no way to express the idea that something needs to be iterable to the compiler. Or that you expect an argument to be a function of a certain type. I'd love to be wrong on this though. :)


I totally get your point, but that's still below the level of abstraction I'm talking about. Basically it boils down to: Python's main trick is metaprogramming. If your interprocedural optimization is a bottleneck, then write a framework which lets you put those pieces together under the hood.

> Furthermore, Julia has much more powerful metaprogramming facilities than Python does.

Until someone writes a way to import Julia code directly into Python, or autogenerate it from Python. Anyone who says "well then you're not really writing Python" is missing the whole point: writing Python is all about exactly this sort of tip-of-the-abstraction-iceberg trickery.

Sure, you can call Python from Julia as well. So it boils down to "which is the better glue language?" This isn't a competitive thing, it's just literally what Python is meant to do.

https://pyjulia.readthedocs.io/en/latest/usage.html


> I want a modern C.

As in, a minimalist language aesthetic, or fast, low-level, non-GC? Golang only has the aesthetics; Zig has both.


Problem is, as Python starts using Julia more and more due to its top-notch performance, many will eventually question why they even use Python, as they will realize Julia is the better language anyway.


Python is still vital and thriving, but I wouldn't be surprised if its market share is declining (slowly). As you say, it's dominant for data science... but until recently it and R were the only practical choices; now other languages (Julia, for example) are valid options. The development of alternative Jupyter (formerly IPython) kernels has accelerated adoption of other languages. Similarly, a few years back it seemed like Flask and Django were the vastly dominant frameworks for building APIs; now similar frameworks exist in tons of languages. None of this is to imply that Python is anywhere close to dead, but my personal opinion is that "peak Python" is behind us.


I want this to be true, by the way. As I mention in some other comment, I think Python is rather mediocre as an actual language for building software. Its appeal cannot be denied, but a majority of my peers would really like to use a language that has its tooling shit together.

Python is the only language I've had to deploy that makes me understand why anybody would want to use containers.


I love Python, and it's my first and most-used language, but I definitely agree. I want something that takes the best parts of Python, is compatible with most existing Python code, and has a much better tooling ecosystem and better options for static typing.

Spending a bit of time with Rust's tooling, static analysis, and "if it compiles it's pretty likely to work" (at least way more so than with Python) definitely emphasizes a lot of Python's flaws.


Have you tried writing Python as if types were mandatory? It takes a lot of discipline, but it's a totally different beast with strict mypy checking.


Yes, I've started to more in the past few weeks, and I've enabled stricter Pylance checking in VS Code. It's definitely giving me more Rust-like productivity and reliability improvements.


Maybe we should separate the evolution of languages from the evolution of package/versioning systems. They do tend to go hand in hand, but that has more to do with logistics and organizational issues than any technical necessity.


But like, R has a great packaging system, and it copied Perl, which is much older than Python. Meanwhile Python has a bunch of incompatible packaging systems which cause so much pain to so many programmers.

Did you know that pip won't even resolve dependencies, and will break your programs silently? I certainly didn't, because Python wasn't my first language, and therefore I assumed that the package manager would handle dependencies.

Like seriously, when a language designed by statisticians for statisticians (R) has a better packaging story than everyone's second favourite language, something has gone horribly wrong.


> R has a great packaging system, and it copied Perl

I almost used Perl as an example of Python's inverse - a crappy language wrapped in a good packaging system. For its time CPAN was pretty great. All of the things today that people are likely to hold up as being better drew inspiration from it.

> which is much older than Python

Perl 1987, Python 1989. So yes, it's older, but much older?


Fair, I realised that the difference wasn't that large after I posted.

I'm just so annoyed by Python's (lack of a) packaging system that it makes me prone to hyperbole. It's been frustrating me at work all week, and I suspect that it will be a low-grade annoyance for me in my career for the next few years, as I'm a Data Scientist and hence need to deal with Python a lot.


pip is a packaging system, and it ships out of the box with Python. It may not be the best one around, but there certainly isn't a lack of one.


Does it ensure that packages in the environment are consistent with one another? If not, I'm not sure what it does that curl won't do.


curl won't resolve dependencies.


If Google Trends is a good indicator, Ruby and Perl are dying; Python is still going strong.

https://trends.google.com/trends/explore?date=today%205-y&q=...


Computer languages look to me like a fashion industry. So personally I try (not always successfully) to ignore all of those discussions about perceived dis-/advantages. I pick what I personally like, or, when that's the case, what a client insists on using, and I don't give much of a hoot about the latest popular opinion on the subject.


Thematically related: A brief, incomplete, and mostly wrong history of programming languages (2009)

http://james-iry.blogspot.com/2009/05/brief-incomplete-and-m...


> The reason GHC didn't just turn on all flags by default is that many of them are mutually incompatible, so your individual .hs file has to pick a compatible set of language features it wants to work with.

This is profoundly untrue. It's not an opinion, it's simply wrong.

Other things said about Haskell, such as "memory leaks are difficult to debug", are the kind of thing you hear people repeat a lot but that I experienced very little of. Pretty much every language has flaws that are a bigger issue and cost more wasted time than that.


Anyone care to explain what

    template <size_t i, typename... Ts, typename CurTy>
    void recurseFillChildren(CurTy &E)
    {
      using PackTy = std::variant<Ts...>;
      using TyL = std::variant_alternative_t<i - 1, PackTy>;
      static_assert(std::is_same_v<CurTy, TyL>);
      using TyR = std::variant_alternative_t<i, PackTy>;

      for (i32 j = 0; j < E.NChildren; ++j)
      {
        E.Children.push_back(miniParser<TyR>());
        if constexpr (i + 1 < sizeof...(Ts))
          recurseFillChildren<i + 1, Ts...>(E.Children.back());
      }
    };
does, and why it can't be translated into Rust?


If you have a hierarchy of types

    struct A {vector<B> Children; int NChildren = 2; ...};
    struct B {vector<C> Children; int NChildren = 2; ...};
    struct C {vector<D> Children; int NChildren = 2; ...};
    struct D {...};
and a function template

    template<typename T> T miniParser() { return T{}; }
and you call it like

    A a;
    recurseFillChildren<1, A, B, C, D>(a);
It expands into nested loops that fill the "Children" vectors of each type with two default-constructed values of the next type down the hierarchy, until you get to D, at which point it does nothing.

The reason you can't implement this in Rust is that Rust generics (their "template-y" language feature) are missing a few capabilities relative to C++ templates. Namely,

1. They don't operate on numbers

2. They have no notion of collections of types

Because of this, they

1. Can't know that there is a collection of types

2. Can't extract the size of the collection of types

This comes up in numerical code, where in C++ it is easy to write e.g. geometry code that is generic in the number of spatial dimensions, but in Rust it is a pain in the ass.
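For the curious, a minimal sketch of that kind of dimension-generic geometry code in C++ (names are illustrative, not from the article):

    #include <array>
    #include <cmath>
    #include <cstddef>

    // Generic over the number of spatial dimensions N -- a number, not
    // a type, which is what Rust generics could not express at the time.
    template <std::size_t N>
    struct Point {
        std::array<double, N> coords;
    };

    template <std::size_t N>
    double distance(const Point<N> &a, const Point<N> &b) {
        double sum = 0.0;
        for (std::size_t i = 0; i < N; ++i) {
            double d = a.coords[i] - b.coords[i];
            sum += d * d;
        }
        return std::sqrt(sum);
    }

    int main() {
        Point<2> p{{0.0, 0.0}}, q{{3.0, 4.0}};
        return distance(p, q) == 5.0 ? 0 : 1;  // the same code serves Point<3>, Point<17>, ...
    }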


This is a "non-type template parameter," that is, it's generic over that size_t i. Rust cannot yet do this. This is "const generics" in Rust speak.


I used to know C++. But not sure what's happening here.


This is me too, and sometimes I feel like the language is going in two directions: the academic/scholarly path, where people show off unreadable C++ like this, and the business path, where competent professional C++ developers use a subset of the language (mandated by the corporate code base or their teams) and maybe don't write C++ this complicated.


I work in telecoms, and templates like this show up from time to time, sometimes for reasonable reasons, sometimes for 'optimisation' reasons which are not backed by figures (some programmers like to show off).


Templates, in my opinion, are often extremely hard to read, not unlike Perl in its time.

And I will always prefer more verbose code that I can read, understand and maintain.


It demonstrates a method of generating a family of types or functions that, until C++17, would have required Template Metaprogramming via partial specialization or template function overloading, but is now just programming.

The key line is "if constexpr", which is evaluated at compile time and terminates the recursion.
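For anyone who hasn't seen it, a smaller self-contained sketch of the same termination trick (illustrative, not from the article; needs C++17):

    #include <cstddef>
    #include <iostream>
    #include <tuple>

    // Walks a heterogeneous tuple. The recursive call is discarded at
    // compile time once I reaches the last element -- no partial
    // specialization or overload needed to stop the recursion.
    template <std::size_t I = 0, typename... Ts>
    void printAll(const std::tuple<Ts...> &t) {
        std::cout << std::get<I>(t) << '\n';
        if constexpr (I + 1 < sizeof...(Ts))
            printAll<I + 1>(t);
    }

    int main() {
        printAll(std::tuple{1, 2.5, "three"});
    }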


could a proc macro not do it?


Seems like some tree-roots are missing -- Algol??! PL/(I/M)? B(CPL)? Snobol? APL? Simula? Link between Smalltalk and C++?


Seems like most of the interesting history is missing, perhaps because it happened before the author was born.


I bought this from Amazon UK like 22 years ago. (Should have bought Amazon stock instead ..)

https://www.amazon.com/History-Programming-Languages-Thomas-...

It's one of those books I return to every few years. Recommended. Maybe the papers are available online somewhere by now?


The author seems to have restricted the entries to languages still taught and used, but missed COBOL and maybe C#. I can't really blame him, in either case.


He also missed PHP...


Visual Basic, Swift, and even Scratch also rank higher on usage lists than some of those that are on the chart. Chapel is likely to be more influential. The "limited to languages being taught or used" excuse is super thin.


Why should there be a link between C++ and Smalltalk?


In its creator's own words:

https://www.stroustrup.com/bs_faq.html#from-Smalltalk

> C++ got its Object-Oriented concepts from Smalltalk?

> No. C++ got the key notions of classes, derived classes, virtual functions (in other words, the notions of encapsulation, inheritance and polymorphism) from Simula just like Smalltalk did. In terms of family relationships, C++ and Smalltalk are siblings.


So a good reason to do without the link. If so, C++ and Smalltalk-80 would have to point to Simula 67. But it does not look like the author of the referenced article optimized for scientific accuracy anyway.


I think a lot of people get it mixed up with Objective-C (which is a derivative of sorts of Smalltalk).

And C++ derives much from Simula, which comes from Algol. And C derives much from Algol. And the link between C and C++ is kinda obvious!


Objective-C isn't so much a language as one language embedded in another -- C + ST inside square brackets...


The blazing fast speed of Smalltalk combined with the memory safety of C


Well, because OO - and if Simula isn't there, then that's where the link would be. Even if Simula did come first, ST is certainly the "OO root" that most people know, so I'm fine with Simula -> ST -> C++ as a reasonable path.


This is a widespread misconception. The first Smalltalk version in 1972 didn't even have inheritance. Although Kay is credited with the term object orientation, he understands it to mean something quite different from most people. It was Ingalls who first introduced into his version, Smalltalk-76, the properties generally referred to as object-oriented, which had been implemented in Simula as early as 1967. The influence of Smalltalk is generally overestimated.


Yes, most people know it because Americans are good at marketing themselves, and Norwegians, who made Simula, are pretty bad at marketing, because being humble and not blowing your own horn is kind of a big deal.

But I know at the University of Oslo there has been some annoyance that Simula’s place in programming history is so unknown while Smalltalk has this outsized role.

Smalltalk definitely inspired Objective-C, but C++ is clearly derived from Simula. As a fellow Scandinavian, Bjarne was well aware of Simula and an active user of it.


Bjarne Stroustrup learned most of what he knows about OO from Kristen Nygaard personally, as he was at Aarhus University at the same time as Nygaard was there. He'd sit and talk to Nygaard for hours, but noted in an interview with Lex Fridman that it was usually just Nygaard talking at him for hours and him soaking as much as possible up. So even "well aware" is a bit of an understatement.


In terms of design influence, Simula is a lot closer to C++, Java, and C# than Smalltalk is, so I don't think that would be accurate even so.

Every time you write "protected" or "virtual", that's Simula speaking.


Classes and "virtual" procedures were indeed already part of the original "Simula 67 Common Base Definition" in June 1967, but "protected" and "hidden" were added later (proposed here https://dl.acm.org/doi/abs/10.1145/956003.956009, and probably added to the Common Base Language in 1982).


Besides the fact that they have object orientation as a guiding paradigm? (Different though their faithfulness and implementation may be)


C++ and Smalltalk-80 have a common ancestor called Simula 67, but neither was derived from the other.


The problem with any of these directed-graph based "histories of programming languages" is that they ignore the fact that languages change. Sure, neither Haskell nor Ocaml are descendants of the other, but features in one have inspired features in the other countless times, making both languages better. The same goes for Ruby and Python, Java and many of the other languages on this chart and not on this chart, and more generally, all languages. Java introduced generics in 2004. Python introduced garbage collection in 2000.


The author addresses that further down on the page: "The diagram only shows influences in the original language design, not influences absorbed through the evolution of the language."

If you only are interested in "programming language design" this is not a bad idea. Changes and additions to a language often don't fit well into a language's overall design.


Indeed, current C++ is strongly influenced by the MLs.

But after 1985.
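std::variant is arguably the clearest case: it is essentially an ML sum type, with std::visit as a (clunkier) case analysis. A minimal sketch:

    #include <iostream>
    #include <variant>

    // The ML datatype "Circle of float | Rect of float * float", spelled in C++17.
    struct Circle { double r; };
    struct Rect   { double w, h; };
    using Shape = std::variant<Circle, Rect>;

    struct Area {
        double operator()(const Circle &c) const { return 3.14159265 * c.r * c.r; }
        double operator()(const Rect &r)   const { return r.w * r.h; }
    };

    int main() {
        Shape s = Rect{2.0, 3.0};
        std::cout << std::visit(Area{}, s) << '\n';  // prints: 6
    }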


I see a lot of users forgetting that it's an opinionated history. The author hasn't claimed to make this exhaustive.

Further, it's a bit counterproductive to just point out f"Hey, what about {some_lang}?"

Wikipedia has a decent resource to start, if you'd care to take a journey into computation notation history :)

https://en.wikipedia.org/wiki/History_of_programming_languag...

My own nitpick, to throw in the mix: the asterisk marks languages with ~sophisticated~ compilers, and I think that OCaml being on the chart without one is absolutely ridiculous and shameful.

But who cares. It's just some guy's opinion as a chart.


Labeling something as "opinionated" shouldn't exempt it from fact-checking and discussion. Unfortunately, the terms "opinionated" and often also "opinion" are used exactly like that. This makes the term a huge red flag for me.

I am perfectly fine if people take a controversial point of view, especially when entering a discussion. But then this should be the starting point for a discussion, not the ending point.


The asterisk is basically worthless on this chart. The very premise of a "Mature high-performance high-complexity compiler/runtime" contains 3 orthogonal categories. Common Lisp satisfies all three, and lacks an asterisk.


And Haskell, for that matter.


I’ve been learning OCaml for a few weeks. What makes the compiler sophisticated?


20+ years of some of the best researchers in the world?

It's been absolutely pummeled into stability by thousands of industrial use cases?

It's also like, STUPID fast. Try finding a large OCaml project, any will do, and see how fast it compiles.

Someone with more time and know-how could give a more technical testimony as to just exactly how badass OCaml actually is.


It is fast while having the best type inference capabilities I ever saw in a programming language: you can spend years in a static language without ever writing a type (my one exception is deserialization, which makes sense).


Why did Haskell wane?

IMO, it's partly the ecosystem. There's regular Haskell, Haskell Platform, Cabal, Stack. It ends up being an alphabet soup of possibilities, and it's not clear what is depending on what.

And then there's monads, which are stumbling blocks.

State is allegedly the root of all evil. But I think the argument needs to be more nuanced than that: yes, state is difficult to reason about globally, but it's usually not so bad locally in small functions.

I think that this is what Rust gets wrong. It goes down the "state is evil" road, but there's really not that much wrong with the judicious use of mutated values.

If you really want to avoid mutability, then you're really going to need monads, something which seems a bad fit for Rust (based on what I see on Hacker News).

At the end of the day: fuggit, just use C++. It's fast, you can pass in references to vars, and if you don't want them mutated, just make them const. That's the conclusion I drew a few years ago.
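Concretely, something like this trivial sketch, where const in the signature does the immutability work:

    #include <vector>

    // The signatures alone tell the caller which function may mutate.
    double sum(const std::vector<double> &v) {      // read-only view
        double s = 0.0;
        for (double x : v) s += x;
        return s;
    }

    void scale(std::vector<double> &v, double k) {  // mutation, opted into
        for (double &x : v) x *= k;
    }

    int main() {
        std::vector<double> v{1.0, 2.0, 3.0};
        scale(v, 2.0);
        return sum(v) == 12.0 ? 0 : 1;
    }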

As Alan Kay might put it, most new languages are in the "pink plane": they're incremental improvements (sometimes a step back) on old ideas, and sometimes just inferior rehashes of them. None of them are what he calls "blue plane": revolutionary new ideas.

Kay seems to admire Erlang, though. What it brings to the table is the notion that a program can crash, and recover from it. That might be a "blue plane" idea.

I also assume he thinks Scratch is a blue plane idea; even though strictly speaking graphical languages are not new.

When I looked at Ada a couple of weeks ago, I noticed that it had the notion of tasks as a language primitive. I wonder if he'd consider that sufficiently blue plane.


I think your view of Rust is quite off. Where does Rust take the view that "state is evil"? At its core, Rust is very imperative and procedural.

Global reasoning about state is hard, hence the assistance from the borrow checker. Locally mutating local state in small functions? Pass a reference to vars; if you want them to be mutated, just make them mut. It's also just as fast.

In a way, Rust does a better job of localizing/encapsulating mutation than the functional/monad world.


Waning is the natural course for any language. It takes a Miracle to avoid that fate. COBOL, FORTRAN, C, C++, Java, Javascript (and maybe Python?) each got a Miracle. Other languages didn't. Go and/or Rust might get one, but odds don't favor it.

Miracles are not awarded on merit, so when a language that got one is usable, it's like another miracle.


This isn’t “opinionated”, it’s biased and poorly researched.


It seems "opinionated" is frequently used as a synonym for "poorly researched" nowadays.


Maybe my color perception is off, but it looks like Python is considered a root language with no significant predecessors. Which doesn't seem correct - Python wasn't so novel that it introduced new programming paradigms.

Also, no asterisk (modern, mature compiler & runtime) for OCaml?


> Maybe my color perception is off

It's interesting to see this article on the front page at the same time as the Color Blindness article [1] discussing the difficulty of distinguishing tiny color spots. I'm not color blind, but I needed to zoom in on the first figure to see the difference between the red dot and the other red dot. Not to mention the black dot, the dark green dot that looks black at a distance, and the black dot that's actually an asterisk. Now I understand why these are so hard to distinguish.

[1] https://news.ycombinator.com/item?id=24672463


The arrows showed descent from C and Perl.


But the color (black) in the legend indicates no significant predecessors.


That only applies to the bolded languages. A bit confusing.


Listening to OCaml users, you would never get any impression that its toolchain was mature.


The author marked Clojure as a "dying language". I'd be interested to find out the metric by which this is true. In my experience, it is very much alive and thriving.


Can you elaborate on your experience?

I have great respect for Rich Hickey and I recently saw a cool talk about a financial startup using Clojure in production [1]. I have kept track of the language for some time but have never had the chance to use it in any serious capacity.

[1] https://www.youtube.com/watch?v=fnediEWRuyI


I think it’s a great language with a good community, but some of the opposition to static types from Rich Hickey at the top is silly, and I also find it genuinely harder to use languages which aren't gaining wider traction after being around for over ten years.


Is the language community net growing or shrinking though? A language can be very alive and still not be experiencing tremendous growth.


Lazarus/Free Pascal is another example of a language still very alive, but not growing like crazy.


I first heard of it just now.


"FORTRAN: Unlikely to influence anything in the future due to remarkably poor design". Stops reading there


"FORTRAN: Unlikely to influence anything in the future due to remarkably poor design" I guess the author of this sentence likely does not know anything about Fortran. They do not even know that the language has NOT been called FORTRAN for more than 3 decades. They do not even know about the latest release of Fortran in 2018. They are at best showcasing your ignorance and bias, by not providing any arguments as to why people should take their comments and advice seriously.


That made me chuckle, I still work on Fortran applications. Some were written in the 70s/80s, but maybe that just means the industries I've worked in are a bit _inflexible_?


What lessons would you say are left to be drawn from Fortran?


Those lessons that C++20 is still learning from Fortran


There is a severe lack of Pascal


Seems like a good place to share a link to this poster, which shows a comprehensive family tree of programming languages through 2003 or so.

https://www.cs.toronto.edu/~gpenn/csc324/PLhistory.pdf

O'Reilly's History of Programming Languages


Interesting thoughts. I wanted the columns in the graph to mean something, perhaps as a spectrum from functional to object-oriented, but that wasn't the author's intent.


Bourne Shell is the worst language people actually need to learn. At least PHP is a choice.


POSIX makefiles are another.


HOPL by Diarmuid Pigott (https://hopl.info/home.prx) is a pretty thorough picture of a vast number of programming languages (8945 of them).


I know it's an opinionated history, but you may want to draw an arrow from LISP or Common Lisp towards C++, as a good chunk of STL is based on Lisp concepts (and I'm pretty sure I've seen this mentioned explicitly in one of Meyers' books).


C++ as a language has very little influence from Lisp.

I would draw an arrow from Lisp to ML, since FP, GC, list processing etc. came from Lisp - ML was even implemented in Lisp and used the runtime.


Ruby is dying and Perl is not?? I guess this is "really" opinionated...


For that matter, PHP is conspicuously absent, for all I'm no fan.


If PHP is included, it should be colored red ("Unlikely to influence anything in the future, due to remarkably poor design") as well... If the author already classifies Go, JavaScript, or even time-tested FORTRAN and Java into this category, it's only fair to include PHP. On the other hand, contrary to popular belief, the author doesn't consider Perl to be in this category? Anyway, interesting opinion.


Seems to me that Java is likely to influence things in the future, if for no other reason than because people seem to like creating languages that run in the JVM, and the libraries (if nothing else) then influence the new language.


One of those would need to survive. Kotlin could, conceivably.


Whilst I've worked with older PHP, which was indeed awful, the modern stuff is quite OK. PHP has drifted into a decent design recently. I hope that any future language designer will take PHP as a case study of incremental improvement.


I wonder where BASIC would fit into this chart? Anyone care to enlighten an old basic programmer? It was my first language maybe 30+ years ago when I was a kid...


(and I've hidden the shame ever since lol)

Assembly was cool back in the day Z80, 68000... then C, but I felt too stupid to learn that stuff when I was a kid. Ugh. ;-)


It would be heavily influenced by Fortran, but I don't think it heavily influenced anything else.


Nothing but a flamewar tinder. :D


Fortran has already left its mark on several major languages, including C and Python. The nice concept of modules that you see in Python (and that C++ has only just decided to add to the language in the year 2020) has existed in Fortran ever since free-format Fortran came into existence, decades ago.
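For comparison, a minimal two-file sketch of the C++20 spelling of the same idea (compiler support was still partial as of 2020):

    // math.cppm
    export module math;
    export int square(int x) { return x * x; }

    // main.cpp -- consumes the module with no textual inclusion
    import math;
    int main() { return square(3) == 9 ? 0 : 1; }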


Which functional programming languages have managed to build a solid user base?

What language would you recommend to someone trying to get into FP? I was considering Haskell but don't know enough about the ecosystem to make an educated choice for the language.


A fun coffee table book on a bunch of different programming languages: https://hellobook.io/


This was an interesting talk on languages https://youtu.be/QyJZzq0v7Z4


Ruby has influenced https://crystal-lang.org/.


Opinionated or not, no Algol?


In a recent talk, Robert Martin summarizes the history of programming languages for about 35 minutes, starting here:

https://www.youtube.com/watch?v=ya1xDCCMh7g#t=2m47s


Cool video, I like his rundown of the major programming paradigms. I'd love to show this to some beginners I know - though perhaps with some light explanation of some of the low-level jargon he uses, as the bootcamps and tutorials out there these days don't really explain things from first principles like that anymore.


One takeaway is the only sane choices for a new, large project where you expect a mature ecosystem and performance are C, C++, Java, Erlang, and Javascript (but probably not Javascript).


Some thoughts.

Fortran (especially modern Fortran) is quite well designed. PHP seems to be missing.


I came prepared for a faddish take on the froth of momentarily popular languages, and found instead a forward-looking, grounded assessment.

I hope Rust doesn't fade out, but honestly cannot expect otherwise. C++ has long needed serious competition, but has not had any.


"FORTRAN: Unlikely to influence anything in the future due to remarkably poor design" I guess the author of this sentence likely does not know anything about Fortran. They do not even know that the language has NOT been called FORTRAN for more than 3 decades. They do not even know about the latest release of Fortran in 2018. They are at best showcasing your ignorance and bias, by not providing any arguments as to why people should take their comments and advice seriously.



