Git-annex, darcs, postgREST. I might have cheated and used google, but those are all projects I've heard of before. And it's not really fair to exclude big projects for haskell, but not for go.
I explicitly excluded the big Go projects of Docker/K8s.
I think I've heard the name darcs, but have no idea what it is or does. I may have heard of PostgREST, but I think the whole category of a thin layer over the DB is dumb, so I refuse to learn about it. :-) (If you want to talk to your DB, use the native SQL drivers. Don't reinvent the same thing as REST or GraphQL for no reason. Backend systems should connect SQL to SQL. If you want a browser to be able to talk to your DB, then you will need auth, and you'll want bundling of calls and object translation, and suddenly the thin translation layer isn't thin anymore.)
Anyway, the point is that Haskell has been around for a long time and is very popular with HN/Reddit users, but unlike Go and Rust, it has produced a very small amount of OSS.
Darcs is a distributed version control system. It predates git.
I tried using darcs for a bit. But the designers were a bit too ambitious: they had an elegant concept to solve all rebases automatically, at least in principle. Alas, the early implementations sometimes ran into corner cases with exponential runtime, which in practice meant it hung forever, for all a user could tell.
As far as I can recall, that behaviour is fixed now. But they missed their window of opportunity, which closed when other DVCSs, like git, became really popular.
Interestingly, git had the opposite philosophy: they explicitly only resolve simple conflicts in a simple way, and bubble up anything slightly more complicated to the user.
About SQL: Haskell users wouldn't want to use SQL directly. They want their compiler to yell at them when they get their database interactions wrong.
So it's not so much that they want a layer on top of SQL that hides things; rather, they want support for forbidding nonsensical SQL queries at compile time.
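A toy sketch of that idea (this is not any real library, just a hypothetical typed query fragment): with a GADT, the type parameter tracks what each fragment evaluates to, so an ill-typed query never reaches the database.

```haskell
{-# LANGUAGE GADTs #-}

-- Hypothetical typed expression language (not a real library):
-- the type parameter records what each fragment evaluates to.
data Expr a where
  IntLit  :: Int -> Expr Int
  BoolLit :: Bool -> Expr Bool
  Eq      :: Expr Int -> Expr Int -> Expr Bool
  And     :: Expr Bool -> Expr Bool -> Expr Bool

-- Render a well-typed expression to SQL text.
render :: Expr a -> String
render (IntLit n)  = show n
render (BoolLit b) = if b then "TRUE" else "FALSE"
render (Eq a b)    = render a ++ " = " ++ render b
render (And a b)   = "(" ++ render a ++ " AND " ++ render b ++ ")"

main :: IO ()
main = putStrLn (render (And (Eq (IntLit 1) (IntLit 1)) (BoolLit True)))

-- A nonsensical query such as `And (IntLit 1) (BoolLit True)`
-- is rejected by the compiler, not discovered at runtime.
```

Real Haskell libraries in this space (e.g. opaleye or esqueleto) push this much further, but the principle is the same: the type checker yells before the database ever sees the query.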
Interestingly, at one of my previous jobs we had 'relational object mappers' in our Haskell code. What that means is that we used relations as data structures inside our Haskell code, but had to interact with a mostly object oriented world on the outside.
Relations make fabulous data structures for expressing business logic. Take the leap in expressivity from, e.g., dumb C to Python's ad-hoc dicts and tuples, then imagine making a similar step again, and you arrive at relations.
Especially the various kinds of joins are good at expressing many business concerns, but projections and maps etc. as well. Basically, everything that makes SQL useful, but embedded in a better language and just as a data structure.
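As a rough illustration (the tables and names here are made up), a relation can be as simple as a list of tuples, and an inner join is just a list comprehension:

```haskell
-- Hypothetical data: relations as plain lists of tuples.
customers :: [(Int, String)]       -- (customerId, name)
customers = [(1, "Ada"), (2, "Grace")]

orders :: [(Int, Int, String)]     -- (orderId, customerId, item)
orders = [(10, 1, "book"), (11, 1, "pen"), (12, 2, "lamp")]

-- An inner join on customerId, written as a list comprehension;
-- structurally the same as SQL's JOIN ... ON, but in-language.
ordersWithNames :: [(String, String)]
ordersWithNames =
  [ (name, item)
  | (cid, name)     <- customers
  , (_, cid', item) <- orders
  , cid == cid'
  ]

main :: IO ()
main = mapM_ print ordersWithNames
```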
> Anyway, the point is that Haskell has been around for a long time and is very popular with HN/Reddit users, but unlike Go and Rust, it has produced a very small amount of OSS.
The ML family of languages that Haskell is a part of has been around for even longer, and many of the beloved features of Haskell stem from that older legacy.
I don't have high hopes of the ML family becoming more mainstream. But I am really glad to see many advances born in the land of functional programming, and ML or Haskell in particular, making it out into the wider world.
The poster child is garbage collection, which was invented for Lisp in the first place.
(You can do functional programming without garbage collection, but it requires much more finesse and understanding than they had when Lisp was young. And even then, garbage collection is probably still the right trade-off when your program is not resource constrained.)
Garbage collection is pretty much the default for new languages these days. People expect a good rationale for any deviation.
More recently we have seen first class functions, closures and lambdas make it out into the wider world. Even Java and C++ have picked up some of those.
First-class functions relate to a wider theme of 'first class' elements of your programming language. I remember that e.g. in Perl you had to jump through extra hoops to make an array of arrays. That's because arrays were not, by default, treated the same as any other value.
I think Python did a lot for the mainstream here: Python is pretty good at letting you assign many of its constructs (like classes or functions or ints or tuples etc) to variables and pass them around just like any other value. In the understanding of the Python community, they see that just as good OOP, of course.
Combinators like map and filter have become popular in mainstream languages.
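In Haskell these combinators are just ordinary functions that compose; a trivial example:

```haskell
-- map and filter as ordinary, composable functions:
-- keep the even numbers, then scale them up.
example :: [Int]
example = map (* 10) (filter even [1 .. 5])

main :: IO ()
main = print example  -- [20,40]
```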
Algebraic data types have made it to a few languages. With that comes structural pattern matching, and the compiler complaining when you miss a case.
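A minimal Haskell sketch of both (hypothetical `Shape` type): an algebraic data type with pattern matching, where leaving out a constructor's case triggers GHC's incomplete-pattern warning.

```haskell
-- A small ADT; pattern matching must cover every constructor.
data Shape
  = Circle Double
  | Rect Double Double

area :: Shape -> Double
area (Circle r) = pi * r * r
area (Rect w h) = w * h
-- If a new constructor were added and a case forgotten here,
-- GHC (with -Wincomplete-patterns, part of -Wall) would
-- complain at compile time.

main :: IO ()
main = print (area (Rect 3 4))  -- 12.0
```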
Tuples are pretty much expected in any new language these days.
The very article we are commenting on talks about generics.
Immutable data types are something every new language is supposed to have thought about. Even C++ has const and Java has final. Go's designers were asked to justify their very limited support for immutability.
Many people go so far as suggesting immutability should be the default in new languages. (And you could very well imagine a dialect of C++ where 'const' is implied, and you need 'mutable' everywhere, if you want to override that.)
There's quite a few more examples. Of course, correlation is not causation: just because something showed up in functional programming languages before it showed up in the mainstream doesn't mean the mainstream actually got it from FP.
---
In summary, you are right that Haskell has not been used in OSS or commercial products as much as eg Go, but I am just happy that the rest of the world is moving in the Right Direction.
I’ll agree with that: while Haskell/FP hasn't produced a ton of OSS, it has had a huge, positive impact on other languages. Eg Rust would never exist without Haskell being around first.