
The article mentions “Simple is better than complex”, but not the next line of the Zen of Python, which I think tells us a lot about that language’s philosophy: “Complex is better than complicated”.

Looking closely, that line says “(not simple) is better than (not easy)”, or more clearly, “easy is better than simple”. Python definitely lives up to this - it’s easy to get started with, but if you look deeply it’s a very complex language.

Go’s philosophy is probably the opposite - that simple is better than easy. This is similar to the philosophy of Clojure, as explained by Rich Hickey in “Simple Made Easy” [0].

[0] https://www.infoq.com/presentations/Simple-Made-Easy/




I just read Eric's quite recent update on CSP: https://gitlab.com/esr/reposurgeon/-/blob/master/GoNotes.ado...

"Translation from Python, which is a dreadful language to try to do concurrency in due to its global interpreter lock, really brought home to me how unobtrusively brilliant the Go implementation of Communicating Sequential Processes is. The primitives are right and the integration with the rest of the language is wonderfully seamless. The Go port of reposurgeon got some very large speedups at an incremental-complexity cost that I found to be astonishingly low. I am impressed both by the power of the CSP part of the design and the extreme simplicity and non-fussiness of the interface it presents. I hope it will become a model for how concurrency is managed in future languages."


> dreadful language to try to do concurrency in due to its global interpreter lock

Don't take it too seriously. The GIL has its issues and it would be nice to see it gone, but "dreadful" is an overstatement. Python wouldn't be so widely used as a back-end language otherwise. Concurrency is not parallelism.

On "it [Go] will become a model for how concurrency is managed in future languages." -- it is not as clear cut as it appears at the first glance: "Notes on structured concurrency, or: Go statement considered harmful" https://vorpus.org/blog/notes-on-structured-concurrency-or-g...


Dreadful may be a little strong, but any time I've tried to implement something non-trivial with asyncio it becomes pretty obtuse. (imo)


I’ve found asyncio to be a very simple model to understand, using the aio libs: aiohttp, aiopg, aiobotocore, etc.

Basically just slap “async” or “await” in front of everything and understand that any time a network connection is being accessed, that coroutine will yield control back to the event loop.

You just have to pay attention to where something might block the event loop for any significant amount of time: heavy calculation or lengthy file IO.

You can spawn a multitude of async tasks on startup and get super basic “scheduling” by using asyncio.sleep with some jitter.

The only time I have seen the performance limits of a naive asyncio app reached was in an auth app that sat in front of every API request for the whole company, and even then it was an obscure DB connection pool management issue deep in psycopg2.


Goroutines (and the `go` keyword) are the primitive, just like async is the primitive for Python. Something like golang.org/x/sync/errgroup gives Go the same kind of "nursery" concept and can be leveraged almost identically (modulo Go not having the "with" construct, but defer can play the same role).
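
A rough sketch of what that can look like (errgroup's API is real; the fetch function below is just a placeholder):

    package main

    import (
        "context"
        "fmt"

        "golang.org/x/sync/errgroup"
    )

    func fetch(ctx context.Context, url string) error { return nil } // placeholder worker

    func main() {
        g, ctx := errgroup.WithContext(context.Background())
        urls := []string{"https://example.com/a", "https://example.com/b"}
        for _, u := range urls {
            u := u // capture per iteration (pre-Go 1.22 idiom)
            g.Go(func() error {
                // any error here cancels ctx for the sibling goroutines
                return fetch(ctx, u)
            })
        }
        // Wait blocks until every child has finished, much like leaving a nursery scope.
        if err := g.Wait(); err != nil {
            fmt.Println("error:", err)
        }
    }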


There is no doubt that a "nursery"-like construction can be implemented in Go, in the same way that any language with "goto" can implement structured loops. The point is in constraining what can be done.

There is a trade-off: "goto" is powerful but likely to lead to spaghetti code. A "nursery" introduces constraints but makes it easier to deal with error handling/cleanup and back-pressure issues such as https://lucumr.pocoo.org/2020/1/1/async-pressure/


> Go community’s practice for grounding language enhancement requests not in it-would-be-nice-to-have abstractions but rather in a description of real-world problems.

jfc, the arrogance of this asshole. Seems like a decent fit for Go though, considering that language’s history of ignoring PL research..

I mean, Go is awesome for containers, and it’s awesome if you have a lot of junior devs and a decent amount of churn.

But the amount of anti-intellectualism by big shots in the community is seriously depressing.


ESR might be a lot of things, but he’s not a junior dev by any standard. And experience reports are pretty consistent across the experience gradient—Go is useful for solving real world problems, and often the very abstract languages fail to do so (often especially those languages beloved by intellectuals). You can challenge the qualifications of the reporters with respect to their own experiences if you like, but that seems like a silly thing to do.

If PLT can’t produce languages that practitioners find useful, then PLT is at fault, not practitioners.

EDIT: Rereading my last paragraph, "PLT is at fault" sounds harsher than I intended it to. Mostly it just sounds like PLT is based on a model of software development practice that doesn't fit well with the real world. The model performs poorly, but PLT supporters like the parent commenter are (implicitly or explicitly) blaming contemporary software development practice for the mismatch.


> often especially those languages beloved by intellectuals

*self-proclaimed intellectuals

To be fair, it also took me years to realize how programming as taught in academia is out of touch with reality.


I never said esr was a junior dev, and he’s obviously not :)

>“Go is useful for solving real world problems”

People repeat this like a mantra (you also hear similar from Rich Hickey’s most fervent acolytes in the Clojure community), and I can’t for the life of me understand what it means...

I mean, fucking BASIC can solve real-world problems... I’ve spent ten years writing Java and PHP to great success, but I'm still happy to never write in those languages again.

I even adore Elm, despite its annoying lack of type classes, but I respect Evan’s goal of avoiding complexity in the language. That argument holds up a lot better than Rob Pike’s argument on types:

“ Early in the rollout of Go I was told by someone that he could not imagine working in a language without generic types. As I have reported elsewhere, I found that an odd remark.

To be fair he was probably saying in his own way that he really liked what the STL does for him in C++. For the purpose of argument, though, let's take his claim at face value.

What it says is that he finds writing containers like lists of ints and maps of strings an unbearable burden. I find that an odd claim. I spend very little of my programming time struggling with those issues, even in languages without generic types.

But more important, what it says is that types are the way to lift that burden. Types. Not polymorphic functions or language primitives or helpers of other kinds, but types.

That's the detail that sticks with me.

Programmers who come to Go from C++ and Java miss the idea of programming with types, particularly inheritance and subclassing and all that. Perhaps I'm a philistine about types but I've never found that model particularly expressive.

My late friend Alain Fournier once told me that he considered the lowest form of academic work to be taxonomy. And you know what? Type hierarchies are just taxonomy. You need to decide what piece goes in what box, every type's parent, whether A inherits from B or B from A. Is a sortable array an array that sorts or a sorter represented by an array? If you believe that types address all design issues you must make that decision.

I believe that's a preposterous way to think about programming. What matters isn't the ancestor relations between things but what they can do for you.”

It’s just your average, obvious complaint about the inflexibility of class hierarchies in OOP, with a slight misdirection at the beginning when he mentions generic types (aka parametric polymorphism) but for some reason that’s an argument against types?! He mentions polymorphic functions, as if they can’t be typed???

I mean, I made the same mistake after three semesters of Java at uni, but one semester of C/C++/Python made me realize there was more to programming, and I eventually discovered type theory, which makes Rob Pike’s claims seem odd at best.

For me personally (and thus anecdotally) PLT has been a boon in most aspects, even though I have to deal with imperative or object-oriented languages from time to time. it’s just such a drag...


> I can’t for the life of me understand what it means...

It means Go performs well on real world projects. People feel productive, the language, tooling, and ecosystem get out of the way. You (and most PLT advocates I've encountered) seem to be evaluating languages on their inputs/features (presumably because you believe axiomatically that certain features--e.g., type systems--have a huge effect on the success or failure of a given software project) while the "useful for real world problems" view is about evaluating languages on their outputs. The latter view is harder to measure objectively, but it accounts for everything (e.g., syntax, type system, tooling, performance, ecosystem, etc) in correct proportion (no axiomatic beliefs).

Many PLT proponents generally seem to struggle with the notion that languages are successful when their model predicts that they shouldn't be. For example, many PLT proponents believe type systems strongly predict the success of a language, yet languages with very sophisticated type systems which are much admired by PLT proponents do poorly in the real world and languages with very flat-footed type systems (e.g., Go) do relatively well.

Either the qualitative data about these languages are wrong (e.g., contrary to the qualitative data, Haskell actually makes for more productive software development on balance than Go), or these PLT proponents' whitebox model is wrong. My money is on the qualitative data.


> You (and most PLT advocates I've encountered) seem to be evaluating languages on their inputs/features (presumably because you believe axiomatically that certain features--e.g., type systems--have a huge effect on the success or failure of a given software project) while the "useful for real world problems" view is about evaluating languages on their outputs.

I don’t, so please keep your assumptions to yourself and don’t put words in my mouth.

> in correct proportion (no axiomatic beliefs).

What is this based on?


> I don’t, so please keep your assumptions to yourself and don’t put words in my mouth.

I'm hardly putting words in your mouth. You were expressing more-or-less exactly this sentiment in your previous post.

> What is this based on?

It follows by definition of output-based or blackbox evaluation. Evaluating the output of a system implies that you are evaluating inputs in proportion to their contribution to the output.


> I'm hardly putting words in your mouth. You were expressing more-or-less exactly this sentiment in your previous post.

Trust me, I wasn’t.

> It follows by definition of output-based or blackbox evaluation. Evaluating the output of a system implies that you are evaluating inputs in proportion to their contribution to the output.

I like this! It’s like pure functions/total programming, only not rigorously defined in the slightest.

It’s not an answer to my question though: HOW do you know that the results of your output/blackbox testing are correct?


Don't forget this absolute banger:

> Obviously Go chose a different path. Go programmers believe that robust programs are composed from pieces that handle the failure cases before they handle the happy path.

A function which only returns an error can have its result ignored without any warning.
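
A minimal illustration (a sketch; `save` is a made-up function):

    package main

    import (
        "errors"
        "fmt"
    )

    func save() error { return errors.New("disk full") }

    func main() {
        save() // compiles and runs with no warning; the returned error is silently discarded
        fmt.Println("done")
    }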


> A function which only returns an error can have its result ignored without any warning.

It should perhaps be an error to not assign an error to a variable. Internally, Google has linters that enforce this.


I don’t think that paragraph was referencing compiler guarantees.


It should refer to them or address them.

Since, given this, Go is no better than a language with unchecked exceptions that nobody handles...


There are other reasons too! I’ve written the following:

    if v, err = func(); err != nil {
        nil, err
    }
… and then went on to use `v`. Thanks to `v`’s zero value being legitimate (and not nil like a pointer would be), the program continues on as if everything is okay. In case you didn’t catch it, I forgot the `return`.
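
For the record, a variant of the same footgun that does compile (a sketch, with strconv.Atoi standing in for the original call):

    package main

    import (
        "fmt"
        "log"
        "strconv"
    )

    func main() {
        n, err := strconv.Atoi("not a number")
        if err != nil {
            log.Printf("bad input: %v", err)
            // forgot the `return` (or os.Exit) here
        }
        // Execution falls through; n is 0, a perfectly legitimate int rather
        // than a nil that would crash, so the mistake stays hidden.
        fmt.Println("doubled:", n*2)
    }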

Rust takes a much better approach with Result, where the return value is either `Ok(v)` or `Err(e)`, and there’s no way to access a meaningless value for the other possibility.


Yeah, and this is such a simple change (compiler-wise) and a far stronger guarantee that I don't see why Go didn't implement it...

At least then errors as return values would be solid.

Of course, now they make programmers do the whole extra error-wrapping thing from 1.13 to pass "richer" errors...


Eh. It’s verbose but I like it. It makes me think about the code a bit when I have to write a descriptive error wrap. Kind of annoying I guess... I haven’t written ultra large go codebases though so ymmv.


>It’s verbose but I like it. It makes me think about the code a bit when I have to write a descriptive error wrap.

Having the compiler force you, as is my suggestion, would make you think even more -- or rather, it wouldn't let you skip or miss the error check without thinking.


Your example doesn't compile: https://play.golang.org/p/KQwXqTZHSPF

Go doesn't allow values to just be referenced without having some use, e.g., JavaScript's `"use strict";` hack could not be done.

In general, I have never seen a bug caused by accidentally ignoring an error. It's a theoretical concern, but not a real world problem.


Don’t know what to tell you, I have personally made this mistake and not had this caught by the compiler. I haven’t used go in several years at this point, so it’s entirely possible this is a newly-caught scenario by the compiler.

Regardless, the fundamental point stands. Using tuples to return “meaningless” values alongside errors allows developers to mistakenly use those meaningless values.


I do wish Go would adopt sum types, but in practice errors like you describe are vanishingly rare. It’s mostly a theoretical problem.


Any serious Go shop would have errcheck as one of the linters in CI.


Apparently linters are bad, and that’s why Go literally refuses to compile if you have unused imports. But crap like this? No problem, flies right through.


Linters aren't bad (Go basically has one in `go vet`, which checks for all kinds of "this is probably wrong" patterns, like the common "closing over a loop variable" mistake).

Warnings are bad, specifically when the warning is unambiguous (importing a package you aren't using is always wrong, though it makes debugging frustrating at times). The idea is that warnings that don't "stop" the build generally get ignored. Build most non-trivial C++ projects and count how many warnings flow past the top of your screen for an example of what they were trying to prevent.
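
The loop-variable case referred to above looks roughly like this (a sketch; note that Go 1.22 later made loop variables per-iteration, which removes this particular trap):

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var wg sync.WaitGroup
        for _, s := range []string{"a", "b", "c"} {
            wg.Add(1)
            go func() {
                defer wg.Done()
                // Before Go 1.22 every goroutine shares the same `s`, so this
                // tends to print "c" three times; it's the sort of thing
                // `go vet`'s loopclosure check flags.
                fmt.Println(s)
            }()
        }
        wg.Wait()
    }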


What drives people crazy about Go is the laser-like focus of the designers on real world problems over theoretical problems.

Theoretical problem: Someone might mutate a variable intended to be constant.

Go designers: Then put a comment saying not to do that.

Real problem: People ignore compiler warnings.

Go designers: Then eliminate warnings.

Real problem: Exceptions can happen anywhere and often go unchecked.

Go designers: Then call exceptions "panics" and encourage people not to use them.

Theoretical problem: Someone might ignore an error return value.

Go designers: Let paranoid people write linters.

Etc. etc.


Yes, laser focus on real-world problems like unused variables (don't matter) or unused imports (matter even less).

> Theoretical problem: Someone might mutate a variable intended to be constant.

> Go designers: Then put a comment saying not to do that.

One can only wonder why they even bothered writing a compiler when comments can solve it all.

> Real problem: Exceptions can happen anywhere and often go unchecked.

> Go designers: Then call exceptions "panics" and encourage people not to use them.

> Theoretical problem: Someone might ignore an error return value.

> Go designers: Let paranoid people write linters.

Because that way it's even easier to ignore than exceptions, and that's… good apparently?

Also, they created `append`, where not using the return value just right (with no help from the compiler, but that's OK because comments are where it's at) doesn't just create a bug in your program, it can create two or more. What a relentlessly efficient focus on real-world problems.
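
For anyone who hasn't been bitten by it, the `append` trap being alluded to looks roughly like this (a sketch):

    package main

    import "fmt"

    func main() {
        base := make([]int, 0, 4) // length 0, capacity 4
        a := append(base, 1)      // fits in base's spare capacity, so it reuses the backing array
        b := append(base, 2)      // reuses the same backing array and overwrites a[0]
        fmt.Println(a, b)         // prints "[2] [2]" — two corrupted slices from one misuse
    }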


I can emphatically confirm that this is not what annoys me about Go; what does annoy me are the real-world issues I ran into through multiple pieces of production software developed with multiple teams, with skill levels ranging from intern to senior.


Can you link to a write up? I’d like to read what went wrong.


It’s been at least three years so it’s difficult to do a real write-up. In a lot of ways it was death by a thousand cuts. But some things off the top of my head:

Having to rewrite or copy/paste data structures for different types, given the lack of generics. As I understand it, even Google now has tools that generate Go source from “generic” templates. This is absurd.

Defer to free resources (e.g., closing files) is a terrible construct because it requires you to remember to do it. You have lambdas! Use them so that the APIs can automatically free resources for you, like Ruby and Rust do. It’s insanely hard to debug these kinds of issues, because you run out of file descriptors and now you have to audit every open to ensure a matching close.
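
A sketch of the closure style I mean (withFile is a hypothetical helper, not a stdlib API):

    package main

    import (
        "fmt"
        "os"
    )

    // withFile opens the file, hands it to the callback, and guarantees the
    // close happens exactly once, so callers cannot forget it.
    func withFile(name string, fn func(*os.File) error) error {
        f, err := os.Open(name)
        if err != nil {
            return err
        }
        defer f.Close()
        return fn(f)
    }

    func main() {
        err := withFile("config.txt", func(f *os.File) error {
            buf := make([]byte, 64)
            n, err := f.Read(buf)
            if err != nil {
                return err
            }
            fmt.Printf("read %d bytes\n", n)
            return nil
        })
        if err != nil {
            fmt.Println("error:", err)
        }
    }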

Casting to interface{}. The type system is so anemic that you have to resort to switching over this, and now you lose type safety. Combine this with the compiler not caring about exhaustive switch statements, and with interfaces being implemented implicitly, and it’s a minefield for bugs.
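
Roughly this kind of thing (a contrived sketch):

    package main

    import "fmt"

    func describe(v interface{}) string {
        switch x := v.(type) {
        case int:
            return fmt.Sprintf("int %d", x)
        case string:
            return fmt.Sprintf("string %q", x)
        default:
            // Nothing forces this switch to be exhaustive; adding a new type
            // to the system silently falls through here at runtime.
            return "unknown"
        }
    }

    func main() {
        fmt.Println(describe(42), describe("hi"), describe(3.14))
    }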

I literally had a Go maintainer waffle on adding support to calculate SSH key fingerprints because “users can just read the RFC and do it themselves if needed”. This is an indefensible perspective on software development.

Despite “making concurrency easy”, having to implement concurrency patterns by hand for your different use-cases is nuts. I have lots of feelings here, most of which are summed up by https://gist.github.com/kachayev/21e7fe149bc5ae0bd878

Tuple returns feel like they were bolted onto the language. If I want to write a function that does nothing more than double the return value of a function that might error (insert your own trivial manipulation here), I have to pull out the boilerplate error-handling stanza when all I want to do is pass errors up the stack.
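
The boilerplate in question, even for a trivial "double the result" wrapper (a sketch; parse stands in for any function that can fail):

    package main

    import (
        "fmt"
        "strconv"
    )

    // parse stands in for any function that can fail.
    func parse(s string) (int, error) { return strconv.Atoi(s) }

    // doubleOf does nothing but double parse's result, yet still has to spell
    // out the full error-propagation stanza by hand.
    func doubleOf(s string) (int, error) {
        n, err := parse(s)
        if err != nil {
            return 0, err
        }
        return n * 2, nil
    }

    func main() {
        fmt.Println(doubleOf("21"))
    }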

This is the 5% that I remembered off the top of my head years later. All in all, the design of go as a “simple” language just means that my code has to be more complex.


Interestingly, the one time I introduced someone to Go without really "realizing" it (during a coding interview where I got to pick the language), his first comment was actually that he liked how explicit the error return was (strconv.Atoi, to be specific). That pretty much sums it all up for me: `if err != nil` seems like annoying boilerplate, but when you see stuff that's not just doing `return err` inside that conditional, you realize that it can actually be a benefit.
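
Something like this, where the error branch carries real logic instead of just `return err` (a contrived sketch):

    package main

    import (
        "fmt"
        "strconv"
    )

    // parsePort falls back to a default instead of blindly propagating the error.
    func parsePort(raw string) int {
        p, err := strconv.Atoi(raw)
        if err != nil || p <= 0 || p > 65535 {
            return 8080 // explicit, visible handling rather than a hidden exception path
        }
        return p
    }

    func main() {
        fmt.Println(parsePort("9090"), parsePort("not-a-port"))
    }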


It would be nice to have “elegance” brought into the picture too. Code is an art as well!


Unless you're building a user-facing frontend, please keep art out of your code. Just as a bridge's internal concrete structure doesn't need decoration, neither does your backend's plumbing. Writing elegant and artful code that is hard to understand and debug does not make you a clever developer. Keep that stuff for the demo-scene or 99-bottles-of-beer. Go is a language for engineering software, not crafting.


This seems to have totally missed the first half of the comment to which you replied: _elegance_ certainly has a place in programming, just as it does in mathematics and many other disciplines. Things that are elegant are often difficult to conceive but easy to understand, being simpler solutions to a problem than something inelegant.


I have seen uncharitable interpretations, but this one is one of the worst.


What exactly are you picturing when someone says "elegant" or "artful" code?

To me, elegant code is DRY code. Elegant code is code with useful abstractions where needed, and no abstractions where they just complicate matters. Code that is succinct yet clearly communicates its purpose.

From the sounds of it, you have an entirely different conception of what elegant code looks like.


> Writing elegant and artful code that is hard to understand and debug does not make you a clever developer

The whole purpose, to me, of artfulness in code is to take unartful, hard to understand code and make it simple. What other objective is there?


Honestly if a solution is too hard to understand and debug then it is not elegant. Elegance is turning a complicated solution into an easier-understood one. Clever solutions could be elegant but they could also be shortcuts that are confusing and cause more harm than good. There’s definitely a difference.


> Unless you're building a user-facing frontend, please keep art out of your code.

Honestly, I had rather art be kept out of the user facing frontend. (Computer programming, on the other hand, is considered art by some computer scientists.)


(I don't necessarily disagree, but...)

It's probably changed a lot since, but at least back in the 90s demo-scene code was absolutely 100% written for the result alone, even when it should perhaps have been, say, 75% for the sake of reusability. Imagine "decent" 90s game code quality, then dial the quality notch down a bit, since it only has to work once on a well-defined machine anyway.

I'm long since tuned out. I do seem to remember Farbrausch making some waves when they started applying the concept of reusability and structure to their work in the early 2000s.


> Writing elegant and artful code that is hard to understand

You do not know the meaning of elegant, at least in the context of programming and maths.


You have a depressingly narrow view of what "art" can mean.



