
It's a reference to Jon Postel, who wrote the following in RFC 761 [0]:

    TCP implementations should follow a general principle of robustness:
    be conservative in what you do, be liberal in what you accept from
    others.
Postel's Law is also known as the Robustness principle. [1]

[0] https://datatracker.ietf.org/doc/html/rfc761#section-2.10

[1] https://en.wikipedia.org/wiki/Robustness_principle
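
To make the principle concrete, here's a minimal TypeScript sketch applied to a hypothetical header field; the field name and the canonicalisation rules are illustrative assumptions, not taken from any RFC:

    // Liberal in what you accept: tolerate odd casing and stray whitespace.
    function parseHeader(line: string): { name: string; value: string } | null {
      const idx = line.indexOf(":");
      if (idx === -1) return null;
      return {
        name: line.slice(0, idx).trim().toLowerCase(),
        value: line.slice(idx + 1).trim(),
      };
    }

    // Conservative in what you send: always emit one canonical form.
    function formatHeader(name: string, value: string): string {
      return `${name.toLowerCase()}: ${value.trim()}`;
    }

    console.log(parseHeader("  Content-TYPE :  text/html  "));
    // -> { name: "content-type", value: "text/html" }
    console.log(formatHeader("Content-TYPE", " text/html "));
    // -> "content-type: text/html"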


I've always felt that this was a misguided principle, to be avoided when possible. When designing APIs, I think about this principle a lot.

My philosophy is more along the lines of "I will begrudgingly give you enough rope to hang yourself, but I won't give you enough to hang everybody else."


HTML parsing is the modern-ish layer-uplifted example of liberal acceptance.

I won't argue that this hasn't been a disaster for technologists, but there are many arguments that this was core to the success of HTML and consequently the web.

Which, yes, could be considered its own separate disaster, but here we are!
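
For instance, a browser's HTML parser never rejects input, it repairs it. A browser-only sketch (the malformed markup below is just an illustrative string):

    // DOMParser (a browser API) silently repairs malformed markup instead of rejecting it.
    const malformed = "<p>unclosed paragraph <b>bold with no closing tags";

    const doc = new DOMParser().parseFromString(malformed, "text/html");

    // The parser has quietly inserted the missing </b> and </p>:
    console.log(doc.body.innerHTML);
    // -> "<p>unclosed paragraph <b>bold with no closing tags</b></p>"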


It makes sense in a "customer obsessed" way. The user agent tries to show content, tries to send requests and receive responses on behalf of the client (the customer), and ceteris paribus it's better for the client if the system works even when there's some small error that can be worked around, right?

but of course this also leads to the tragedy of the anticommons: too many people have an effective "veto" (every shitty middlebox, every "so easy to use" 30-line library that got way too popular now contributes to ossification of the stack).

what's the solution? Similarly careless adoption of new protocols and hoping for the best? Maybe putting an emphasis on provable correctness, and if something isn't conformant to the relevant standard, not treating it as fine under the "if it ain't broken, don't touch it" principle?


When it comes to writing APIs I feel strongly that you should be incredibly strict.

1 != '1'

true != 1

true != 'true'

undefined != false

undefined != null

etc

"Flexibility" in your API just means you are signing up for a maintenance burden for the lifetime of your API. You will also run into problems because you have to draw the line somewhere, and people will be frustrated/confused since your API is "flexible" but not as flexible as they want. Better to draw the line at complete strictness IMHO. I dislike even optional fields and prefer null to be passed explicitly, except in special cases (like when null has a meaning; for example, a search endpoint where you pass the fields you want to search on and a field can have a null value).

I want people to be explicit about what they are doing/fetching when using an API I have written/maintained. It also encourages less sloppy clients.
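
As a sketch of what I mean (hypothetical endpoint, field names made up), strict parsing rejects anything that isn't exactly the declared type rather than coercing it:

    // Hypothetical strict parsers for two request fields; no coercion, ever.
    type ParseResult<T> = { ok: true; value: T } | { ok: false; error: string };

    function parseLimit(input: unknown): ParseResult<number> {
      // Reject "1", true, null, undefined -- only an actual integer is accepted.
      if (typeof input !== "number" || !Number.isInteger(input)) {
        return { ok: false, error: `limit must be an integer, got ${typeof input}` };
      }
      return { ok: true, value: input };
    }

    function parseIncludeArchived(input: unknown): ParseResult<boolean> {
      // Reject 1, "true", undefined -- only an actual boolean is accepted.
      if (typeof input !== "boolean") {
        return { ok: false, error: `includeArchived must be a boolean, got ${typeof input}` };
      }
      return { ok: true, value: input };
    }

    console.log(parseLimit("1"));         // { ok: false, ... }
    console.log(parseIncludeArchived(1)); // { ok: false, ... }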


Ironically it leads to less robust systems in the long term.

> Postel's Law is also known as the Robustness principle.

Really? It seems like it's obviously just a description of how natural language works.† But in that case, there's an enforcement mechanism (not well understood) that causes everyone to be conservative in what they send.

We can observe, by the natural language 'analogy', that the consequence of following this principle is that you never have backwards compatibility. Otherwise things generally work.

† Notably, it has nothing to do with how math works, making it a strange choice for programming.
