Just taken on its merits, I think a case can be made that this is one of the most overrated pieces of technical writing of the last 25 years. What's true in it isn't interesting ("the importance of having users", "release early release often") and what's interesting isn't true ("Linus's law" being perhaps the most notorious example). Much of the insight is taken directly from Brooks. The whole piece has as its backdrop the development of Fetchmail, which is not a well-regarded piece of software.
What's notable about Cathedral is its timing; it did capture the zeitgeist of what was an important moment in the computing field, the moment where we transitioned from 386bsd-style hobby projects to an industry run on free and open source software. But Raymond isn't the reason why any of that happened, and much of his description of that moment is faulty; the rest of it is just a retrospective of the engineering decisions involved in the writing of a midlist mail processing utility (fetchmailrc syntax, password encryption, the now-largely-irrelevant distinctions between MDAs and MTAs).
Even the high-level organizing notion of "cathedrals" and "bazaars", which should have been a lay-up, hasn't really proven out.
> What's true in it isn't interesting (... "release early release often")
This feels like 20/20 hindsight to me, given that "release early release often" was definitely not the axiom we treat it as today, and one could argue the contrary opinion was more widespread in 1997.
I can easily remember some debates at a large consumer web company in the mid 00s, where at the time we released once per quarter. We were struggling with product quality, and a significant portion of the software dev team argued that the way to improve our product quality was to reduce the number of our releases to something like only 2 per year. The argument was that we needed more QA time, more time to run integration tests, and more time to ensure changes from all the different teams wouldn't break each other before releasing.
In our current world of CI/CD, a web company releasing only twice a year sounds absolutely insane. Fortunately the loud "slow down our releases" contingent lost (which is evident simply by the fact that this large tech company still exists - if they had lengthened their release cycles as some desired, I firmly believe the company would have died long ago). But I bring this up because the mantra of "release early release often" was definitely not blatantly self-evident in 1997, and I'd argue it only became "axiomatic" after tooling support for CI/CD got much better.
"Release early release often" was a mantra of "Extreme Programming", a closed-source commercial software development methodology that predates this article by about 4 years, and was au courant at the time Raymond was was writing. One of my big thematic criticisms of Raymond's article is that it doesn't seem especially in touch with how closed-source development worked at the time.
I'm sympathetic to the idea that C&B is overrated, but it was published in 1997, and XP was only being fleshed out at the C3 program in 1996. The Agile manifesto -- in 2001 -- was what gave it its major visibility bump.
Release early and often was in the air in the late nineties, but C&B was legitimately the first time many people got to hear about it.
I dispute some of this, only because I was doing a software startup from '98-'01 and we managed to hire not one but two XP devotees that dragged me into reading this stuff. To this day, though, I'm still mostly unfamiliar with the particulars of Agile.
I get that Agile was bigger than XP (who could deny that?), and agree preemptively that Raymond's article is probably the first time many people heard about incremental release strategies. That's true of a lot of things in it! My big complaint about those ideas is that they're not Raymond's --- not even in the sense of waiting to be distilled from Linux by Raymond.
The true-feeling parts of Raymond's article read to me like a document of, for lack of a better term, late 1990s programming thinking. Just a bunch of stuff that was floating in the air, a bunch of Fred Brooks, and then weird attempts to generalize design decisions in Fetchmail to the whole of software development.
I appreciate getting called on this and forced to think more carefully about it. Thanks for the response!
I completely agree with you about C&B. Even at the time, it felt to me mostly like a restatement of the zeitgeist. But Raymond was very influential at the time and was apparently one of the reasons Netscape released Mozilla as free software.
I also agree that “release early and often”, in particular, was a significant contribution, if not for originality then for reach. Not everyone was exposed to XP at the time, but ESR seemed to be everywhere. And for my part, the main takeaway from XP had been around pair programming, which, running a startup myself, I wasn’t into/couldn’t afford/didn’t subscribe to anyway.
But I do feel your comments about XP and agile miss the mark a tiny bit. XP was developed by Kent Beck - who went on to be one of the authors of the Agile Manifesto. Perhaps because of this, XP is considered an agile practice.
But agile is really just a set of values and principles. There are many practices that people claim as “agile”, many of which do not meet the criteria of the agile manifesto (I’m looking at you, SAFe).
I think in some ways, C&B was a precursor to the Agile Manifesto, which itself is arguably just a restatement of principles and practices that already existed.
(My limited personal experience with engineering in large traditional companies is that the manifesto is ignored, commercial practices and consultants who slap “agile” stickers on their traditional SDLCs still rule - and it is still, largely, the dark ages. I was heavily criticised for using a CD strategy for some enterprise software at the same brand name company that required me to enforce password rotation… in 2018).
Things certainly happened very quickly in that period -- it's staggering to think that 1997 was only four years after the web began to percolate, and C&B I think got its lift because it happened when Netscape decided to open source their browser.
It's definitely hard to work out a chronology when things are piling one after the other, and also there wasn't the same consistency of knowledge propagation. It was still very hard to find out new things online.
> a closed-source commercial software development methodology that predates this article by about 4 years, and was au courant at the time Raymond was writing.
I think you are misremembering. Check out the Wikipedia article on Extreme Programming [1] - the book Extreme Programming Explained wasn't published until 1999. Kent Beck didn't even start working on the idea until 1996.
In any case, I'm not arguing that the idea of "release early and release often" was completely unheard of, but I am arguing that it was definitely not the standard, and I would say the idea had more detractors than adherents at that time. There was still very much a battle between "waterfall processes" and "agile processes" that was really just starting to get going in the late 90s.
Is Extreme Programming a closed source commercial software development methodology? What makes it so?
I went through a period of my career where I dove headlong into it: read Kent Beck's book, liked what I read. Tried pair programming, TDD, etc., and loved it. Found a team that felt the same and had a great couple of years.
Given the book, the many conference talks, etc., and comparing it to other flavours of agile that went full corporate (Scrum, SAFe), I'm surprised to hear it described as closed source.
My impression of XP at the time was that it was a methodology designed for consulting firms, and that most of its force came from the idea that it was an insurgent effort at reprogramming stodgy waterfall development processes at big companies; all of those companies --- the whole client base of XP consulting firms (which I'm assuming was a big thing) --- were closed source, because almost everything was at the time.
When working alone remotely from home, I simulate pair programming with a methodology I call "The Stranger". I sit on one of my hands until it becomes numb and tingly, and then it feels like somebody else is typing and moving the mouse!
I personally agree with the frequent release crowd, but my company sells a web-based SaaS product and releases quarterly, my last company released a web SaaS product 2x a year, and Salesforce famously releases closer to once per year, so a large part of the world is "still insane" by your measure.
I was introduced to Salesforce in the early 2000s by a user very happy that useful features showed up every quarter without an upgrade process, so they have regressed substantially if this is the case.
I think it's safe to say in general that a large part of the world is still insane and will always be insane. Perhaps it's just part of the human condition.
> what's interesting isn't true ("Linus's law" being perhaps the most notorious example)
> Even the high-level organizing notion of "cathedrals" and "bazaars", which should have been a lay-up, hasn't really proven out.
you and i live oceans apart. a year of working around NixOS and especially linux on mobile phones has me working in very bazaar-like systems — where every component is swappable/composable — and seeing/using Linus’ law daily — in that people from out of the blue will patch things i’d been unable to figure out and when i hit edge cases that look familiar enough in any software i install i will debug it & send fixes upstream.
i’m well aware the above is niche. on the other hand i’m fairly confident that niche would be inaccessible to most of us who are otherwise participating in it were it not leveraging these two concepts.
We do. As a vulnerability researcher, my take on "Linus's law" (which, like "Gell-Mann amnesia", isn't the product of the famous person it's named for) --- "given enough eyeballs, all bugs are shallow" --- is that it has been effectively refuted by the recurrent discovery of grave vulnerabilities that have hidden in plain sight, sometimes for over a decade, and by the success of commercial vulnerability research in eradicating those vulnerabilities.
I don't think "bugs are shallow", or the "many eyeballs" section of this essay, are particularly talking about _discovering_ bugs.
The author seems to have had a worldview in which bugs don't really matter if people aren't coming across them, and in which the difficult part of dealing with a bug is either reproducing it or getting from a symptom to a cause.
If you were in a world where those two things are true then I think he's probably right that "many eyeballs" would help a great deal.
It's not interesting to say that lots of eyeballs are helpful. Anybody would have said that prior to this article's publication. Raymond makes a much stronger claim (which is why it has the force of "law"), and it hasn't borne out.
if your process is “release often, but preserve quality by a staged release process (betas) in which power users can fix bugs before they reach the masses” then you’re pretty explicitly allowing bugs to be in the codebase transiently. combine that with projects of this era shipping with files like BUGS or ISSUES alongside their README or NEWS/release notes, and you have strong evidence that “transient” means “can knowingly be included in a release”. at that point it just feels false to read ESR as claiming anything strong like “with enough eyes the software will contain no latent bugs” (which i think is how one side of this thread is interpreting the essay?)
No, you can't get away with this semantic dodge, because Raymond numbered what he believed were the most important lessons he was imparting, and the one corresponding to Linus' Law is:
8. Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.
He even attempted an axiomatic explanation:
Maybe it shouldn't have been such a surprise, at that. Sociologists years ago discovered that the averaged opinion of a mass of equally expert (or equally ignorant) observers is quite a bit more reliable a predictor than the opinion of a single randomly-chosen one of the observers. They called this the Delphi effect. It appears that what Linus has shown is that this applies even to debugging an operating system—that the Delphi effect can tame development complexity even at the complexity level of an OS kernel.
I was a developer at the time (still am), and if I'm remembering correctly, ESR was active in Slashdot and a few other places I hung out.
I took ESR's claim about bugs to imply that the quality of open source software would be greater than that of proprietary software, because the number of people who had access to the code would inevitably result in fewer bugs. A lot of the discussions around C&B at the time were about software quality. I don't think anyone expected there to be zero bugs, just that there would be fewer.
I am not convinced it turned out that way, but that's an interesting discussion for another thread.
An argument I read years ago, maybe on Raymond Chen's blog, seems at least plausible: in reality the only thing that makes a difference is paying people to look for bugs and fix them, because people don't much like doing that.
Well, I'm baffled then. From where I'm sitting that point 8 is clearly talking about what happens after a bug is discovered, and not about discovering bugs.
The longer paragraph doesn't seem to contradict the notion either. My impression (based on the "How Many Eyeballs Tame Complexity" chapter) is that Raymond thought that "debugging" means "fixing bugs".
If I were criticising this part of the essay, I'd say the main weakness is that the things Raymond thought of as "taming complexity" weren't really addressing the hard problem of reducing the number of bugs.
As I mention upthread, a lot of the talk at the time was about overall quality of the software. So an interpretation that the total number of bugs (known and unknown) would be reduced is well aligned with this outlook.
If the objective is just to fix bugs that have been found, well, that doesn't really feed into this narrative. Also, ESR was making many claims about the efficacy of OSS, and limiting the scope to bugs already discovered would not really align with the rest of the goings-on at the time.
> 8. Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.
The key word here is characterized. That word is not equivalent to found.
Security vulnerabilities are unique in that they matter despite being unknown.
Other bugs are only important because of their direct impact on users. It's not unreasonable to take everything here and apply it to known bugs, and not to unknown vulnerabilities.
In professional engineering, you characterize the behavior after someone finds and reports it. Or you characterize a flow transducer, or you characterize gas circuit compliance. It’s the thing you do once you know there’s something to characterize.
OK. Well in the variety of work I do, which I suppose isn’t professional by this standard, typically the person who finds a bug and the person who describes it are one and the same.
>> Given a large enough beta-tester and co-developer base...
That's a qualifier for what follows it. Seems true enough to me. If there are enough developers to notice bugs, they will probably be found and fixed quickly.
The argument here is that the pool of "developers" is essentially unlimited for open-source software. In principle I suppose that's indisputable, but in practice is it true?
Wow, what an impressive display of rhetorical gymnastics! It shows such fanatical dedication to the cause of carrying ESR's water. If you're up to it, I'd love to see you also have a go at rationalizing how his own words that I quoted aren't at all racist.
"My favorite part of the "many eyes" argument is how few bugs
were found by the two eyes of Eric (the originator of the
statement). All the many eyes are apparently attached to a
lot of hands that type lots of words about many eyes, and
never actually audit code." -Theo De Raadt
which, just to clarify, i didn’t mean as “one of us contradicts the other” but that “we experience different slices of life which are guided by different priorities”. i very much wouldn’t want to be a security researcher in the open source areas i occupy now. likewise i wouldn’t want to use the development practices i experience in community open source were i instead building a fighter jet.
looking at the bizarre overlap between “open source contributors who like {Nix,Haskell/FP,Rust}” and “people i know who work for defense contractors” though, i’m actually optimistic that our distance could shrink in time.
None of the significant security vulnerabilities that I can remember have been "deep" - subtle things that require extensive familiarity with the specific codebase and could never have been found by a drive-by contributor, which is the idea that ESR is refuting. Debian keygen bug? Dumb one-liner. Heartbleed? Dumb one-liner. Goto fail? Technically a logic bug, but a two-line thing that could be understood by reading that one file without any additional context.
I've seen cases where exploitation was complex, but even then, that tends to be because the full exploit is chaining together several bugs, each of which is individually simple and could be fixed by a drive-by contributor.
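To make "dumb one-liner" concrete, here is a compilable sketch in the shape of the "goto fail" bug. The names are hypothetical stand-ins, not Apple's actual SecureTransport code; only the control-flow shape is the point. The stray duplicated goto always jumps with err still 0, so the final verification step never runs and a bad signature is reported as success:

    #include <stdio.h>

    /* Hypothetical stand-ins for the real hashing/verification calls. */
    static int hash_update(int *ctx) { (void)ctx; return 0; }
    static int hash_final(int *ctx)  { (void)ctx; return 0; }
    static int check_signature(int *ctx) { (void)ctx; return -1; /* bad signature */ }

    static int verify_signed_params(void)
    {
        int ctx = 0;
        int err;

        if ((err = hash_update(&ctx)) != 0)
            goto fail;
            goto fail;                  /* duplicated line: always taken, err == 0 */
        if ((err = hash_final(&ctx)) != 0)
            goto fail;

        err = check_signature(&ctx);    /* the check that never runs */

    fail:
        return err;
    }

    int main(void)
    {
        /* Prints 0: the bad signature is reported as success because the
           final verification step is skipped by the stray goto. */
        printf("%d\n", verify_signed_params());
        return 0;
    }

The whole problem is visible in a dozen lines of local control flow, with no project-specific context required, which is the sense of "shallow" being argued here.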
If that was the intended meaning of "shallow", why does the number of eyeballs matter? I think the more obvious interpretation is "quickly discovered".
In the article the number of eyeballs is pretty explicitly a minor caveat, and the main point is the shallowness in the sense of approachability:
> In Linus's Law, I think, lies the core difference underlying the cathedral-builder and bazaar styles. In the cathedral-builder view of programming, bugs and development problems are tricky, insidious, deep phenomena. It takes months of scrutiny by a dedicated few to develop confidence that you've winkled them all out. Thus the long release intervals, and the inevitable disappointment when long-awaited releases are not perfect.
> In the bazaar view, on the other hand, you assume that bugs are generally shallow phenomena—or, at least, that they turn shallow pretty quickly when exposed to a thousand eager co-developers pounding on every single new release. Accordingly you release often in order to get more corrections, and as a beneficial side effect you have less to lose if an occasional botch gets out the door.
This seems to be the main definition of deep vs shallow:
> It takes months of scrutiny by a dedicated few to develop confidence that you've winkled them all out.
By this definition, bugs like Heartbleed were indeed deep: they were not randomly found by a million eyeballs in a few days, they were found by months of careful scrutiny by experts.
The fact that a bug can be fixed with a one-line change doesn't mean that it's a shallow bug. A one-line bug can induce rare, hard-to-observe behavior in very narrow corner cases. Even if users are hitting it, the bug reports can be entirely useless at face value: they can look like a crash that someone saw once with no idea what was special that one time (I bet you would find OpenSSL bug reports like this that were symptoms of Heartbleed going back much longer).
This is why you need a complex and time-consuming QA process to even identify the long tail of deep bugs, which no amount of casual eyeballs will replace.
> By this definition, bugs like Heartbleed were indeed deep: they were not randomly found by a million eyeballs in a few days, they were found by months of careful scrutiny by experts.
> The fact that a bug can be fixed with a one-line change doesn't mean that it's a shallow bug. A one-line bug can induce rare, hard-to-observe behavior in very narrow corner cases. Even if users are hitting it, the bug reports can be entirely useless at face value: they can look like a crash that someone saw once with no idea what was special that one time (I bet you would find OpenSSL bug reports like this that were symptoms of Heartbleed going back much longer).
> This is why you need a complex and time-consuming QA process to even identify the long tail of deep bugs, which no amount of casual eyeballs will replace.
This is the point in dispute. As far as I can see Heartbleed did not require any special knowledge of the codebase; a drive-by reviewer taking a look at that single file had just as much chance of finding the bug as a dedicated maintainer familiar with the specific codebase. The fact that it was discovered independently by two different teams, at least one of which was doing a general security audit rather than specifically targeting OpenSSL, supports that.
The fact that it was only found 2 years after being introduced (by non-attackers at least), in one of the most used pieces of software in the world, suggests that it wasn't actually shallow by any definition.
I don't think it's relevant that it could have been found by anyone. We know empirically that it just wasn't. It was found by security auditors, which is just about as far as you can be from a random eyeball.
Edit: An even more egregious example is of course the ~20 year old Shellshock family of bugs in bash.
> The fact that it was only found 2 years after being introduced (by non-attackers at least), in one of the most used pieces of software in the world, suggests that it wasn't actually shallow by any definition.
Or that few people were looking.
> I don't think it's relevant that it could have been found by anyone. We know empirically that it just wasn't. It was found by security auditors, which is just about as far as you can be from a random eyeball.
It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact as far as I can see they weren't prior contributors or project members at all. That very much supports ESR's argument.
> It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact as far as I can see they weren't prior contributors or project members at all. That very much supports ESR's argument.
It doesn't. ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.
What Heartbleed has shown is that the OpenSSL project would be much higher quality if it took a more Cathedral-like approach, actively looked for security researchers to work on it, and made them a part of its release process. Because they didn't, they shipped a critical security vulnerability for more than a year (and there are very possibly many others).
Especially in security, it's clear that some bugs are deep. Any project that cares about security actually has to follow a cathedral-like approach to looking for them. Releasing security critical code early is only making the problem worse, not better.
This is a "Seinfeld is unfunny" situation. People don't remember what development practices were like prior to this essay.
> ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.
It suggests they should release early and often to allow outside researchers a chance to find bugs, rather than relying on internal contributors to find them. Which seems to have worked in this case.
> Especially in security, it's clear that some bugs are deep. Any project that cares about security actually has to follow a cathedral-like approach to looking for them.
No it isn't. I still haven't seen examples of bugs like that.
> Releasing security critical code early is only making the problem worse, not better.
How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
> No it isn't. I still haven't seen examples of bugs like that.
If you don't think finding Heartbleed after a year in OpenSSL was the process working, how about Shellshock hiding in Bash for more than 20 years? Was that still a shallow bug, or is bash a project that just doesn't get enough eyes on it?
> How?
By letting people expose their data for years with a false sense of security. By encouraging projects to think security is someone else's problem, and they don't need to really worry about it.
Rather than releasing support for TLS heartbeats that steal your private keys for a whole year, it would have obviously and indisputably been better if the feature had been delayed until a proper security audit had been performed.
That the bug was eventually found is in no way a merit of the model. People find security bugs in closed source software all the time. The NSA and the Chinese and who knows who else usually find them even earlier, and profit handsomely from it.
> How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
The fact that it was found by people outside the project is the system working.
> If you don't think finding Heartbleed after a year in OpenSSL was the process working, how about finding Shellshock was hiding in Bash for more than 20 years? Was that still a shallow bug, or is bash a project that just doesn't get enough eyes on it?
Yes it's a shallow bug. I mean look at it. And look at who found it.
> Rather than releasing support for TLS heartbeats that steal your private keys for a whole year, it would have obviously and indisputably been better if the feature had been delayed until a proper security audit had been performed.
How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
> The fact that it was found by people outside the project is the system working.
This happens all the time to Windows, to Intel's hardware architecture, even to remote services that people don't even have the binaries for. There is nothing special about people outside the team finding security bugs in your code. After all, that's also what attackers are.
> Yes it's a shallow bug. I mean look at it. And look at who found it.
If a bug that hid from almost every developer on the planet for 20 years (that's how popular bash is) is still shallow, then I have no idea how you define a non-shallow bug.
> How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
That's irrelevant to this discussion. Per the essay, even a company as large as Microsoft would be better off releasing anything they do immediately, instead of "wasting time" on in-house security audits.
> If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
I'm not saying they shouldn't have released the code along with the binary, I'm saying they shouldn't have released anything. It would have been better for everyone if OpenSSL did not support heartbeats at all, for a few more years, rather than it supporting heartbeats that leak everyone's private keys if you just ask them nicely.
This is the point of the Cathedral model: you don't release software at all until you're reasonably sure it's secure. The Bazaar model is that you release software as soon as it even seems to work sometimes, and pass on the responsibility for finding that it doesn't work to "the community". And the essay has the audacity to claim that the second model would actually produce better quality.
> There is nothing special about people outside the team finding security bugs in your code.
That supports the point.
> If a bug that hid from almost every developer on the planet for 20 years (that's how popular bash is) is still shallow, then I have no idea how you define a non-shallow bug.
A bug where you think "yeah, no-one except the x core team could ever have found this". A bug where you can't even understand that it's a bug without being steeped in the project it's from.
> That's irrelevant to this discussion.
Disagree; that the Bazaar can attract more contributions is a big part of the point.
> This is the point of the Cathedral model: you don't release software at all until you're reasonably sure it's secure. The Bazaar model is that you release sofwatre as soon as it even seems to work sometimes, and pass on the responsibility for finding that it doesn't work to "the community".
Few people were thinking about security at all in those days, at least the way we think about it now; the essay isn't about security bugs, it's about bugs generally. The claim is that doing development in private and holding off releasing doesn't work, because the core team isn't much better at finding bugs than outsiders are. The extent to which a given project prioritises security versus features is an orthogonal question; there are plenty of Cathedral-style projects that release buggy code, and plenty of Bazaar-style projects that release low-bug code.
It did the literal opposite: the TLS Heartbeat Extension was itself a bazaar (and bizarre) random contribution to the protocol. The bazaar-i-ness of OpenSSL --- which has since become way more cathedralized --- was what led to Heartbleed, both in admitting the broken code and then in not detecting that code regardless of the fact that it's one of the most widely used open source projects on the Internet. It comprehensively rebuts Raymond's argument.
I remember what development practices were like prior to this essay, which is part of why I feel so strongly that it's overrated. Several other people on this thread were working in the mid-late '90s too.
If "few people were looking" at OpenSSL, one of the most widely-used pieces of open source software in the entire industry, Eric Raymond's point is refuted.
The whole thesis is that the open source userbase forms the army of eyeballs that will surface all bugs --- they're part of the development process. So no, this dodge doesn't work either; it doesn't cohere with what Raymond said.
Spectre, Meltdown and Rowhammer were all discovered in the same timeframe as the things you listed, and were decidedly more complex than dumb one-liners. Yes, those all have a hardware component (and thus the "fixes" are software workarounds), but the fixes involved way more than one or two lines of work.
No. Architectural leakage via a side-channel is far from simple. It requires deep understanding of the code (i.e. what it does, why it is doing it that way and the implied effect of executing the code on the architecture) and also of the architecture (i.e. how it is executing and why that causes leakage to be observable via the side-channel).
If anything, these are canonical examples of non-simple bugs.
> It requires deep understanding of the code (i.e. what it does, why it is doing it that way and the implied effect of executing the code on the architecture) and also of the architecture (i.e. how it is executing and why that causes leakage to be observable via the side-channel).
For exploitation, or for patching, maybe. But characterizing these bugs is still clear and simple IMO. I mean, rowhammer is about as basic as a bug can get - the value of this bit in memory changed even though it was never written to.
Rowhammer isn't a speculative execution bug. Without getting into how complex Rowhammer is or isn't, speculative execution is an inherently complex and software-involved pattern of vulnerability.
You're letting yourself get dragged into some weird rhetorical places trying to defend this part of Raymond's essay.
> Rowhammer isn't a speculative execution bug. Without getting into how complex Rowhammer is or isn't, speculative execution is an inherently complex and software-involved pattern of vulnerability.
Rowhammer was one of the examples the person I was replying to listed. But the same applies for spectre/meltdown/etc.. They're all issues that can be understood - perhaps not exploited, certainly not fixed, but understood that a bug is there - without any deep architectural knowledge.
> You're letting yourself get dragged into some weird rhetorical places trying to defend this part of Raymond's essay.
If anyone other than you talked like this they'd be quite rightly downvoted to oblivion.
> They're all issues that can be understood - perhaps not exploited, certainly not fixed, but understood that a bug is there - without any deep architectural knowledge.
I don't think it is clear how you would do this other than with hindsight. When the bug was discovered, if we avoid architectural knowledge of why the bug was occurring then what information would the discoverer have access to?
- Memory corruption is occurring during execution.
- We can't find any part of code that appears to corrupt the values.
Finding the bug requires finding the Rowhammer effect, which means isolating the instructions that are accessing similar uncached addresses, and then finding the pattern of accesses and cache flushes which triggers the corruption.
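For concreteness, a rough sketch (in C, assuming x86 with the SSE2 clflush intrinsic available) of the kind of access-and-flush pattern being described; picking two addresses that land in different rows of the same DRAM bank is the hard, hardware-specific part and is not shown:

    #include <emmintrin.h>  /* _mm_clflush */
    #include <stdint.h>

    /* Repeatedly activate two DRAM rows while flushing the cache so every
       access actually reaches DRAM. On vulnerable DIMMs this can flip bits
       in a neighbouring "victim" row that the code never writes to. */
    void hammer(volatile uint8_t *row_a, volatile uint8_t *row_b, long rounds)
    {
        for (long i = 0; i < rounds; i++) {
            (void)*row_a;                      /* activate row A */
            (void)*row_b;                      /* activate row B */
            _mm_clflush((const void *)row_a);  /* force the next reads to miss cache */
            _mm_clflush((const void *)row_b);
        }
    }

Nothing in that loop looks wrong when you read the code; the "bug" only exists in its interaction with the memory hardware, which is the point being made above.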
Could you explain how this would be simple, or do you have a different view of the bug complexity that you could explain?
> if we avoid architectural knowledge of why the bug was occurring then what information would the discoverer have access to?
> Finding the bug requires finding the Rowhammer affect, which means looking isolating the instructions that are accessing similar uncached addresses, and then finding the pattern of accesses and cache flushes which triggers the corruption.
AIUI you don't need to know the details of the caches though? I guess you could argue that knowing the physical memory layout is deep knowledge, but you don't need to be an expert on e.g. that specific chip or memory controller to find that effect, no? And fundamentally if you can produce memory corruption then you can know that's a bug without having to understand the details of the memory hierarchy or hardware.
Perhaps we are talking about overlapping but slightly different concepts? If I run the program and it goes wrong then I know there is a bug - but I don't know what or where it is.
So observing that Rowhammer causes buggy behavior may be simple. But finding the actual bug - i.e. that this sequence of instructions on this data causes the memory corruption - would be much harder.
Part of this comes down to how you interpret the "many eyes make bugs shallow" claim, but I suspect that no number of eyes would have found the Rowhammer bugs by looking at the code.
I've known ESR and RMS (who he's made his career trying to destroy) for well over 35 years, so I can testify that ESR has always been insufferably narcissistic, arrogant, and smarmy, but 9/11 sent him off the deep end of racism, islamophobia and misogyny (google "Is the casting couch a fair trade?").
His software work is unremarkable (fetchmail is trivial and buggy, and CML2 was rejected by the Linux kernel developers), and he's made his career not by writing software but by attacking real hackers like Richard Stallman and trying to bring down their work, not by actually constructively developing any useful software himself. Attacking real hackers and trying to discourage people from using "free software" isn't "hacking".
He used to call himself "Eric the Flute", and he would go on and on ad nauseum about his beloved "Teenage Mutant Ninja Turtle NetNews Reader" and how it was so much better than every other netnews reader. But he never collaborated with anyone, and he never released it under any license. Much more Bizarre than Cathedral.
The main criticism (IMO) isn't really the work itself but how it's handed down as wisdom. It wasn't really cathedral==proprietary and bazaar==open source. In fact it was about cathedral vs. bazaar in open source software and that story is more complicated and comes down to "It depends." There are certainly open source projects that have more of a cathedral aspect whether because of benevolent dictator model or, more recently, a strong commercially-oriented foundation. But there are also areas that look more like a bazaar which have some advantages with respect to innovation but can be... messy.
ESR is on record stating that the original focus of 'cathedral' was on GCC and Emacs, though the terminology was applicable to proprietary software and top-down corporate cultures.
I did enjoy reading his blog though, well mostly the comments. And then it just died a few years ago and never came back. I always wondered if the “database server” issue mentioned was just an excuse to stop blogging.
I'm probably misstating my "never felt" earlier. At the time it was the common assumption for those on slashdot (incl me) who had heard all the soundbites but not actually read it. It was only a bit later I realised it was about contrasting styles of running floss projects.
Understandable though, by the time it came out, the battle lines had shifted from Linux vs GNU to free/open vs proprietary anyway.
I found the whole ncurses drama tragic and sort of ridiculous because ESR behaved in a very non-bazaar manner. Held on to that thing like it was a precious family heirloom.
Really lowered my opinion of this book and ESR a lot. Not that it was ever particularly high.
He also held onto, refused to share or collaborate, but bragged incessantly and inappropriately about his glorious "Teenage Mutant Ninja Turtles Netnews Reader", during the early 1980's when I knew him from east coast science fiction conventions and the arpanet, when he called himself "Eric the Flute".
He'd corner unsuspecting people at parties, hijack ongoing conversations, and insufferably yap on and on and on about it, to the exclusion of anything else, dissing all the other competing netnews readers. And he steadfastly refused to collaborate with anyone else or share it, and of course he never finished it either. Now he hates to even talk about it. If you run into him and want to make him shut up, ask him about it and see how awkward his response is!
Date: Sat, 17 Feb 2001 23:49:58 +0000 (GMT)
From: Terry Lambert <tlambert@primenet.com>
To: roam@orbitel.bg (Peter Pentchev)
Cc: arch@freebsd.org
Subject: UUCP must stay; fetchmail sucks (was list 'o things)
[...]
>-- A tangential diatribe on the unsuitability of fetchmail -------
>As to fetchmail: it is an abomination before God. If someone in
the press ever paid for an audit of the source code, the result
would refute the paper "The Cathedral and the Bazaar" to such an
extent that it could damage the Open Source movement, which has
pinned so much on the paper, in ill-considered haste.
>ESR has constantly maintained that fetchmail is "not an MTA", and
he is right: it could be, but it's not.
>When mail is delivered to a POP3 maildrop, envelope information
is destroyed. To combat this, you would need to tunnel the
enveleope information in headers. Generally, sendmail does not
support "X-Envelope-To:" because it exposes "Bcc:" recipients,
since fetchmail-like programs generally _stupidly_ do not strip
such headers before local re-injection of the email.
>Without this information, it can not recover the intended
recipient of the email. The fetchmail program delivers this
mail to "root".
>The program has another bug, even if you elect single message
delivery (in order to ensure a "for <user@domain>" in the
"Received:" timestamp line. The bug is that it assumes the
machine from which the download is occurring is a valid MX for
your domain. Many ISPs use one machine to do the virtual domain
expansion, and another to do final deliver into ISP hosts POP3
maildrops. The net effect of this is that it attempts to use
the "for <domain-account@isphost>" stamp, since it does not
reverse-priority order "Received:" timestamp lines.
>Similarly, fetchmail fails to order headers in "confidence"
order. This means that, given an email with a "valid" (MX
matches in the "by <MX>" and a "for <user@domain>" exists)
"Received:" timestamp line, a "To:", "Cc:", or "Bcc:" line, or
an "X-Envelope-To:" line (which must be configured, and which
is terrifically screwed up by qmail, requiring un-munging),
fetchmail -- takes the first one it sees, not the most correct
one.
>Using the "To:", "Cc:", or "Bcc:" lines ("data") to do the
delivery buys a spammer the ability to relay mail, though the
route it must take is rather circuitous. It also means that
if the "Bcc:" was properly stripped before handing the RFC 822
message to an MTA, or if you are a list recipient, that data
is useless for recovering envelope information. This means
that root gets all mailing list mail from lists which do not
do recipient rewriting (like the SVBUG list does), and root
also gets all mail addressed to non-existant local users that
was intended for particular local users (all SPAM and all
mail that was requested but not sent specifically targetted to
a particular user, via email header data).
>Unfortunately, ESR would not accept patches for the mistaken
MX problem, nor for the preference order problem, nor for the
tunneled envelope information stripping problem. He seemed to
be too busy with speaking engagements, and has since declared
fetchmail to be in "maintenance mode", in order to demonstrate
a recognizable commercial software lifecycle for an Open Source
project, to give business the warm fuzzies.
>-- End diatribe ------------------------------------------------
yup. its hard to describe what it was like back then. i was young and the idea that this hobby open source internet thing i had been interested in for a few years, had become mainstream, with Linus on the cover of Forbes, was mindblowing. i knew maybe 2-3 people in my entire life who had ever installed linux and that was only because of college. nobody else cared. but then Linus and ESR became celebrities. dotcom bubble was in mainstream news and open source was part of the bubble. internet, and computers, were going mainstream.
So that is why i thought Cathedral and Bazaar was cool. Because i was young. didnt have access to Brooks book or others, physically, monetarily, but also it wasn't even written in my language. ESR's Cathedral was a textfile i could get off my dialup. and there just wasn't much else talking about what he was talking about.
Took me a while to realize. oh. hes just kind of an aggro dude writing extremist politics and ... this is like pseudo sociology. and nobody uses fetchmail.
edit - this is weird to think about but it's almost like... how they say 1991 was the "year punk broke", Nirvana and all this alt music / college rock was selling millions of copies and headlining festivals. these guys went from obscure regional bands who have to wash dishes for a living in between gigs, to international fame and millionaires almost overnight. im guessing there is a rock equivalent of Catb and i guess theres an equivalent discussion about how the music version of CatB wasnt really that insightful. but it sure was popular amongst the young folks, who saw the world change before their eyes.
I had a similar experience around 2008 when I was first getting into the type of stuff we talk about around here. My media literacy wasn't what it is now, so I didn't know just how far outside the mainstream some of it was, and I certainly didn't pick up on the underlying politics.
As a corporate low level programmer far away from the SV world, this book was rather eye opening to me. It might not have been the best take on the world or proven out in history, but reading it helped me down the path of open source and understanding the differences between what I was doing and what was going on elsewhere. I'm glad C&B was written and that I read it when I did.
I've noticed some interesting things about open source projects that I don't think ESR mentions:
1. Most contributions from volunteers can't be used as-is and require significant work to incorporate; code reviews take a great deal of work, and requests for revision are typically ignored.
2. Paying maintainers and developers is usually the most effective way to enable them to work on the project long-term.
3. In the absence of formal structure, projects can become cults of personality (hopefully benevolent dictators) or community (growing the community is more important than actually creating good software), possibly to the detriment of the project.
4. Companies love it when you build software for them for free, and are eager to exploit any open source software, e.g. by selling it as a service. But they hate it when their own employees work for free on things that the company doesn't care about. They also don't like paying to develop anything (such as open source software) which will be given away for free, perhaps to their competitors.
I think much of the reason the essay was popular was not so much the actual contents but that the metaphors captured the imagination. Software developers weren't simply toiling for the man, but could instead be building a beautiful bazaar. Whether or not that is true is debatable, but I think it captured the imagination more so than the FSF's overly political freedom-based rhetoric.
Most of the OSS software I really love is neither... more of a well tended zen garden.
Cathedrals have all kinds of issues, and bazaars either dissolve into endless feature creep (inevitably with terrible defaults) or an exercise in bikeshedding while critical issues go untouched because they aren't "fun".
The issue I've seen is OSS has rarely scaled to large projects, with the Linux kernel being the exception. Part of why desktop Linux never happened was people kept arguing over desktop environments and UI toolkits, and it led to an incomplete, glued-together product. There wasn't a holistic vision, leaving design arguments to go on for decades.
Another problem is that software in the purest sense stopped being the hard part. Look at Twitter, but this applies to a lot of products that got popular after ~2000. The community is its asset, a lot of effort goes into just keeping it running, and human intervention (moderation) is/was often needed to tend to the garden.
OSS can be a building block, but the challenges we see today are at a higher level.
on a side note: I would argue that Unix-on-the-desktop for non-nerds was basically always a pipe dream (excluding highly locked down things like a chromebook), and indeed the unix philosophy itself doomed it. Lots of tiny tools chained together doesn't really translate to a GUI paradigm.
For sure. Figured that was too much of a sacred cow.
I wonder what a clean room user space might look like. Maybe the fish guys could write it. They clearly have some interesting ideas for dragging the shell into this millennium.
While you are rating things, let me chime in and say I still use fetchmail today. For some reason, I prefer a local Dovecot over everything else that needs my laptop to be online to browse the INBOX.
IIRC ESR also pontificated about dead bits vs. live bits and pretty much predicted the demise of the technical book industry which was massive at the time.
> What's true in [CatB] isn't interesting ("the importance of having users", "release early release often") and what's interesting isn't true ("Linus's law" being perhaps the most notorious example).
At the time or now? You were there, and so might mean the former, but I suspect most of the discussion here means the latter. And as a work of philosophy that has been assimilated into the mainstream, CatB has to look like a mix of trivially true and blatantly false things—that’s what mainstream acceptance of philosophy looks like[1,2].
so much hindsight bias, sure, everything you take for granted now is not interesting, and for the parts that you think are not true you have just as little evidence as he had in 1997
If anything’s true about Internet / hacker / open source culture at the time, it’s that the bar was lower for someone to have a following of devoted fetishists willing to spread the Good Word. People can call it 20/20 hindsight all they want, and some of it is, but looking at all these sacred cows for what they are through a modern lens is important. ESR was, and is, a tool. So much of what was written and highly regarded Back Then is just poorly written self-congratulatory nerdy fairytales with an arguable connection to reality.
rektide> "Edit: anyone care to speculate on why I'm so quickly at -2? I think there's perhaps some ideogical disagreement here but it's not clear what is so principally intolerable about a perspective against such persistent & vociferous negativity."
My speculation: you're reading a lot more heat into 'tptacek's comments than is there, and raising the temperature of the discussion unnecessarily. For example:
rektide> "My god man."
rektide> "I don't see how you can so callously disregard the most central message & imagery of this book. Yes, the individual ideas are ill spoken & poorly emphasized pieces of the key revelation"
I don't read 'tptacek as callously disregarding anything. Those examples are used to support esr's thesis, and to use them to test whether or not the thesis is supported is fair. And "callously disregarding" is pretty charged language.
rektide> "...and in your 9 comments here so far you have nothing but bile to spill about them perhaps rightly."
Again, "nothing but bile" is pretty charged and in my opinion an inaccurate description of his comments. The most arguably hyperbolic part I read from him is "I think a case can be made that this is one of the most overrated pieces of technical writing of the last 25 years, and even that is tempered by the preface "I think a case can be made that". There are places where he's pretty declarative in his language, but nothing I read is pejorative or an attack/
rektide> "But I cannot broke this hit campaign you are launching, when you are so persitently cantankerous & negative, without ever going near the actual inner truth that it did remarkably connect a generation with. I wish you could have some balance & some ability to see value, Thomas."
There are numerous places where 'tptacek acknowledges positive aspects of the piece:
tptacek> "What's notable about Cathedral is its timing; it did capture the zeitgeist of what was an important moment in the computing field"
And is actively engaged in furthering the discussion:
tptacek> "I appreciate getting called on this and forced to think more carefully about it. Thanks for the response!"
rektide> "The true-feeling parts of Raymond's article read to me like a document of, for lack of a better term, late 1990s programming thinking."
Again, pretty charged.
Anyway, that's what I think is likely the cause of the downvotes from reading your comment in the context of what 'tptacek has written. But of course, I'm not a mind reader.
(I'm just sharing my impressions because you asked and in the hope that you find them useful: I'm not likely to follow up in this thread because discussions of downvoting and tone are rarely useful. I question the utility of me even posting this response.)
Thanks. That was a lot of time spent; I appreciate your commitment to inquiry/discovery.
I still struggle with what I see as Thomas fighting a war of technicalities to avoid the actual point, and I tried to leave significant rope to allow his annoyances. And I don't know how to walk back what I still feel like is an injustice that hurts the truth so badly; don't see how temperance can be found against this. But I get my post's polarization better and see how the room is so happy to stick to Thomas here.
Cheers. I'm glad it was at least somewhat helpful. And you've got me responding when I said I wouldn't, and that's something :)
> "I still feel like is an injustice that hurts the truth so badly,"
In my opinion, this is what you should be supporting. Providing examples that support the truth goes a lot further. Off the top of my head, a question I have is: why is the cathedral/bazaar model better at describing open source than the alternatives 'tptacek puts forward, Gabriel's Worse Is Better or Zawinski's CADT model? (https://news.ycombinator.com/item?id=35941338) What in particular do you find useful and true in CatB that you believe 'tptacek is dismissing unfairly?
> "how the room is so happy to stick to Thomas here."
One doesn't have to agree with 'tptacek to think your comment was unhelpful. It's not zero-sum, or us-vs-them.
I'm not great at contentious discussions (particularly on internet forums). Your "I still struggle with..." resonates with me: it's something I find myself thinking in situations like this. The best I can do is try to catch myself, and figure out how to express what I'm trying to get across in ways that move the conversation in a constructive direction. Easier for me to say than to do.
Reminds me of the challenge of figuring out the rules to the Royal Game of Ur. It was surprisingly widespread and popular in the ancient world. There were sketched game boards and references to it in written sources, but hardly anyone had written down rules for it. It was held to be common knowledge.
I think for me, when I was first learning software development 12 years ago, I had heard about Linux and open source, but I didn't really understand how it operated or organized itself. I had seen Wikipedia appear in the early 00s and understood that distributed groups could develop something better than centralized entities (such as Microsoft's Encarta or Britannica's encyclopedia), but the analogy of a centrally planned cathedral, carefully coordinated, vs the organized chaos of a bazaar, was useful for me in understanding why software development is quite unlike other engineering disciplines, especially once software was augmented with the ability to update itself over the internet.
You can build software like a traditional engineering project, with a chief architect and lead engineer drawing up plans along with a team of people that map out all the specs ahead of time. But the internet changed everything. It made distributed coordination possible, and long-running, complex open source projects that could outlive or outgrow their founding team became achievable.
But the reality, today at least, is that that is not how big, successful open source projects get developed. They are very much handled as large engineering projects, with architects and lead engineers for each area or feature at least. You won't get a new feature into the Linux networking stack without discussing and reviewing it with the network maintainers, for example. And ultimately you'll still need Linus' approval if it is a major feature.
It's also important to note that some of the major open source projects are currently, in practice, collaborations of multiple large corporations. This is clearly how Linux works (the vast majority of contributions come from corporate players such as Microsoft, IBM, Intel, etc.), and the same goes for Clang and Kubernetes.
Firefox and the GNU suite of tools are probably the largest exceptions, along with a few big projects developed by a single company with few outside contributions (JVM, ElasticSearch, Grafana, etc).
> You won't get a new feature into the Linux networking stack without discussing and reviewing it with the network maintainers, for example
Sure, there are trusted experts in open source development too. Of course you have to get your PR discussed to get it checked in! How would the system be resilient to bugs or hacks or scope creep if not? Of course, you're always welcome to fork it and start your own thing.
There was a 2017 report on Linux kernel development [1]; it states that "Since 2005 and adoption of Git DVCS, 15,637 developers from over 1,400 companies have contributed to the Linux kernel. Since last year, over 4,300 devs from more than 500 companies have contributed to the kernel. Of these, 1,670 contributed for the first time, or about a third of the contributors."
That's not a centrally controlled and planned engineering project. Microsoft Windows is centrally controlled. Apple's macOS is centrally controlled. What are your odds of getting a kernel feature added to either of those unless you work at those companies? Zero.
Not only are many successful open source projects run as cathedrals, but often it turns out that hunchbacks are in charge of crucial parts of the cathedrals.
> Perhaps this is not only the future of open-source software. No closed-source developer can match the pool of talent the Linux community can bring to bear on a problem. Very few could afford even to hire the more than 200 (1999: 600, 2000: 800) people who have contributed to fetchmail!
> Perhaps in the end the open-source culture will triumph not because cooperation is morally right or software ``hoarding'' is morally wrong (assuming you believe the latter, which neither Linus nor I do), but simply because the closed-source world cannot win an evolutionary arms race with open-source communities that can put orders of magnitude more skilled time into a problem.
Problematically, I don't really see any discussion of where that large amount of skilled time originates, i.e., the economics of the situation. Even developers need to put food on the table.
Further, the largest open source projects --- Linux included --- are increasingly gatekept, and have their own priesthoods. And that's arguably been for the good!
But it's a different kind of gatekeeping to the increasingly exploitative and user-hostile world of commercial closed-source software. Perhaps "leaders" in a very literal sense would be a more apt term than "gatekeepers".
That's one way that the FOSS model will always have a built-in advantage: if anyone steering the ship tries to throw the users overboard then someone else can always fork the project as a last resort. This has actually happened in a few significant cases and is always there as a warning sign to anyone thinking of steering a new course for the wrong reasons.
I believe the same thing everyone else does about the superiority of open-source software development. I just don't think this article is a good explanation of where that superiority comes from. In large part, the importance of Linux's code being open is that it makes it safer for large corporations (Intel, Google) to invest time in it; that has nothing to do with a bazaar ethic!
I've come around to thinking of corporate open source as a way companies can partition what they want to compete on vs collaborate on. This is sort of a variation on the "commoditize your complement" strategy. Most of business is not zero sum, so I don't think it's particularly surprising that competitors agree on sharing the cost of maintaining infrastructure that isn't their moat or basis of market share.
Open source has gotten infinitely more industrial-commercial, but it has only further proven the Bazaar model, again and again and again.
Cathedrals develop fast & then quickly ossify & stagnate. Bazaars beget diversity & experimentation & exploration, which gets expanded upon & grown & reused, and can keep "feeling" at the edge.
Heck yes, software organizations realized their stale, dry, inward-focused industrial processes were being outclassed, were making them uncompetitive versus software models where the world could participate, suggest & expand the ideas.
That the corporate world has assimilated some of the genetics to remain competitive at all seems unremarkable to me, versus the story of how the Bazaar model has become necessary to be a longstanding player at all.
I am wondering if you can share a few words about your skepticism around Microsoft and Open Source.
You sound pragmatic and I might learn a lot from your views.
Corporations are not people and I still see some people within MS do some dumb stuff. That said, the company overall seems to have embraced Open Source quite fundamentally and genuinely.
First off, most of .NET is firmly and truly Open Source at this point. Azure is deeply in bed with Linux. Microsoft Edge shows that MS has learned that they can add value without having to take on the entire codebase.
And basically all software is filled with dark patterns these days, so you might be able to say Microsoft is far from the worst offender.
But personally I would choose to invest my time elsewhere.
---
I would probably rather use commercial software with telemetry enabled, as long as it was disclosed, than "open source" with intentional confusion around the issue.
>Very few could afford even to hire the more than 200 (1999: 600, 2000: 800) people who have contributed to fetchmail!
I'm willing to bet those "200" people are 1-3 core contributors and 197-199 people who once fixed some bug and have like 2 commits in the whole codebase, or changed a comment's message.
Also "Fetchmail" as some kind of high bar? Commercial companies have produced software with 10 to 100x the scope and user base (and quality). Including back then.
It is the case that the open source model has made Linux difficult to compete with (though that is in significant ways the result of commercial development; still, those corporate contributors were coerced into open source development by Linux's success, and would have done things differently in a VxWorks alternate history).
But that same open source model produced Fetchmail. Most startups could produce something comparable to Fetchmail, even if forced to use 1999 tools. The open source model didn't turn Fetchmail into an insurmountably dominant tool in its space; Fetchmail had fallen into irrelevance even before Google Mail made all of these kinds of tools irrelevant.
OK, they started using git only in ~2010, so earlier commits are less likely to be attributed correctly, but still, that's not even the same order of magnitude.
This is very off-topic and for that I apologize, but can anyone shed some light on why so many texts of this era use 2 backticks/graves as a left double quote and 2 apostrophes (neutral single quotes) as a double right quote?
I could understand using neutral single quotes all around if the double quote character wasn't available, but why the backticks? I get that they look more out of place with proportional fonts than fixed-width, but even when fixed-width was ubiquitous and even if this usage of glyphs looked symmetrical, it would've deviated from how the code points are defined, right? Or were the definitions so multi-valued (like the "hyphen/minus" character) that this was legitimate?
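Not a definitive answer, but one plausible explanation: that's the input convention of TeX, where two backticks and two apostrophes are how you type opening and closing double quotes, and the typesetter turns them into proper curly quotes. People who wrote a lot in that world seem to have carried the habit into plain text and HTML, where nothing converts it, so the raw backticks show through. A minimal LaTeX sketch (a hypothetical snippet, just to illustrate the convention):

    \documentclass{article}
    \begin{document}
    % In TeX-family input, `` opens and '' closes a double-quoted phrase;
    % the typeset output shows curly quotes, not literal backticks.
    Software ``hoarding'' comes out with proper opening and closing quotes.
    \end{document}

So in a fixed-width source file it was never really meant to be a symmetrical pair of glyphs; it was markup that a formatter was expected to translate.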
Worse is Better is a discussion of system design. Both parties represented in Worse is Better are implicitly cathedrals: both are singular actors, both even with the same value system (Simple, Correct, Consistent, Complete).
None of this embodies the Bazaar to me. The spirit of diversity, chance, chaos, & above all pluralism (the ability to get Perl-style post-modern in approaches) is largely absent from the WIB discussion.
The Bazaar is multisystem, multi-agent, in a way that most people building software don't tangle with. It's fundamentally different if you can encompass much more than yourself, than any one effort.
The Cathedral & The Bazaar is a book that should make these kinds of distinctions in thinking abundantly apparent. Few of us can ever really experience or witness the Bazaar. It's too vast. The book gives us an impressive tale, where we can begin to see wide exploration & reasoning occurring far beyond the scales that even the biggest, boldest, widest orgs on the planet can see.
Well, very clearly "The Right Thing" (i.e., the MIT school) is the approach used when building a cathedral, and I don't think it's that far of a leap (especially since anyone who could read this essay in 1999 was intimately familiar with the Cold War, in part a struggle of planned economies vs market economies; there's a bit of The End of History triumphalism here).
The cathedral and the bazaar may be a memorable metaphor, but the essential ideas (rationalism vs empiricism) are ancient.
The Cathedral and the Bazaar often comes up in discussions along with the Song of Sisyphus[0], a story of how the Sisyphus software became widely used within Google. Its use spread over bar drinks, completely organically. And Google had a very hard time deprecating it.
Right. If you want something that has perhaps a more profound and interesting description of the open source model (as Zawinski would put it, the "CADT model"), it's worth going back to Richard Gabriel:
Gabriel's essay is cited in Raymond's article. It's interesting to compare them; Raymond claims that Gabriel anticipated much of what his article was going to say, but I think they have markedly different takes about it.
"Worse Is Better" is elegiac, but it can also come across as a whine: "Those dumb users can't see Quality, they can only see Done, so they ignore my Beatific Solution in favor of some New Jersey Code that isn't nearly as good as what mine is going to be." Well, in a world where Done is a myth, Quality is a horizon; it's impossible to achieve Quality when the nature of your world is changing around you. The Right Thing is only Right as long as the Thing stays still.
At some point he explained that this isn't racist because he could imagine an alternative universe where black people weren't dumb and violent, and in that universe he'd have 0 problems with black people.
Eric S. Raymond has imaginary quantum black friends.
One of the most hilariously absurd things I've ever read.
Probably tread a little lightly when we're talking about "what happened" with Raymond, since my understanding is that he's dealing with some pretty significant health problems. I won't sugarcoat my opinions of his work, but neither would I wish those kinds of problems on anyone, nor would I want to be seen as taking shots at their inability or reluctance to respond while dealing with them.
It has been a while but I feel like the political and racial content is pretty in line with stuff that used to go on his blog; that just didn't use to be his main focus.
Old articles about open source help show how dynamic the patterns of bazaar-like software building are, and how difficult it is to grasp new phenomena in their entirety.
There are different types of bazaars. We have not seen them all yet. The "invasion" of corporate interests is much discussed, but corporations are just one of many types of legal entities that get involved; others include non-profit entities and the public sector.
There are two persistent drivers that act like heating elements, keeping the pot of bazaar-like behavior simmering: 1) software is important, and 2) software that is not closed and proprietary is possible and even necessary.
The involvement of the public sector in particular still seems quite nascent. Yet the concept of "digital public goods" is already here (there is even an initiative with that name) and will likely play a more important role going forward.
The perennial issue with open source is funding, but this reflects some peculiarities of the most decentralized type of bazaar.
In other words, while the bazaar has been happening in the town square for a few decades now, it keeps changing and, importantly, it keeps growing.
I did a post-doc in one of the top polymer chemistry departments. There, I noticed people tended to share the minimum amount of information. I then left this quote prominently displayed in the hall: "Alchemists turned into chemists when they stopped keeping secrets." - Eric S. Raymond
The thing I always notice about this is that nearly every piece of software in my life is Cathedral model.
A lot of amazing stuff gets invented by the community, but it generally never becomes modern polished software till a Cathedral builder gets their hands on it.
Users probably flagged your post because on HN a previous thread is only considered a dupe if it got significant attention, and that one didn't. (This is in the FAQ: https://news.ycombinator.com/newsfaq.html.)
Btw, we invited wallflower to repost it (which is part of how we get things into the second-chance pool: https://news.ycombinator.com/pool, explained at https://news.ycombinator.com/item?id=26998308), because we noticed that not only did that submission not get attention, the essay really hadn't been discussed much on HN over the years.