Re the viable system model, I can recommend Dan Davies, The Unaccountability Machine, which also touches on the Allende story, and Patrick Hoverstadt, The Fractal Organisation. My own VSM-inspired book with a modern complexity lens will be published next year.
Seconding the recommendation of The Unaccountability Machine; if anyone wants a taster of the book, patio11 did a great episode of his Complex Systems podcast where he interviews the author about it: https://www.complexsystemspodcast.com/episodes/dan-davies-or...
The problem with VSM is: where is the practical evidence of it working? If it's so great, surely there would be more uptake of it?
I mean, it sounds like a reasonable model, and I'm sure there is a lot of nuance to it, but getting it up and running at an existing enterprise would be very difficult, and a great risk without some solid case studies showing significant ROI.
> At least I got 'The purpose of a system is what it does' stuck in my head.
Which seems like a rather odd understanding of "purpose", divorced from any ordinary use of the word. What is the point of talking about "purpose" if you can't persuade the person intending that purpose to change their mind? Why not talk about "use" or "effect" instead?
[Beer] frequently used the phrase “The purpose of a system is what it does” (POSIWID) to explain that the observed purpose of a system is often at odds with the intentions of those who design, operate, and promote it. For example, applying POSIWID, one might ask if the purpose of an education system is to help children grow into well-rounded individuals, or is it to train them to pass tests? “There is after all,” Beer observed, “no point in claiming that the purpose of a system is to do what it constantly fails to do.”
More specifically, when a system that you naively expect should have the purpose of doing X is found to actually be doing Y, do not automatically assume that it means the system is failing to fulfill its purpose. Instead, look around and see if there are people who are benefitting from it doing Y instead of X, and who are maintaining the state where the system does Y instead of X. That would mean that the true purpose of the system - what it is being deliberately made to do - is Y instead of X.
Why not just use the correct word? It seems weird to drag in an unrelated term just to redefine it. The word "purpose" divorced from intent seems to only have the utility of confusing the reader/audience.
I am pretty confused about why this conversation is happening, i.e., why this pithy little saying isn't self-explanatory; it's supposed to be witty commentary, after all.
For example, we make a system of speed limits to make roads safer, and we have a law enforcement system. We notice later that the roads are not safer, but that the police are vigorously enforcing the law and collecting the ticket profits from doing so. We ask: why does this system exist? What is the purpose of it? The naive answer is to make roads safer to drive on. The witty, savvy, cynical answer is: …
I.e., the reason the system still exists in the way it does is that its real "purpose" is to be a revenue-generating scheme for the police, regardless of the intent of whoever set it up in the first place, if indeed anyone did.
It exists in the way it does for multiple reasons in tension with one another.
If it was just trying to generate revenue for the police, it would be better at it.
Ditto, if it were just trying to make roads safer, or if its main objective was full compliance. (Which are related objectives, but not the same thing).
The reality is that political pressures exist which mean neither full compliance nor the engineering interventions to make roads much safer are palatable to US voters, but there are pressures in the other direction which demand that something must be done. Which is how we end up where we are.
FYI, this POSIWID concept has been heavily thought about, researched, reasoned, etc. within the cybernetics (or whatever you want to call it) community.
I am not going to do it justice, but the bottom line is that systems get complex very, very fast (factorial, n!, complexity). Cyberneticians (or Stafford Beer at least) reason that we should just treat these systems as black boxes and examine their inputs/outputs, since any attempt to explain or rationalize the inner workings of the system itself (as you are trying to do) will never go well (again, because of the complexity).
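For a sense of the scale being talked about, here is a minimal sketch (plain Python, my own illustration rather than anything from Beer) of how factorial growth in possible interaction patterns outruns any attempt to enumerate a system's internals:

    from math import factorial

    # n interacting parts vs. n! possible orderings of those parts,
    # i.e. the factorial blow-up referred to above.
    for n in (5, 10, 15, 20, 25):
        print(f"{n:2d} parts -> {factorial(n):,} permutations")

By 25 parts you are already past 10^25 arrangements, which is the practical argument for observing only the black box's inputs and outputs.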
Sounds like a witty aphorism would be useful in order to express the real meaning he was trying to get at, seeing as the word you're demanding doesn't seem to exist.
Cybersyn is just a dashboard if you ignore all the communist aspects he incorporated. It was a dashboard plus organic feedback from production itself. If you see production as just uneducated workers and managers, then you fail to understand Cybersyn. The human component is the focus; the rest is just communication improvement.
It should become clear to everyone who reads his work that "management theorist" Stafford Beer can best be characterized, without any doubt whatsoever, as a charlatan.
Cybernetics came out of the Macy conferences [0] and this is where one needs to go, in order to establish context. I also highly recommend Norbert Wiener's biography "Dark Hero of the Information Age" [1] as a good introduction to one of the greatest geniuses of this age, easily eclipsing Shannon and von Neumann.
Principia Cybernetica [2] is another good resource.
>It should become clear to everyone that reads his work that "management theorist" Stafford Beer can best be characterized without any doubt whatsoever as a charlatan.
Yep.
Over the years I've found a few litmus tests for that sort of thing: unclear or incomplete explanations, intentional vagueness, weird formatting, "meta" anything, "new language", incomprehensible diagrams. One or two and you're Stephen Wolfram; three or more and you're completely full of shit. Beer's book somehow manages to hit every single one in just a single page. Incredible!
If you claim something grand but can't explain your point clearly, you are almost always full of shit. If Susskind can explain the combined work of centuries of geniuses in The Theoretical Minimum, then you can explain your bullshit in a paragraph.
The tragedy of operations research's declining impact continues to annoy me, and I mention it whenever I think it makes sense contextually, usually when people trivialise complex decision points across unrelated competing pressures and goals but talk as if some (false) dichotomous choice exists.
I didn't do O/R, but family members did at Royal Holloway, as did some friends I met online, from Cardiff. I have used some of the online decision-support tools as a rank amateur. It feels like a specialisation layering stats over computer science and information management, with links to process design, process control, systems description and the structural understanding of complex systems and processes.
The intersection of O/R, computing and socialism (state planning) is fascinating. I don't personally see Beer as a tragic figure. He made conscious choices. Allende and all of Chile suffered the consequences of violent opposition to the planning.
Fascinating how Beer's radical vision was both so ahead of its time and fundamentally constrained by the technological limitations of the 1970s. Imagine what he could have done with today's distributed systems and real-time data processing. The tragedy isn't that Cybersyn failed, but that we're still catching up to his core insights about organizational complexity.
Today you have Manufacturing Execution Systems integrated with other software. It is equivalent to Stafford Beer's vision, but limited to one company and without the nice sci-fi room.
The counterfactuals of his ideas are very interesting.
Chilean socialism didn't work (even ignoring the coup, it couldn't actually run the economy), but the reasons why it failed, or why other forms of it could have worked, bear consideration.
In short, it failed for the same reasons central planning tends to, considering modern understandings of complexity theory and ideas suggested in books such as Seeing Like A State. Just having a dashboard and greater access to information is still subject to the same forms of hubris as is general central planning, even if these shortcomings are better anticipated.
Yet many of the innovations in this project bear similarity to how large enterprises CAN run well, such as with ERP and business analytics in the private sector, and modern intelligence and command-and-control systems in the military.
So in all, his ideas didn't completely work, but in the ways they did work they were very early.
Winners write the history books. There's whether or not a system fails, and there's who it fails for.
Because we can all see a heck of a lot of fail in today's system, but those failing or being failed don't tend to get much of a platform to write about it.
Disappointed in this article. I was hoping to get some introduction to Beer's methods, especially compared to naive centralized planning, but it's mostly about how Beer didn't live up to his potential.