The post pushes the view that CrowdStrike's engineers should be held responsible. That's one way of looking at it.
But there is an entire chain of responsibility here. The hospital IT department that chose to use a computer instead of dumber technologies. The IT department that chose to run Windows. The security team that chose to purchase CrowdStrike's software, possibly without vetting them.
If a software license has clear terms stating that there is no warranty, and the buyer buys it anyway, why shouldn't this be a caveat emptor situation? If they didn't like it, they could negotiate indemnity clauses, go to a competitor, or not use the software at all.
Don't get me wrong, I absolutely think that CrowdStrike did a shitty thing. But maybe they already disclosed that in their license agreement, and the purchasers decided to overlook that to their own peril. After all, running kernel-mode software is equivalent to handing over the keys to your computers. Maybe negotiating/selling software with liability clauses should be more normalized?
Not only protections, but financial incentives. As a society and civilization, we can't depend on accidental heroes to be the ones who safeguard and correct the course of the entire group. Especially when we idolize money and power, and have a workforce that is manipulated across so many dimensions (healthcare, non-competes, historical admissions, union/anti-union, gig economy, salary/hourly).
Imagine, to use the article's example, an anesthesiologist making you sign a ToS that says they're not responsible if they fuck up. It would not be enforceable. Licenses and terms aren't blanket protections.
At some point, some sales rep came and sold software to customers engaged in exactly the kind of business the license stated the software was not fit for.
The error is on the individual IT departments that allowed CrowdStrike direct access to their systems without checks and balances. Had they just delayed the update by one day, they would not have been affected. Clearly the CIOs allowed an external company to have the power to flip the off switch. Then again, we now see the wisdom of Xi and Putin. Diversity is good!
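For what it's worth, that kind of guardrail isn't exotic. Here's a minimal sketch of a client-side soak-delay policy; the names (`SOAK_PERIOD`, `should_apply`) are invented for illustration, and real endpoint agents don't necessarily expose such a knob:

```python
from datetime import datetime, timedelta, timezone

SOAK_PERIOD = timedelta(days=1)  # hold every new update back for one day

def should_apply(published_at: datetime, now: datetime | None = None) -> bool:
    """Apply an update only after it has soaked in the wild for SOAK_PERIOD."""
    now = now or datetime.now(timezone.utc)
    return now - published_at >= SOAK_PERIOD

# Example: an update published two hours ago is still held back.
published = datetime.now(timezone.utc) - timedelta(hours=2)
print(should_apply(published))  # False: inside the soak window
```

The tradeoff, of course, is that for threat-signature updates a one-day delay also means a one-day window of reduced protection, which is presumably why vendors push these updates immediately.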
The phrase "Responsibility without authority" comes to mind. Who ultimately has the authority to set the target quality and direction of software projects? In my experience this is usually not the programmers...
Yesterday a friend of mine was stuck in the hospital all day. Their computer system went down, and that led to a delay of care. Delays in care kill people.
It seems like the author isn't considering the reality of the professionals they are comparing.
Coders of a CrowdStrike-sized company have layers of people-exercising-company-authority above them. Coders are employees who operate within the confines they are given.
Those confines get shaped by what filters down from shareholders/investors and executives.
Conversely, anesthesiologists and structural engineers are commonly deferred to by the people they interact with. Their expertise carries real weight. In many cases they have a degree of ownership in the business getting paid.
Management is ultimately responsible. Licensed engineers are already responsible, and we still get chronic institutional malfeasance: Boeing, Enron, Ford, Purdue, etc.
Programmers are seldom the ones driving the conditions that are the root cause of operational failures, but they are often the ones who bear the consequences.
Jailing the guard at the gate for not stopping an attack, eh? How about “corporate criminal liability extends to every employee at director and higher pay grades (management and otherwise)”?
In my experience, the software engineers who get promoted are often the ones who crank out as many features as possible with almost no regard for the defect rate. If you work at a more deliberate pace, you can be regarded by product or management as slow and unproductive. Real incentives exist in software development that push for quantity over quality.
This is not analogous to the failure of an anesthesiologist to properly sedate a patient. This is a process failure. Clearly the proper amount of QA was not in place for whatever reason and they need to re-examine their approach. Of course their process needs extra diligence given the cost of failure.
It would be a real shame if an individual is punished for this instead of examining the process and system of incentives that led to this failure.
> There will be consequences, no? Billion dollar lawsuits that cause real and severe damage?
Those lawsuits will be directed at the corporation, which neatly insulates those who dictate corporate action.
The author seemingly wants to extract the coders from the company structure and have them bear the full weight of the consequences, instead. This would insulate not only the decision makers but the corporation as well.
If there was a gauge like this Evil <---> Ethical, the author's suggestion would land well to the left.
I think the issue was testing. Many organizations don't do enough, or do only the bare minimum, which is already insufficient since it happens in some controlled lab that does not mimic the real world.
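To make that concrete: the cheapest "real world" test is just feeding hostile input to your parser and insisting it fail cleanly. A toy illustration follows; this is an invented file format, not CrowdStrike's actual channel-file layout:

```python
import struct

MAGIC = b"CHNL"  # toy magic bytes, purely for illustration

def parse_channel_file(data: bytes) -> list[int]:
    """Parse a toy 'channel file': magic, a count, then count uint32 entries.
    Reject malformed input instead of trusting it."""
    if len(data) < 8 or data[:4] != MAGIC:
        raise ValueError("bad header")
    (count,) = struct.unpack_from("<I", data, 4)
    if len(data) != 8 + 4 * count:
        raise ValueError("truncated or oversized payload")
    return list(struct.unpack_from(f"<{count}I", data, 8))

# The 'real world' test: empty, all-zero, and truncated files must all be rejected.
for bad in (b"", b"\x00" * 16, MAGIC + struct.pack("<I", 99)):
    try:
        parse_channel_file(bad)
    except ValueError as e:
        print("rejected:", e)
```

None of this requires a fancy lab; it just requires assuming your inputs will eventually be garbage.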
A lot of comments here blaming management and executives, but bear in mind: who was the person who wrote the buggy code? Who was the person who pushed the release button? Sure, broken process, poor release engineering practices, whatever. And who, ultimately, implemented those release engineering processes?
The engineer who caused all this has surely already been fired, but that's just scratching the surface. I hope this is brought to court as criminal negligence.
Why do we have operating systems? So that programs cannot cause this sort of problem.
Who required the operating system to be undermined in the name of security? It wasn't that programmer.
Who created the regulatory environment where such decisions actually look like a good idea? It sure wasn't any programmer.
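That first point is easy to demonstrate. In user mode, the OS confines a crash to the offending process; the sketch below dereferences a null pointer in a child process, and the parent carries on. Kernel-mode code, like an endpoint security driver, gets no such containment:

```python
import subprocess
import sys

# A user-mode crash: the OS confines the fault to a single process.
crash = "import ctypes; ctypes.string_at(0)"  # dereference a null pointer
result = subprocess.run([sys.executable, "-c", crash])
print(f"child exited with {result.returncode}; the parent is unharmed")
```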
What does real engineering look like?
When the decision was made to use a Li-ion battery instead of a RAT in the 787, the battery was designed to be fail-safe. There was no way it could fail.
The engineer who created the enclosure for it in the airplane started with the assumption that it would fail catastrophically, and that the enclosure would need to contain it.
It's a good thing he did, because that battery did fail, and the enclosure kept the plane from failing.
This analogy breaks down when you actually think about its real-world applications.
Structural engineers are held criminally liable if a bridge they signed off on collapses. OK, now why do we have the bridge in the first place? Because city planning segregated residential and commercial zones across a body of water. And who was responsible for that decision? The city planning council?
So if a bridge collapses and hundreds of people die you put the blame on the politician who signed off on zoning laws, instead of the engineer who designed a broken bridge?
What kind of nonsense are people in the comments suggesting? It shouldn't even be remotely considered.
What kind of BS is it to entertain any semblance of programmers deserving consequences? If programmers received EXECUTIVE-level pay, then yes, they'd deserve the ramifications, but they do not. Executive pay is ridiculous, with already-inflated salaries, ridiculous bonuses, and stock buybacks. Whoever gets paid more is the one who deserves the consequences.
That's the WHOLE GODDAMN REASON THEY ARE IN THAT POSITION.
Shit happens. What makes this one so interesting is the speed at which the shit tsunami spread. Try rolling out a little more slowly so less of the world gets affected.
For those affected, consider it a relatively painless test of your DR systems, and a chance to fix 'em up.
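A minimal sketch of what "rolling out slowly" can look like: deterministic ring-based bucketing, so a bad update surfaces on a small slice of the fleet before it reaches everyone. All names here are invented for illustration:

```python
import hashlib

RINGS = [0.01, 0.10, 0.50, 1.00]  # fraction of the fleet per rollout stage

def host_bucket(host_id: str) -> float:
    """Map a host to a stable point in [0, 1) via hashing."""
    digest = hashlib.sha256(host_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def in_rollout(host_id: str, stage: int) -> bool:
    """True if this host is included at the given rollout stage."""
    return host_bucket(host_id) < RINGS[stage]

# Stage 0 hits roughly 1% of hosts; a bricking update surfaces there first.
fleet = [f"host-{i}" for i in range(10_000)]
print(sum(in_rollout(h, 0) for h in fleet), "of", len(fleet), "hosts in ring 0")
```

Because the bucketing is a hash of the host ID, each host's ring assignment is stable across updates, and expanding the rollout is just moving to the next stage.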