I used to be an IT guy at a structural and civil engineering firm. Those were real professional engineers with stamps and liability.

As long as "SWEs" do not have stamps and legal liability, they are not real (professional) engineers, IMHO.

My point is that I believe to earn the title of "Software Engineer," you should have a stamp and legal liability.

We done effed up. This breach of standards might be the great filter.

edit: Thanks to the conversation down-thread, the possibly obvious solution is a Software Professional Engineer, with a stamp. This means full-stack is actually full effing stack, not any bullshit. It also means that only ~1% to ~5% of SWEs would be SWPEs, as in other engineering domains. A SWPE would need to sign off on anything actually important. What is important? Well, we figured that out in other engineering domains. It's time for software to catch the f up.




I actually looked into a PE for software a while back. Are you aware that this was indeed a thing, and that it was discontinued back in 2019 due to lack of participants? https://www.nspe.org/resources/pe-magazine/may-2018/ncees-en...


Ok, so overnight all programmers stop calling themselves Engineers. [1] What problem does that solve? I fix bugs all day, but I don't call myself a Software Doctor.

Frankly, whether software people call themselves engineers or not matters to pretty much no-one (except actual engineers who have stamps and liabilities).

Creating a bunch of requirements and liability won't suddenly result in more programmers getting certified and taking on liability. It'll just mean they stop using that title. I'm not sure that achieves anything useful. We'd still have the exact same software coming from the exact same people.

[1] for the record I think 'software engineer' is a daft title anyway, and I don't use it. I don't have an engineering degree. On the other hand I have a science degree and I don't go around calling myself a data scientist either.


That's fine, it just means that devs without stamps can't sign off on anything actually important. In real engineering, there is a difference between an Engineer and a Professional Engineer. The latter has a stamp.

I realize that this is nearly the opposite of our current environment. It is a lot more regulation, a lot more professional standards... but it worked for civil and structural, and those standards were written in blood.

Maybe what I am asking for is a PE for SWE: those people have stamps, and it would be really hard to get a SW PE. Anything deemed critical (like security) would, by regulation, require a SW PE stamp. [0]

Software did in fact eat the world. Why shouldn't it have any legal/professional liability like civil and structural engineering?

[0] In this case, "full stack" actually means full freaking stack = SWPE


>> That's fine, it just means that devs without stamps can't sign off on anything actually important

For some definition of important.

But let's follow your thought through. Who decides what is important? You? Me? The developer? The end-user? Some regulatory body?

Is Tetris important? OpenSSL? Notepad++? My side project on GitHub?

If my OSS project becomes important, and I now need to find one of your expensive engineers to sign off on it and take liability for it, do you think I can afford her? How could they be remotely sure that the code is OK? How would they begin to determine if it's safe or not?

>> Software did in fact eat the world. Why shouldn't it have any legal/professional liability like civil and structural engineering?

Because those professions have shown us why that model doesn't scale. How many bridges, dams etc are built by engineers every year? How does that compare to the millions of software projects started every year?

In the last 30 years we've pretty much written all the code, on all the platforms, in use today. Linux, Windows, the web, phones, it's all less than 35 years old. What civil engineering projects have been completed in the same time scale? A handful of new skyscrapers?

You are basically suggesting we throw away all software ever written and rebuild the world based on individuals prepared to take on responsibility and legal liability for any bugs they create along the way?

I would suggest that not only would this be impossible, not only would it be meaningless, but it would take centuries to get to where we are right now. With just as many bugs as there are now. But, yay, we can bankrupt an engineer every time a bug is exploited.


This has all been done before in mechanical, structural, and civil engineering. People die and then regulatory and industry standards fix the problems.

We do not need to re-invent the concepts of train engine, bridge, and dam standards again.

I mean, I guess we actually do. The issue is that software has not yet killed enough people for those lessons to be learned. We are now at that cliff's edge [0], [1].

Another problem might be that software's influence is on a far more hockey-stick-ish growth curve than anything we dealt with in mechanical, civil, and structural engineering.

Meanwhile, our tolerance for professional and governmental standards seems to be diminishing.

[0] https://news.ycombinator.com/item?id=39918245

[1] https://news.ycombinator.com/item?id=24513820

... https://hn.algolia.com/?q=hospital+ransomware


No, the world's infrastructure has never been rebuilt from scratch to higher standards, not in the last few thousand years. We have always built on what already exists, grandfathered in anything that seemed ok, or was important enough even if not ok, etc.

We often live in buildings that far predate any building code, or even the state that issued that code. We still use bits of infrastructure here and there that are much older than any modern state at all (though, to be fair, if a bridge has been around for the last thousand years, the risk that it goes down tomorrow because it doesn't respect certain best practices is not exactly huge).


There have long been multiple forms of professional software engineering in the aerospace, rail, medical instrumentation and national security industries, under standards such as ISO 26262, DO-178B/DO-178C/ED-12C, IEC 61508, EN 50128, FDA-1997-D-0029 and CC EAL/PP.

DO-178B DAL A (software whose failure would result in a plane crashing) was estimated in [1] to be writable at ~3 SLOC/day by a newbie and ~12 SLOC/day by a professional software engineer experienced in writing code to this standard. Writing software to DO-178B standards was also estimated in [1] to double project costs. DO-178C (the newer standard, from 2012) is much more onerous and costly.

I pick DO-178 deliberately because the level of engineering effort required, in security terms, is probably closest to that applied to seL4, which is stated to have cost ~USD$500/SLOC (adjusted for inflation to 2024).[2] This is a level higher than CC EAL7, as CC EAL7 only requires formal verification of the design, not the actual implementation.[3] DO-178C goes as far as requiring that every software tool used to automatically verify software has itself been formally verified; otherwise one must rely upon manual (human) verification. Naturally, formally verified systems such as flight computer software and seL4 are deliberately quite small. Scaling these costs to much larger software projects would most likely be prohibitive, as the complexity of a code base and its fault tree (all possible ways the software could fail) would obviously not scale in a friendly way.
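To make those numbers concrete, here's a crude back-of-envelope sketch in Python. Only the ~3-12 SLOC/day and ~USD$500/SLOC figures come from the sources cited here; the project sizes and the loaded day rate are hypothetical assumptions for illustration:

  # Back-of-envelope cost scaling for high-assurance software.
  # Rates from the sources cited above: DO-178B DAL A at ~3-12 SLOC/day,
  # seL4-style assurance at ~USD $500/SLOC (2024 dollars).
  # Project sizes and the loaded day rate are hypothetical assumptions.

  SEL4_COST_PER_SLOC = 500      # USD per line, seL4 estimate
  DAL_A_SLOC_PER_DAY = (3, 12)  # (newbie, experienced) productivity
  LOADED_DAY_RATE = 1_000       # USD per person-day, assumed

  for sloc in (10_000, 100_000, 1_000_000):  # hypothetical project sizes
      formal_cost = sloc * SEL4_COST_PER_SLOC
      best_days = sloc / DAL_A_SLOC_PER_DAY[1]   # experienced engineers
      worst_days = sloc / DAL_A_SLOC_PER_DAY[0]  # newcomers
      print(f"{sloc:>9,} SLOC: seL4-style ~${formal_cost / 1e6:,.0f}M; "
            f"DAL A effort {best_days:,.0f}-{worst_days:,.0f} person-days "
            f"(~${best_days * LOADED_DAY_RATE / 1e6:.1f}M-"
            f"${worst_days * LOADED_DAY_RATE / 1e6:.1f}M)")

Even at the cheaper DO-178B rates, a million-SLOC system works out to hundreds of person-years of effort, which is part of why formally assured systems are kept deliberately small.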

[1] https://web.archive.org/web/20131030061433/http://www.euroco...

[2] https://en.wikipedia.org/wiki/L4_microkernel_family#High_ass...

[3] https://cgi.cse.unsw.edu.au/~cs9242/21/lectures/10a-sel4.pdf


With much humility, may I ask, have you been exposed to the world of PEs with stamps and liability?

Do you see the need for anything like this in the software world, in the future?


Professional engineers have been stamping and signing off on safety-critical software for decades, particularly in the aviation, space, rail and medical instrumentation sectors. Whilst less likely to be regulated under a "professional association" scheme, there have also been two decades of similar stamping of security-critical software under the Common Criteria EAL scheme.

The question is whether formal software engineering practices (and their associated costs) will expand to other sectors in the future. I think yes, but at a very slow pace, mainly due to high costs. CrowdStrike's buggy driver bricking Windows computers around the world is estimated to have caused worldwide damages of some USD$10bn+.[1] Cheaper ways will be found to stop a buggy driver bricking Windows computers than requiring every Windows driver to be built to seL4-like (~USD$500/SLOC) standards.
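For a sense of that trade-off, a rough sketch in the same spirit. Only the ~USD$500/SLOC figure and the ~USD$10bn damage estimate come from the sources cited; the driver size and driver count below are made-up placeholders:

  # Crude comparison: verification cost vs. incident damages.
  # USD $500/SLOC is the seL4 estimate above; USD $10bn is the cited
  # CrowdStrike damage estimate. Driver size/count are hypothetical.

  COST_PER_SLOC = 500    # USD per line, seL4-like assurance level
  DAMAGES = 10e9         # USD, cited worldwide damage estimate

  driver_sloc = 50_000   # hypothetical size of one kernel driver
  one_driver = driver_sloc * COST_PER_SLOC
  print(f"One driver:  ${one_driver / 1e6:,.0f}M to verify "
        f"vs ${DAMAGES / 1e9:,.0f}bn in damages")  # looks like a bargain...

  n_drivers = 100_000    # hypothetical count of Windows drivers
  print(f"All drivers: ${n_drivers * one_driver / 1e9:,.0f}bn to verify")

Under those made-up numbers, verifying the one driver involved looks cheap next to the incident, but mandating the same rigour across the whole driver ecosystem would cost orders of magnitude more than the incident did - hence "cheaper ways will be found".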

If formal software engineering practices do spread over the years, the first software touched will be the simplest/easiest with the highest consequences of failure, such as Internet nameservers.

[1] https://en.wikipedia.org/wiki/2024_CrowdStrike_incident


As a preface: there's clearly a liability problem in modern software, which I'll get to later.

[pre-posting comment: I've moved the semi-rant portion to the bottom, because I realized I should start with the more direct issues first, lest the ranty-ness cause you to not read the less ranty portion :D]

Now, getting to the point: there is a real problem in that companies can advertise a product as doing a certain thing, and then attach a license agreement that says "we're not liable if it fails to do what we said it would do". But despite those licenses (which, to be clear, are a requirement for open source to exist as a concept), the law has generally found companies liable for unreasonable losses.

So the question is just how liable a company should be for a bug in their software (or hardware, I guess, depending on where you place the hardware vs firmware vs software lines) that can be exploited - in addition to the liability brought about by their own actions (again, despite the "we have no liability" EULAs, plenty of companies have ended up with penalties for bugs in their software causing a variety of different awful outcomes).

But you're going a step further: you're now saying that, in addition to liability for your own errors, you're also liable for other people causing failures through those errors, either accidentally or intentionally.

I am curious, and I would be interested in the responses from your Real Engineer coworkers.

Who is responsible if a bridge collapses when a ship crashes into it? An engineer can reasonably predict that that would happen, and should design to defend against it.

Let's say an engineer designs a building, and a person is able to bomb it and cause it to collapse with only a small amount of fertilizer. What happens to the liability if the only reason the plot succeeded was that they were able to break past a security barrier?

Because here is the thing: we are not talking about liability if a product/project/construction fails to do what it says it will do (despite EULAs, companies generally lose in court when their product causes harm even if there's a license precluding liability). The question is who is liable if a product fails to stand up to a malicious actor.

At its heart, the problem we're discussing is not about liability for "the engine failed in normal use", it's "the engine failed after someone poured sugar into the gas tank", not "the wall collapsed in the wind" it's "the wall collapsed after someone intentionally drove their truck into it", not "the plane crashed when landing due to the tires bursting", it's "the plane crashed when landing as someone had slashed the tires".

What you're saying is that a Professional Engineer signing off on a design is saying not only "this product will work as intended" but also "this product will work even under active attack outside of its design window".

That's an argument that seems to go either way: I don't recall ever hearing about a lawsuit against door manufacturers because burglars were able to break through their doors or locks, but on the other hand Kia is being sued over how trivial it is to steal their cars - and even then the liability claims seem to concern the cost of handling the increased theft, not the damage from any individual theft.

[begin ranty portion: this is all, I think, fairly reasonable, but it's much more opinionated than the now-initial point and I'm loath to just discard it]

I'm curious what/who you think is signing off and what they are signing off on.

* First off, software is more complex than more or less any physical product; the only solution is to reduce that complexity down to something as manageable as, say, a car. How many parts total are in your car? Cool, that's how many expressions or statements your program can have. And because software is not governed by direct physical laws and similar interactions, even then it's still more complex than a car.

* Second: no more open source code in commercial products - you can't use it in a product, because doing so requires sign-off by your magical software engineers who can understand products more complex, again, than any single physical product.

* Third: no more free development in what open source remains - signing off on a review now makes you legally liable for it. You might say that's great; I say it means no one is going to maintain anything for zero compensation and infinite liability.

* Fourth: no more learning development through open source contributions - as a variation of the above, every newbie who submits a change now brings liability, so you're not accepting changes from anyone you don't know and don't have reason to believe is competent.

* Fifth: OSS licenses are out - they all explicitly disclaim warranty and fitness for purpose, but you've just said the engineer who signs off on them is liable, which necessarily means that your desire for liability trumps the license.

* Sixth: free development tools are out - miscompilation is now a legal liability issue, so a GCC bug opens whoever signed off on the affected code to liability.

The world you're describing is one in which all software development, including free development, comes with liability matching that of designing cars or constructing buildings, both of which are much less complex and much more predictable than even a modest OSS project - and those fields come with significant cost-based barriers to entry.

The reason there are more software developers than civil or mechanical engineers is not that one is easier than the other; it's that you cannot learn civil or mechanical engineering (or most other engineering disciplines) as cheaply as software. The reason software generally pays more than those positions is that the employer takes on a bunch of the financial, legal, and insurance expenses required to do anything - the capital expenditure required to start a new civil or mechanical engineering company is vastly higher than for software, and the basic overhead for just existing is higher, which means employers don't have to compete with employees deciding to start their own companies. A bunch of this is straight-up capital costs, but the bulk of it is having sufficient liability protection before even the smallest project is started. At that point the company has insurance to cover the costs, so the owners are generally fine, but the engineer who missed something - not the people forcing time crunches, shortcuts, etc. - is the one who will end up in jail.

You've just said the same should apply to software: essentially the same as today - a company screws up and pays a fine or settlement - but now with lowered pay rates for the developers, and they can go to jail.

All because you have decided to apply a liability model that is appropriate to an industry where the things being signed off on have entirely self-contained and mostly static behavior, to a different industry where _by design_ the state changes constantly, so there is not, and cannot be, any equivalent "safety" model or system. Even in the industries you're talking about, where analysis is overwhelmingly simpler and more predictable, products and construction still fail. Yet now you're saying software development should be held to the same bar. When designing a building, you can be paranoid and make it more expensive by overbuilding - civil engineering price competition is basically a matter of "how much can I strip down the materials of construction" without it collapsing (noting that you can exactly model the entire behavior of anything in your design). Again, the software your new standards would require is the same as that required by the space and airline industries, which people already routinely berate for being "over priced".

You've made the point that there are engineers, and professional engineers, and the latter are the only ones who sign off on things. So it sounds like you're saying patches can only be reviewed by specific employees who take on liability for those changes, which means an OSS project now has to employ a Professional Engineer to review contributions, who becomes liable for any errors they miss (out of curiosity, if a bug requires two different patches working together to create a vulnerability, which reviewer is now legally liable?). Those professional engineers now have to sign off on all software, so who is signing off on Linux? You want to decode images? Hopefully you can find someone to sign off on that. Actually, it's probably best to use a smaller in-house product, or a commercial product from another company whose own professional engineers have signed off and who carries sufficient insurance. Remember to tell your employees not to contribute to any OSS projects, and don't release anything as OSS, because you would be liable if it goes wrong in a different product that you don't make money from (remember, your own Professional Engineers signed off on the safety of the code you released; if they were wrong, you're now liable if someone downstream relied on that sign-off).

This misses a few core details:

Physical engineering is *not* software engineering (and yes, as a title, "software engineer" is inaccurate in many/most cases, since it implies a degree of rigour absent from most software projects). Physical engineering does not employ the same degree of reuse and intermingling as software - the closest thing I can think of is engine swaps in cars, but that's only really doable because the engine is essentially the most complex part of the car anyway (at least in an ICE), and even then its interaction with the rest of the car is extremely constrained, predictable, and can be physically minimized. For civil/structural engineering it's even more extreme: large constructions (i.e. the complex cases) are not simply made by mashing together arbitrary parts of other projects - buildings fall into basically two categories: the exact same building with different dimensions and a different paint job, or entirely custom.

Physical engineering also works from a basically predictable state. The overwhelming majority of any physical design is completely static; the parts that are dynamic have a functionally finite and directly modelable set of states and behaviors, and those states and behaviors vary in predictable ways in response to constrained inputs. Despite this, many failures (if not most, though quantifying this would be hard, because there's also vastly more static engineering than dynamic) occur in the dynamic parts of these projects rather than the static portions. Software is definitionally free of more or less any static behavior; the entire reason for software is the desire for constantly changing state. A lot of failures in physical engineering are the result of failing to correctly model dynamic behavior, and again, software is entirely the part that Real Engineers have a tendency to model incorrectly.


I think in certain ways you are right: we need those standards and stamps. But colleges also need to be very clear and stop calling it software engineering, as it implies the expertise of an engineer.

Computer engineering degrees can sort of get away with this, as they require quite a few actual engineering classes, but I'm not sure whether those graduates can legally call themselves engineers.

It's hard because I believe certain titles are protected, like doctor, lawyer, and engineer, but last I checked it varied by state in the USA at least.

Sidenote: sometimes people get a computer science degree, or a cyber security degree, or even a cyber warfare degree. All three seem to be used interchangeably in the security field.

On one hand I get how formal protected titles help uphold standards and create trust, but they also reinforce the grip colleges have on the field and create more barriers to entry. Some states require 3 years of experience to get a state permit for certain activities when a federal permit is already needed, so in that case it's just more paperwork.

Imagine if we started demanding that the people who build houses - not the one who designed it, but the builders - be engineers, so we could trust their work and their ability to do the job in the correct manner.

At what point do we make that trade-off and pass legislation in the name of reliability and safety?


> As long as "SWEs" do not have stamps and legal liability, they are not real (professional) engineers

When and if that happens I’ll move to carpentry. Good luck. Tech is already full of *it. The only thing missing to make it even worse is stamps, plus a mafia-like org issuing the stamps and asking for contributions in return, as happens in medical care, law, bookkeeping, architecture and civil engineering.

The companies should instead certify the products that require certification and carry liability insurance.



