It's a moral failing to be engaged in the collection of personal data of this type and scale without having a solid, well articulated, well communicated, robust and redundant plan for managing security and mitigating the impact of security issues.

It's an ethical failure that this industry has so many examples of the above.




I don't think this is a moral question.

Whether or not this information should be collected at all may be a moral question, but how it's secured is about technical competence.

I also think we should keep in mind that, even if it had been well secured, there could still have been a breach. Would that have been less bad? The result would be the same.


The moral issue isn't one of technical competence, but rather of having the integrity to perform the appropriate due diligence required of a company handling such sensitive information.

No security professional is going to argue that you can or will prevent every vulnerability from being exploited. However, when you leave a critical vulnerability open for months on end, you knowingly and unnecessarily expose yourself, and any parties associated with you (by choice or otherwise), to a level of risk that is unacceptable.

If this were a 0-day exploit, then the conversation would be different. If their execs hadn't sold off so much stock at such a suspect moment, then the conversation would be different. If the IT department had appropriately begun remediating the vulnerability within a respectable timeframe but the system had already been exploited, then the conversation would be different.


In my view, collecting and storing that much information is presumptively immoral. It creates a public hazard, in the same vein as stockpiling explosives or toxic chemicals.

There is, however, a degree of respect for that hazard, demonstrated in concrete safety practices, that can override that presumption.

I don't think Equifax has demonstrated that respect. To a layman's view, not many companies do. It's possible that the amount of respect necessary for a hazard as large as the one Equifax created is too onerous for a for-profit entity to realistically implement, but I don't know that there's a fundamental reason that sufficiently paranoid engineering practices couldn't make this moral.


    Would that have been less bad?
Quite plausibly yes, with better systems design.



