
If your system is objectively right, but also objectively causing accidents with humans... well, you won't fix the humans.



> If your system is objectively right, but also objectively causing accidents with humans.

It's not causing accidents in these cases. The humans are.


I mean the causality sense of "cause", not the blame sense: accidents that would not have happened without the system.



