IANAL, but I would assume FB is in the clear legally, as they're not actively blocking anything in particular, merely providing the tool with which admins can do the blocking themselves.
"it should be noted that no ethically-trained software engineer would ever consent to write a DestroyBaghdad procedure. Basic professional ethics would instead require him to write a DestroyCity procedure, to which Baghdad could be given as a parameter."
It is not that simple. There's a continuum from writing a compiler to coding DestroyBaghdad. Usually people draw the line at "Is this tool's primary purpose evil?".
This tool is in a (darkish-)gray area IMHO. I don't think that an employer blocking "fuck" or racial slurs is necessarily evil, but it's not that good either.
Automatic filters are pretty bad at doing their job, so filtering is a lame approach to solving anything.
A tool that can block discussion of unionization is certainly a grey area. When you literally advertise "blocking discussion of unionization", I feel like you've left that grey area, though.
This is reminiscent of how Facebook used to allow advertisers to microtarget job and housing ads based on protected classes like race, age, and gender. They ended up reaching a settlement and being forced to change their targeting criteria due to civil rights laws. So I wouldn't be so sure that "merely providing the tool" is necessarily an entirely legal action.