I think it's not just sad, but outright offensive. I understand the commercial value of a fully identifiable and tracked person online, but a line has to be drawn somewhere.
Every transparency feature is enabled by default. Privacy is slowly becoming a dirty word, something only people with something to hide would want. And those who fight for it are marginalized and made to look like villains.
It is really bizarre to witness. At some point this rush for user identities reminds me of Wall Street and its unregulated lust for profit regardless of the consequences.
Facebook is working on making that useless too by changing everyone's profiles to list only their @facebook.com address and redirecting all email from non-"friends" to the bit bucket labeled "other" in Facebook's message center.
Having your Facebook account locked is the gift that keeps on giving.
It gives you a legitimate excuse not to use Facebook that all your friends must accept unquestioningly, and if the reason is that you sent the Zuck messages he didn't want to receive, you get an awful lot of hipster cred as well.
Why should you give Facebook any time to respond regarding a feature that works as intended (but can, like most features, be used for other purposes)?
The report/block function is a classic in this regard: in the best case, it helps Facebook identify unwanted content. In many other cases, however, enough reports simply lead to the automatic disappearance of perhaps controversial but otherwise completely acceptable content. In general, freedom of speech should of course prevail, i.e., reporting and blocking content should be the exception, not the norm.
It took two months for their security team to respond to me, and in the end they dismissed my bug as a discrepancy in privacy settings (it isn't). For me at least, it's not really worthwhile trying to make an information leak a publicly known fact — nobody really cares.