Hacker News

Hey, some news came out that might give you insight into why the fact checking might not always work:

https://www.cnbc.com/2025/01/10/mark-zuckerberg-says-biden-p...

I sort of don't want to get into specific moderation examples, especially with Covid, because so many aspects of Covid are contentious. I'd just point out that this example has nothing to do with third-party fact checkers: it was direct influence from a government on Meta, and that can and will still happen even with Zuckerberg's policy changes. In fact, I'd argue it's now even more likely. If Meta had a robust policy on fact checking and removal of harmful content, that would be a weapon to push back against government interference.

It's also not necessary for third-party fact checking to "always work" for it to have value; there is some false-positive rate that's still acceptable. Even with something like Covid, there is unambiguously false information with likely harmful consequences ("you can cure Covid by drinking a bottle of bleach", for example) that would be worth taking down by fact checkers.


If fact checking doesn't necessarily "always work", then you should not call it fact checking. That is already a great reason why Community Notes are better.

And, don't you see the connection? If the government can tell Facebook to censor content, they can also tell them what facts to publish.

Community Notes come from the community, not from Facebook, so they're not directly influenceable by the government. The government could of course tell Facebook to censor certain Community Notes, but the community would notice. With the fact checkers, who are in Facebook's hands, the community wouldn't know.


> If fact checking doesn't necessarily "always work", then you should not call it fact checking. That is already a great reason why Community Notes are better.

You can still call it fact checking, just as you can still call air travel by that name even though a very small percentage of planes crash. Suppose the fact checkers had a 0.1% false-positive rate and a 0% false-negative rate: for every 1,000 pieces of reported content they review that should be left up, they take one down, and they never leave up anything that should be taken down. Wouldn't we say that the system, broadly speaking, works and has value? Even though it doesn't always work?

Do you think Community Notes will "always work"?

> Community Notes is from the community, not from Facebook, so not directly influencable by the government.

It wouldn't work like this. Meta still own the platform, profit from it, and are responsible for it.

Governments will always want to talk to Meta about the material they host, because Facebook reaches millions of people. Sometimes the governments will have a valid case, like when material on Facebook can be linked to inciting genocide[1]. And sometimes they won't, and will be trying to pressurise Facebook for political or self-serving reasons. The point is, those governments will not simply be satisfied and go away if Meta throw their hands in the air and say "Sorry, but we fired the fact checkers and have no control over the material on our platform. It's up to the community, talk to them."

If Meta are trying to solve the government interference problem, the solution is to strengthen their fact checking and moderation systems. Then they'd be able to push back and say that actually no, they are confident that the content on Facebook is appropriate and can credibly stand by it. Abdicating responsibility is just going to get them into more trouble.

[1] https://en.wikipedia.org/wiki/Rohingya_genocide#Facebook_con...
