Revamping Content Moderation at Meta

Published On Sat Jan 18 2025

Source: Meta Pivots on Content Moderation, by Zvi Mowshowitz

There’s going to be some changes made. Sections: Out With the Fact Checkers; What Happened; Timing is Everything; Balancing Different Errors; Truth and Reconciliation; Fact Check Fact Check; Mistakes Will Be Made; Where We Go From Here.

Mark Zuckerberg’s Decision

Mark Zuckerberg has decided that with Donald Trump soon to be in office, he is allowed to care about free speech again. He has also decided it is time to admit that what was called ‘fact checking’ meant he had for years been running a giant, hugely biased, trigger-happy and error-prone left-wing censorship and moderation machine, one whose standards were massively out of touch with ordinary people and that carried out automated takedowns of often-innocent accounts. Zuckerberg later talked more about this, and many related and unrelated things, on the Joe Rogan podcast. He says many fun things, like that most companies need ‘more masculine energy’ to balance their feminine energy.


Overhauling Meta's Moderation

Zuckerberg is going to overhaul Meta’s entire moderation and censorship structure, replacing it over several months with something akin to community notes. He’s going to move standards back in line with actual community standards, move his content moderation teams from California to Texas, and push back against censorship worldwide, highlighting Europe and Latin America. The current review process reportedly started when one of Zuckerberg’s own posts got throttled over concerns about medical content, and then snowballed from there. How did it all go so wrong? Zuckerberg tells the story on Rogan: he took the complaints about misinformation and the need for fact-checking as sincere, then after he hired people for the job the slippery slope took over, and before long they were censoring things that are in the mainstream discourse.

Challenges Faced

Of course the correct response here is to fire those people and hire the people you meant to hire, but hiring is hard, the people you wanted were suddenly hard to find, and the existing processes were not producing them. I know of a number of nonprofits that had an unpleasant shock waking up to this. Some said “wait, no, this isn’t what we’re doing” and had internal drama as they parted with illiberal employees and survived. Some did not. But I think it took a really unusual level of institutional leadership and courage to say in 2020, “What? No. That’s not what we’re doing here. If you want to do that, leave.” And the orgs where it did happen tended to keep quiet about it, so they wouldn’t become a target of outrage.

Shift in Policies

Now that the Biden administration is on its way out and the vibes have shifted, it’s time for a change. Zuckerberg explicitly says he waited until after the election (partly because the middle of one is an awkward time for major changes) and that he was deciding largely based on the vibe shifts.

Moving Forward

Zuckerberg is going to focus his filters on illegal and high-severity violations, rather than scanning for any violation at all, and only act on low-severity violations if and when someone reports them. I worry that if this is the policy, then various people will decide it is their job to use their AIs (or just their eyes) to go searching for violations to report, but at least that process keeps humans in the loop at every step. And since ‘the filters make mistakes,’ he’s going to dial them back and require a lot more confidence than before. Yes, this is a trade-off, and he discusses it more on Rogan, but there’s a lot of ‘what the hell were we doing before?’ here; it is essentially an admission that there was no way to appeal to humans when those mistakes happened, or that those answering the appeals were insane. He’s going to ‘reduce the number of innocent people’s posts and accounts that we accidentally take down.’ And accounts? Yeah, this kind of thing used to happen a lot, with no way to fix the mistakes.
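To make the tiered policy concrete, here is a minimal sketch of the decision logic as described: automated action only for illegal or high-severity content above a high confidence bar, human review for uncertain cases and for reported low-severity content, and no proactive action otherwise. The labels, threshold, and function names here are all hypothetical illustrations, not anything Meta has published.

```python
from dataclasses import dataclass

# Hypothetical severity labels and confidence bar; Meta's real taxonomy
# and thresholds are not public.
HIGH_SEVERITY = {"illegal", "terrorism", "child_safety"}
CONFIDENCE_BAR = 0.95  # "require a lot more confidence than before"

@dataclass
class Classification:
    label: str          # e.g. "spam" or "terrorism"
    confidence: float   # model confidence in the label, 0.0 to 1.0

def triage(c: Classification, user_reported: bool) -> str:
    """Route a post under the tiered policy described above."""
    if c.label in HIGH_SEVERITY:
        # Proactive scanning still applies, but automated removal
        # requires high confidence; uncertain cases go to humans.
        return "remove" if c.confidence >= CONFIDENCE_BAR else "human_review"
    if user_reported:
        # Low-severity content is acted on only once someone reports it,
        # which keeps a human in the loop at every step.
        return "human_review"
    return "leave_up"   # no proactive enforcement for low-severity content

# Example: a borderline high-severity flag escalates instead of auto-removing.
print(triage(Classification("terrorism", 0.80), user_reported=False))  # human_review
print(triage(Classification("spam", 0.99), user_reported=False))       # leave_up
```

Note the design choice the sketch encodes: raising the confidence bar trades missed violations for fewer wrongly removed innocent posts, which is exactly the error balance Zuckerberg says he is now choosing.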

The first reaction I saw to this change was ‘I’ll believe this change matters when [X]’s account, which was banned without explanation, gets restored.’ I don’t see any indication here of a plan to undo the mistakes of the past. He says they’re going to ‘bring back more civic posts,’ because people want to see such content again. Why not let people choose which content they want to see?

Conclusion

These are all highly welcome changes, especially the move to Community Notes and the generally vast raising of the bar before things get censored, even if he basically admits that he was previously ‘going with the flow’ and bowing to pressure, and is now bowing to a different kind of pressure. I hate to kick even this man while he’s making great changes, and I want to be clear that my primary response is that these are centrally great changes. It also seems like we need more before we can properly move on.


The Role of Fact-Checking

Nate Silver writes of The Rise and Fall of Fact Checking. He points out that a lot of the bias in fact-checking lies in selecting what to ‘fact check,’ which usually means targeting unresolved or unresolvable claims, because if a claim were easily resolvable you wouldn’t need a fact check. And ‘fact checking’ often ended up being a way to use an argument from ‘no evidence’ to call things the fact-checkers disliked ‘misinformation,’ with the whole enterprise often aiming primarily to scaffold and support a narrative.
