Meta drops fact-checking, loosens its content moderation rules
Meta, the parent of Facebook, Instagram, and WhatsApp, today announced a major overhaul of its content moderation policies, removing some of the guardrails it had put in place over several years in response to criticism that it had helped spread political and health misinformation.
In a blog post called “More speech, fewer mistakes,” Meta’s new chief global affairs officer, Joel Kaplan, outlined changes in three key areas to “undo the mission creep,” as he put it:
Meta is ending its third-party fact-checking program and moving to a Community Notes model over the coming months, similar to the crowdsourced system used by X;
It is lifting restrictions around “topics that are part of mainstream discourse,” instead focusing enforcement on “illegal and high-severity violations” in areas like terrorism, child sexual exploitation, drugs, fraud, and scams; and
Users will be encouraged to take a “personalized” approach to political content, clearing the way for considerably more opinion and slant in people’s feeds, tailored to whatever they want to see. Yes, Meta is leaning into letting you create the echo chamber you’ve always wanted.
The moves are significant in part because they precede a new presidential administration taking charge in the U.S. later this month. Donald Trump and his supporters have signaled an interpretation of free speech that favors encouraging a much wider range of opinions.
Facebook has been in the crosshairs of their criticism throughout the past few years, not least because one of the people it once banned from its platforms in the name of content moderation was Trump himself.
Meta’s content moderation provisions were put in place and honed over a number of years following political and public criticism of how the company helped spread election misinformation, bad advice on COVID-19, and other controversial content. The fact-checking program, for example, was first developed in 2016, with Meta working with third-party organizations on the heels of accusations that Facebook was being weaponized to spread fake news during the U.S. presidential election.
This eventually also led to the formation of the Oversight Board, more moderation, and other levers to help people control what content they saw and to alert Meta when they believed it was toxic or misleading.
But these policies have not sat well with everyone. Some critics argue they are not strong enough; others believe they lead to too many mistakes; still others say the controls are too politically biased.
“Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact check and how,” Kaplan noted. He added that Meta’s “over-enforcing” of its rules led to “limiting legitimate political debate and censoring too much trivial content.” Meta now estimates that one to two out of every 10 items it censored were “mistakes” that didn’t actually violate its policies.
One could argue that some of today’s changes are designed to curry favor with the new people in power, but some of them have been in the making for a while.
In the last year or so, Meta’s commitment to its own rules has started to fray. Last month, Nick Clegg, the company’s outgoing policy chief, gave a mea culpa interview in which he described the company as having overdone its moderation. And the Oversight Board has never proven to be as effective as it had hoped to be.
Now, with accountability shifting along with the political tides, Meta seemingly wants to take a more hands-off approach.
“Meta’s platforms are built to be places where people can express themselves freely. That can be messy. On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression,” Kaplan wrote.
For its part, the Oversight Board said it “welcomes the news that Meta will revise its approach to fact-checking, with the goal of finding a scalable solution to enhance trust, free speech and user voice on its platforms.” The board added that it would work with Meta to shape its approach to “free speech in 2025.”
The developments come at a time of change at Meta itself.
CEO Mark Zuckerberg has signaled a stronger interest in working with, not battling, the Trump administration. Yesterday, the company appointed three new board members, one of whom is UFC head Dana White, a supporter of the incoming president. And last week, Meta replaced its longtime public affairs head, Nick Clegg, promoting Kaplan into the role. Kaplan had already been a part of the policy staff and was known as one of Meta’s most prominent Republicans.
And in a signal of how it is trying to step out of its own echo chamber, Meta’s making another shift, noted Kaplan: “We will be moving the trust and safety teams that write our content policies and review content out of California to Texas and other US locations.”
source: https://techcrunch.com/2025/01/07/meta-drops-fact-checking-and-loosens-its-content-moderation-rules/