Meta, the parent company of Facebook, Instagram, and Threads, revealed a major change in its content moderation policies on Tuesday. CEO Mark Zuckerberg announced that the company is eliminating its fact-checking program in favor of a more community-driven approach. In a video posted to social media, Zuckerberg explained that the move was designed to reduce censorship and embrace free speech on Meta’s platforms.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said. The shift comes just as President-elect Donald Trump prepares for a second term in office, with Meta executives citing the incoming administration’s more supportive stance on free speech as a key factor in their decision.
Under the new system, Meta will replace its third-party fact-checkers with a program called “Community Notes,” similar to the one used by X (formerly Twitter). Community Notes allows users to flag posts they believe to be misleading or lacking context. If the note receives broad support from a cross-section of users, it will be attached to the post, giving others more context. Joel Kaplan, Meta’s global policy chief, explained that this shift gives the platform’s users the power to provide context for each other.
“We think that’s a much better approach rather than relying on so-called experts who bring their own biases into the program,” Kaplan said. “We want to make sure that discourse can happen freely on the platform without fear of censorship.”
The fact-checking program, which Meta launched after Trump’s first election in 2016, relied on independent third-party organizations to monitor content for misinformation. However, Kaplan admitted that this approach had become too restrictive and politically biased.
“It has become clear there is too much political bias in what they choose to fact-check,” Kaplan said. “We went to independent, third-party fact-checkers, but it became too politicized, with fact-checkers deciding what content should be flagged based on their own views.”
Meta’s decision to end the fact-checking program comes at a time when the company is grappling with the challenges of content moderation on its platforms. Zuckerberg acknowledged that the company’s moderation had grown increasingly restrictive over the years, partly due to pressure from governments and legacy media outlets.
“The recent elections feel like a cultural tipping point toward once again prioritizing speech,” Zuckerberg said, signaling that Meta’s new direction is aimed at allowing more open expression on its platforms.
In addition to the change in fact-checking, Meta will also shift its approach to some controversial topics like immigration and gender. “What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas,” Zuckerberg said. “I want to make sure that people can share their beliefs and experiences on our platforms.”
Meta also announced that it would be moving its trust and safety and content moderation teams from California to Texas, a relocation the company says will help ensure more neutral and balanced moderation. “It’s important to have teams that are less influenced by the political environment,” Zuckerberg explained.
This change in content moderation is widely seen as a response to the political climate, particularly as Trump prepares to return to office. Kaplan made it clear that the company’s new direction was motivated by the incoming administration’s support for free expression.
“We have a new administration coming in that is far from pressuring companies to censor and is more supportive of free expression,” Kaplan said. Meta is also looking to work with the Trump administration to protect free speech, especially in the face of international pressures on U.S. tech companies.
Meta’s shift is also viewed as part of a broader trend toward user-driven moderation, mirroring the approach taken by X under Elon Musk. The policy overhaul highlights the growing tension between protecting free expression and managing misinformation.
This new approach is expected to roll out first in the U.S., with plans to expand the system over time.