Technology
Meta Ditches Fact-Checking for ‘Community Notes’
The social network will rely on users, not fact-checkers, to add notes to posts.
In a dramatic shift just days before President-elect Donald Trump’s inauguration, Meta announced it would dismantle its third-party fact-checking program and introduce a new user-driven model called “Community Notes.”
This change, revealed on January 7, 2025, aims to reaffirm the company’s commitment to “free expression” while addressing growing concerns over content moderation.
Meta, which owns Facebook, Instagram, and Threads, has relied on third-party fact-checkers since 2016 to evaluate the accuracy of posts. However, the company now argues that the system led to too many errors and unnecessary censorship.
“We’re going to change how we enforce our policies to reduce the kind of mistakes that account for the vast majority of censorship,” Meta said in a statement.
Chief executive Mark Zuckerberg elaborated in a video, stating that the company would return to its “roots,” focusing on reducing mistakes and simplifying policies.
While acknowledging that the new approach would allow more speech, he also conceded that it might lead to fewer harmful posts being flagged.
“It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down,” Zuckerberg said.
This move comes amid growing criticism from both sides of the political spectrum. Conservative figures, including Trump and his allies, have long complained about perceived bias in Meta’s moderation practices.
In an interview on Fox & Friends, Meta’s global policy chief, Joel Kaplan, explained that the previous fact-checking system had become too politically biased.
“We’ve had too much political bias in the current system,” he said, echoing concerns that the program had disproportionately impacted conservative voices.
The new “Community Notes” model, similar to the system used on Elon Musk’s X (formerly Twitter), will allow users to add context to posts, offering a less intrusive form of content correction.
The system is set to roll out in the coming months, with Meta promising to phase out full-screen warnings and demotions of fact-checked content. Instead, posts will be labeled with a subtler indicator that additional information is available for those who wish to see it.
In another notable shift, Meta is moving its trust and safety teams from California to Texas, a decision Zuckerberg said would help reduce concerns about political bias in content moderation.
The changes amount to a bold gamble: while they promise to encourage more free speech, they also raise questions about how effectively Meta can curb the spread of misinformation.