In a surprise move, Meta CEO Mark Zuckerberg has announced that the company will be ditching its fact-checking program on Facebook and Instagram.
Instead, the platforms will rely on user-generated “community notes” to combat misinformation.
The change comes just before US President-elect Donald Trump takes office, and it is likely to reshape how content is moderated on two of the world's largest social platforms.
What’s Behind the Change?
Zuckerberg claims that fact-checkers have been too biased and have "destroyed more trust than they've created."
He argues that the current system has been used to shut down opinions and silence people with different ideas.
However, this move has been met with criticism from many, who argue that it’s a step backwards in the fight against misinformation.
Community Notes: The New Way Forward
So, how will community notes work? Essentially, users will be able to add notes to posts they believe contain false or misleading information.
These notes will be visible to everyone who views the post, giving other users context to judge the content's credibility for themselves. It's a system similar to the one used by Elon Musk's X (formerly Twitter).
A Shift to the Right?
The timing of this announcement has raised some eyebrows, given that it comes just before Trump takes office. Some have accused Meta of trying to curry favor with the new administration, which has been critical of the company’s content moderation policies in the past.
Meta’s newly appointed chief of global affairs, Joel Kaplan, has acknowledged that the company’s partnerships with third-party fact checkers were “well intentioned at the outset but there’s just been too much political bias in what they choose to fact check and how”.
What Does This Mean for Users?
So, what can you expect from these changes? For starters, you may see more misinformation and harmful content in your feeds. Zuckerberg has acknowledged that the new policy could mean the company catches less harmful material, but he maintains it's a necessary step to promote free expression.
The company will also be adjusting its automated systems to focus on checking for illegal and “high-severity” violations, such as terrorism and child exploitation.
A Retreat from Responsibility?
Not everyone is happy with Meta’s decision. The Real Facebook Oversight Board, an outside accountability organization, has called the move “political pandering” and a “retreat from any sane and safe approach to content moderation”.
The group argues that the changes will make it easier for harmful content to spread on the platforms.
The Future of Content Moderation
As Meta moves forward with these changes, it's clear the company is taking a markedly different approach to content moderation. While some may see this as a step in the right direction for free speech, others are concerned about the potential consequences for the information ecosystem.
One thing is certain: the way we interact with content on Facebook and Instagram is about to change in a big way.