Meta, the parent company of Facebook and Instagram, recently announced a series of major changes to its content moderation policies. Most notably, it decided to abandon its third-party fact-checking system in favor of a "Community Notes" model, similar to the one introduced by Elon Musk on X. According to Zuckerberg, this choice aims to restore greater freedom of expression on Meta's platforms.
However, the decision has raised concerns about the possible spread of misinformation and harmful content. In parallel, Meta announced the closure of its diversity, equity, and inclusion programs, citing recent changes in the US legal landscape.
User reactions were not long in coming: migrations to alternative platforms such as Bluesky and Mastodon have increased sharply. Many users appear concerned about a potential decline in content quality and the spread of misinformation on Meta's platforms.
The consequences of these decisions could be manifold: an increase in misinformation, greater polarization of debates, and a migration of users to other platforms. Furthermore, abandoning fact-checkers could damage Meta's reputation in the eyes of users and regulators.

The Community Notes model, adopted by both Meta and X, represents an attempt to involve users directly in content moderation. However, the effectiveness of this system has yet to be demonstrated, and doubts remain about whether it can guarantee a safe and reliable online environment.