ARTICLE 19 reports on how weak content moderation can transform social media platforms into hotbeds of ‘disinformation’, ‘hate speech’, and discrimination.
This statement was originally published on article19.org on 23 June 2022.
Social media platforms can be a space for free expression, democratic debate, and participation. But weak content moderation can transform them into hotbeds of ‘disinformation’, ‘hate speech’, and discrimination. This is especially concerning in post-conflict countries, where tensions between groups can erupt into violence.
ARTICLE 19’s new research investigates how content is moderated on major social media platforms in three post-conflict countries – Bosnia and Herzegovina, Indonesia, and Kenya – with a particular focus on ‘harmful content’ (such as ‘hate speech’ and ‘disinformation’).
Our research has found that social media companies don’t listen to local communities. They also fail to consider context – cultural, social, historical, economic, political – when moderating users’ content.
This can have a dramatic impact, online and offline. It can increase polarisation and the risk of violence – as when Facebook allowed incitement of genocide against the Rohingya in Myanmar.
Bridging this gap between global companies and local communities is therefore vital to ensuring sustainable peace and democracy in post-conflict countries.
Read our country reports
Global problem, local solution
ARTICLE 19, together with our research participants, has proposed a solution: local Coalitions on Freedom of Expression and Content Moderation.
These coalitions would enable consistent engagement between social media platforms and local civil society organisations, helping to bridge the gap between global tech giants and local communities.
Our research provides more information on these coalitions. For each country, we outline practical steps for creating them, along with detailed risk assessments and potential members.
“A local coalition on freedom of expression and content moderation would bring together social media platforms and local civil society organisations. This would represent a win–win.
It would provide social media platforms with a one-stop shop where they could easily hear the concerns of local civil society.
And it would give a voice to local civil society actors affected by ‘disinformation’, speech that incites discrimination, and content-moderation decisions that overlook the local context.”
– Pierre François Docquir, Head of Media Freedom, ARTICLE 19
What are we asking social media companies to do?
- Comply with international standards on freedom of expression and content moderation
- Ensure content moderation reflects the local context
- Publish comprehensive transparency reports
- Participate in new, independent, self-regulatory mechanisms
- Be easily accessible to local stakeholders
For more information on each of these action points, please refer to ARTICLE 19’s statement.