A handful of social media and app store platforms have emerged as leaders in transparency, publicly disclosing how often and why they comply with takedown requests.
This statement was originally published on eff.org on 31 May 2018.
Facing increased demands from governments to remove user content, purportedly in the name of combating hate speech and extremism, a handful of social media and app store platforms – including the Apple App Store, Google Play Store, and YouTube – have emerged as leaders in transparency, publicly disclosing how often and why they comply with takedown requests, and notifying users when their posts are targeted for removal, an Electronic Frontier Foundation (EFF) report found.
Other major platforms, notably Facebook and Instagram, have failed to adopt truly meaningful notice practices and policies that inform users of crucial details, such as which governments have come knocking and why, EFF said in its Who Has Your Back: Censorship Edition report, released today.
Prior Who Has Your Back reports, which EFF has published annually since 2011, have focused on government demands for user data. But this year, EFF focuses squarely on how major technology companies are responding to government-requested censorship.
What people can say on the Internet is increasingly being regulated, not by governments, but by social media companies whose content moderation policies and community standards are often opaque and seemingly arbitrary. Users deserve to know if a government ordered their Facebook or Twitter post removed, how to appeal censorship decisions, and what caused their speech to be flagged by government officials. The stakes are high, especially in unstable political environments or for those living under repressive regimes. Requests to take down the content or block the pages of journalists, activists, or dissidents are often a prelude to further government targeting.
“In a time when governments around the world are putting growing pressure on online platforms to crack down on speech they consider undesirable, transparency in content moderation is needed more than ever to protect free expression online,” said Jillian C. York, EFF Director for International Freedom of Expression.
EFF evaluated publicly available policies at 16 companies and awarded stars in five categories: transparency on legal takedown requests, transparency on platform policy takedown requests, providing meaningful notice, allowing appeals, and limiting the geographic scope of takedowns. All of the categories are new this year.
To earn a star in the first two categories, companies must regularly publish detailed information about government takedown requests, for instance in their transparency reports.
“If and when companies do comply with government requests to remove content or suspend accounts, these decisions must be transparent to their users and the general public,” said Gennie Gebhart, EFF Researcher.
The other categories evaluate whether a company offers a dispute process, notifies users of takedowns and suspensions, and reasonably minimizes the geographic scope of removals.
Two of the platforms receiving five stars were app stores: the Apple App Store and the Google Play Store. Because these companies have only one type of content—apps—to moderate and operate on a smaller scale than, say, Facebook, it’s easier for them to implement the requirements of this year’s report, but their choices are still good ones.
The other company receiving five stars was YouTube. YouTube’s transparency report (published by parent company Google) goes above and beyond to disclose not only the number of government takedown requests overall and by country, but also the details and outcomes of individual requests.
With only one star each, Facebook—and its subsidiary Instagram—lag behind YouTube and other large social networks and technology companies. Besides not meeting our requirements for detailed reports on all government-requested content removal, the companies do not commit to providing meaningful notice of takedowns for all categories of content or an appeals process to dispute them. Facebook took some small steps to increase transparency recently when it made its internal moderation guidelines more public. But the fact that the social media platform with the most users still denies them more comprehensive notice and appeals is cause for concern.
“It’s encouraging that, with some notable exceptions, more and more companies and platforms are adopting the principles of transparency, notice, appeal, and limited scope with regard to government-ordered censorship,” said Nate Cardozo, EFF Senior Staff Attorney. “Without these best practices it’s too easy for the tech giants to misuse their power by silencing vulnerable speakers and also obscuring how censorship takes place and who demanded it.”