“YouTube keeps deleting evidence of Syrian chemical weapon attacks”
“Azerbaijani faces terrorist propaganda charge in Georgia for anti-Armenian Facebook post”
“Medium Just Took Down A Post It Says Doxed ICE Employees”
These are just a sampling of recent headlines relating to the regulation of user-generated online content, an increasingly controversial subject that has civil society and Silicon Valley at loggerheads. Through Onlinecensorship.org and various other projects—including this year’s censorship edition of our annual Who Has Your Back? report—we’ve highlighted the challenges and pitfalls that companies face as they seek to moderate content on their platforms. Over the past year, we’ve seen this issue come into the spotlight through advocacy initiatives like the Santa Clara Principles, media such as the documentary The Cleaners, and now the latest report by Professor David Kaye, the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression.
Toward greater freedom, accountability, and transparency
The Special Rapporteur’s latest report is the first from the UN to focus on the regulation of user-generated content online, and it arrives at a time of heated debate over the impact of disinformation, extremism, and hateful speech. Addressing the obligations of both State actors and ICT companies, the report aims to find user-centered approaches, aligned with human rights law, to content policy-making, transparency, due process, and governance on platforms that host user-generated content.
Recognizing the complexities of governing such platforms at a time of conflicting interests and views about freedom of speech, the Special Rapporteur proposes a “framework for the moderation of user-generated online content that puts human rights at the very center” and, throughout the report, examines the sometimes-conflicting laws, regulatory frameworks, and other governance models that seek to strike a balance in corporate moderation practices. The report focuses on freedom of expression while acknowledging “the interdependence of rights, such as the importance of privacy as a gateway to freedom of expression.”
Noting that “few companies apply human rights principles in their operations,” the Special Rapporteur argues that companies should incorporate the UN Guiding Principles on Business and Human Rights into their work (the Manila Principles on Intermediary Liability similarly call for their adoption). The Guiding Principles, endorsed by the UN Human Rights Council in 2011, provide a standard for States and companies to prevent and address the risk of adverse impacts on human rights.
The Special Rapporteur looks closely at the ways in which both government regulation and companies’ own content moderation practices can limit freedom of expression for users of platforms. The report delves into specific areas of concern around content standards (vague rules; hateful and abusive speech; the lack of context in adjudicating content decisions; real-name requirements and anonymity; and disinformation), as well as the processes and tools companies use to moderate content (automated flagging, user and trusted flagging, human evaluation, actions taken on accounts, notification given to users, and appeal and remedies).
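These processes form a pipeline, and the report’s later recommendations on notice, appeal, and remedy assume that each step leaves a record. As a purely illustrative sketch (our own, not drawn from the report; every name here is hypothetical), a platform might capture a single adjudicated item like this:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class FlagSource(Enum):
    """How an item enters the moderation queue (mirroring the categories above)."""
    AUTOMATED = "automated"              # classifiers, hash-matching, etc.
    USER_REPORT = "user_report"          # ordinary user flag
    TRUSTED_FLAGGER = "trusted_flagger"  # government or NGO "trusted flagger" programs


class Action(Enum):
    NO_ACTION = "no_action"
    REMOVE_CONTENT = "remove_content"
    RESTRICT_ACCOUNT = "restrict_account"
    SUSPEND_ACCOUNT = "suspend_account"


@dataclass
class ModerationDecision:
    """One record per adjudicated item, so notice and appeal are built in rather than bolted on."""
    content_id: str
    rule_violated: str                   # the specific policy clause, not just "community standards"
    flag_source: FlagSource
    reviewed_by_human: bool              # automated flags still warrant human evaluation
    action: Action
    decided_at: datetime
    user_notified: bool = False          # notification to the user is part of due process
    appeal_id: Optional[str] = None      # set if the user challenges the decision
    appeal_outcome: Optional[Action] = None  # remedy, if the appeal succeeds
```

Recording the specific rule invoked and the source of each flag is also what would make the kind of granular transparency reporting discussed below possible.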
The report further looks at the various modes of transparency (or lack thereof) undertaken by companies. Echoing our recent Who Has Your Back? research and a submission from our colleagues at Ranking Digital Rights, the Special Rapporteur notes that companies disclose “the least amount of information about how private rules and mechanisms for self- and co-regulation are formulated and carried out.” As we have previously noted, most companies avoid transparency when it comes to their own proprietary content rules and practices.
Recommendations
The Special Rapporteur’s report—which additionally cites Onlinecensorship.org research and EFF’s own submission—puts forward a set of robust, even radical recommendations for companies (as well as a slightly more standard set of recommendations for State actors).
Private norms have created unpredictable environments for users, who often don’t know or understand how their speech is governed on private platforms. At the same time, national laws like Germany’s NetzDG fragment the inherently global internet. The Special Rapporteur argues that human rights standards can provide a framework for holding companies accountable to users worldwide.
Specifically, the Special Rapporteur recommends that terms of service and content policy models should move away from a “discretionary approach rooted in generic and self-serving ‘community’ needs” (indeed, companies all too often throw around the term “community” to refer to billions of diverse users with little in common) and adopt policy commitments that enable users to “develop opinions, express themselves freely and access information of all kinds in a manner consistent with human rights law.”
Furthermore, companies should develop tools that “prevent or mitigate the human rights risks caused by national laws or demands inconsistent with international standards.” In a closing set of recommendations, the Special Rapporteur argues that companies should:
- Practice meaningful transparency: Company reporting about State requests should be supplemented with granular data on the types of requests received and the actions taken, with specific examples provided wherever possible (see our recent Who Has Your Back? report for where popular companies rank on this, and the rough sketch after this list for what such granular data might look like). Transparency reporting should also cover government demands made under terms of service and account for public-private initiatives such as the EU Code of Conduct on countering extremism.
- Implement safeguards to mitigate risks to freedom of expression posed by the development and enforcement of their own policies. Companies should engage in consultations with civil society and users, particularly in the Global South. Such consultations could help companies recognize how “seemingly benign or ostensibly ‘community-friendly’ rules may have significant, ‘hyper-local’ impacts on communities.”
- Be transparent about how they make their rules. They should at least seek comment on their impact assessments and should clearly communicate to the public the rules and processes that produced them.
- Ensure that any automated technology employed in content moderation is rigorously audited, that users can challenge content actions through a robust appeals mechanism, and that remedies are available for the “adverse impacts” of decisions.
- Allow for user autonomy through relaxed rules in affinity-based closed groups, the ability to mute or block other users or specific types of content, and even the ability to moderate their own content in private groups.
- Develop transparency initiatives that explain the impact of their various moderation tools. A social media council, an idea detailed at length by Article 19, “could be a credible and independent mechanism to develop [such] transparency.”
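To make the first recommendation above more concrete, here is a rough sketch of what “granular” transparency data could look like (our illustration, not a reporting format proposed in the report; all field names are hypothetical):

```python
from collections import Counter
from dataclasses import dataclass


@dataclass(frozen=True)
class TakedownRequest:
    """One request to restrict content, whatever its origin."""
    country: str           # jurisdiction the request came from
    requester: str         # "court_order", "government_tos_demand", "trusted_flagger", "user_flag"
    legal_basis: str       # statute cited, or the terms-of-service clause invoked
    content_category: str  # e.g. "hate_speech", "extremism", "defamation"
    action_taken: str      # "removed", "geo_blocked", "no_action"


def transparency_rollup(requests: list[TakedownRequest]) -> Counter:
    """Aggregate requests into the granular counts a public report could disclose.

    Breaking the numbers down by country, requester type, legal basis, content
    category, and outcome is what distinguishes granular reporting from a
    single headline total.
    """
    return Counter(
        (r.country, r.requester, r.legal_basis, r.content_category, r.action_taken)
        for r in requests
    )
```

A breakdown along these lines would also surface government demands made under terms of service, which the recommendation above calls on companies to report.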
Lastly, the Special Rapporteur argues that this is “a moment for radical transparency, meaningful accountability and a commitment to remedy in order to protect the ability of individuals to use online platforms as forums for free expression, access to information and engagement in public life.”
We couldn’t agree more. This is the time for companies to rethink content regulations, consider the implications of the status quo, and work toward creating an environment in which users are able to freely express themselves. States should repeal laws criminalizing or restricting expression, and refrain from establishing laws requiring the proactive monitoring or filtering of content, as well as models of regulation where government agencies become the arbiters of lawful expression.
Finally, as the Special Rapporteur argues, it’s time for tech companies to recognize that the authoritative global standard for ensuring freedom of expression on their platforms is human rights law, and to re-evaluate their content standards accordingly. Companies must become more transparent and accountable to their users, and ensure that the right to due process and remedy is enshrined in their policies.