Watching the Watchmen: Content Moderation, Governance, and Freedom of Expression

Summary
"While social media platforms used to be perceived as providing a high level of protection to freedom of expression, they have increasingly restricted their community standards, often silencing minority voices."
This policy document sets out the first prong of a two-pronged solution proposed by ARTICLE 19 and outlines their position on how the regulation of social media platforms' content moderation can protect the right to freedom of expression and information. As ARTICLE 19 explains, "Governments around the world are seeking to regulate how social media companies address problematic content on their platforms, especially hate speech, harassment, and disinformation. But while well-intentioned, their proposals risk doing more harm than good, and they fail to tackle the real problem: the excessive power of a few huge companies whose business models are inherently exploitative." This first policy document, therefore, sets out how governments can ensure their efforts to regulate platforms respect users' freedom of expression, improve platforms' transparency, accountability, and decision-making, and avoid giving even greater power to the handful of companies that dominate the digital sphere.
The policy builds on ARTICLE 19's previous work in this area - in particular, their policies on intermediary liability and on companies' community guidelines/terms of service. Previously, their proposals have been largely based on what might be understood as light regulation. They argued that social media platforms should continue to benefit largely from immunity from liability for the content of their users. This rule does not, however, prevent companies from being held accountable for failing to remove illegal content. At the same time, this model is predicated on companies operating their terms of service in a way that is compatible with international standards on human rights. ARTICLE 19 have defended this model because they believe that it better protects the speech rights of users. The model matters not only for social media platforms but also for a wide range of other internet actors that deliver infrastructure-level services, such as content-delivery networks and domain-name registrars.
Today, however, as ARTICLE 19 points out, "this paradigm appears unsustainable in the face of the scale of the biggest social media platforms and their consistent failure to appropriately address the criticisms that have been levelled against them - from Facebook's handling of the Rohingya crisis in Myanmar to YouTube's unfathomable position on 'hate speech' and the relentless attacks against women journalists on Twitter. We also believe that transparency should be a basic requirement that pervades everything companies do, accompanied by greater accountability and commitment to the protection of human rights."
In this policy, ARTICLE 19 therefore examines whether the model they have been advocating for holds water in the face of the criticisms levelled against the biggest social media platforms. The policy is divided into four parts:
- First, it sets out some key terms, including 'dominant' social media platforms, 'content moderation', and 'self-regulation'.
- Second, it outlines the applicable standards for the protection of freedom of expression online that should guide any legislative and policy efforts in this area.
- Third, it examines the pros and cons of the various proposals that have recently been made around the world. ARTICLE 19 believes that many of the solutions currently being proposed are likely to miss the mark, entrench the "dominance" of the largest players, and be open to abuse by governments, handing them more control over platforms and content. Instead, they outline some solutions that would better guarantee the protection of freedom of expression. For example, they argue that the concentration of power in the hands of a few large platforms should be addressed primarily through pro-competition tools, that the effectiveness of stricter regulatory approaches has not been established, and that independent or multi-stakeholder governance models should be set up instead. They also point out that arguments for an overarching regulatory framework are problematic, that there is a need to address the lack of media diversity, and that the platforms' business model needs reform to comply with data-protection legislation.
- Finally, the policy puts forward a revised position on the regulation of platforms and makes recommendations as to what minimum safeguards a regulatory framework governing the activities of social media platforms should include. In brief, the recommendations are:
- States should refrain from unnecessary regulation of online content moderation.
- Overarching principles of any regulatory framework must be transparency, accountability, and the protection of human rights.
- Conditional immunity from liability for third-party content must be maintained, but its scope and notice and action procedures must be clarified.
- General monitoring of content must continue to be prohibited.
- Any regulatory framework must be strictly limited in scope. Regulation should focus on illegal rather than "legal but harmful" content. Private-messaging services and news organisations should be out of scope. Measures should not have extraterritorial application.
- Obligations under any regulatory scheme must be clearly defined. These include, in particular, transparency obligations and internal due-process obligations.
- Any regulator must be independent in both law and practice.
- Any regulatory framework must be proportionate.
- Any regulatory framework must provide access to effective remedies.
- Large platforms should be required to unbundle their hosting and content-curation functions and ensure they are interoperable with other services.
Source
ARTICLE 19 website, 28 April 2023. Image credit: Mariana Coan.