Development action with informed and engaged societies
As of March 15, 2025, The Communication Initiative (The CI) platform is operating at a reduced level, with no new content being posted to the global website and registration/login functions disabled. (La Iniciativa de Comunicación, or CILA, will keep running.) While many interactive functions are no longer available, The CI platform remains open for public use, with all content accessible and searchable until the end of 2025.

Please note that some links within our knowledge summaries may be broken due to changes in external websites. The removal of the USAID website has, for instance, left many links broken. We can only hope that these valuable resources will be made available again soon. In the meantime, our summaries may still help by capturing key insights gleaned from those resources.

A heartfelt thank you to our network for your support and the invaluable work you do.
Watching the Watchmen: Content Moderation, Governance, and Freedom of Expression

Summary
"While social media platforms used to be perceived as providing a high level of protection to freedom of expression, they have increasingly restricted their community standards, often silencing minority voices."

This policy document sets out the first part of a two-pronged solution proposed by ARTICLE 19 and outlines their position on how social media platforms' content moderation can be regulated in a way that protects the right to freedom of expression and information. As ARTICLE 19 explains, "Governments around the world are seeking to regulate how social media companies address problematic content on their platforms, especially hate speech, harassment, and disinformation. But while well-intentioned, their proposals risk doing more harm than good, and they fail to tackle the real problem: the excessive power of a few huge companies whose business models are inherently exploitative." This first policy document, therefore, sets out how governments can ensure their efforts to regulate platforms respect users' freedom of expression, improve platforms' transparency, accountability, and decision-making, and avoid giving even greater power to the handful of companies that dominate the digital sphere.

The policy builds on ARTICLE 19's previous work in this area - in particular, their policies on intermediary liability and on companies' community guidelines/terms of service. Previously, their proposals have largely been based on what might be understood as light regulation: they argued that social media platforms should continue to benefit broadly from immunity from liability for the content of their users. This rule does not, however, prevent companies from being held accountable for failing to remove illegal content. At the same time, the model is predicated on companies operating their terms of service in a way that is compatible with international standards on human rights. ARTICLE 19 have defended this model because they believe it better protects the speech rights of users. It matters not only for social media platforms but also for a wide range of other internet actors who deliver infrastructure-level services, such as content-delivery network services and domain-name registrars.

Today, however, as ARTICLE 19 points out, "this paradigm appears unsustainable in the face of the scale of the biggest social media platforms and their consistent failure to appropriately address the criticisms that have been levelled against them - from Facebook's handling of the Rohingya crisis in Myanmar to YouTube's unfathomable position on 'hate speech' and the relentless attacks against women journalists on Twitter. We also believe that transparency should be a basic requirement that pervades everything companies do, accompanied by greater accountability and commitment to the protection of human rights."

In this policy, ARTICLE 19, therefore, examines whether the model they have been advocating for holds water in the face of the criticisms that have been made against the biggest social media platforms. The policy is divided into four parts:
  • First, it sets out some key terms, including 'dominant' social media platforms, 'content moderation', and 'self-regulation'.
  • Second, it outlines the applicable standards for the protection of freedom of expression online that should guide any legislative and policy efforts in this area.
  • Third, it examines the pros and cons of the various proposals that have recently been made around the world. ARTICLE 19 believes that many of the solutions currently being proposed are likely to miss the mark, entrench the "dominance" of the largest players, and be open to abuse by governments by giving them more control over platforms and content. Instead, they outline some solutions that would better guarantee the protection of freedom of expression. For example, they argue that the concentration of power in the hands of a few large platforms should be addressed primarily by pro-competition tools, that the effectiveness of stricter regulatory approaches has not been established, and that independent or multi-stakeholder governance models should be set up instead. They also point out that arguments for an overarching regulatory framework are problematic, that there is a need to address the lack of media diversity, and that the platforms' business model needs reform to comply with data-protection legislation.
  • Finally, the policy puts forward a revised position on the regulation of platforms and makes recommendations as to what minimum safeguards a regulatory framework governing the activities of social media platforms should include. In brief, the recommendations are:
    1. States should refrain from unnecessary regulation of online content moderation.
    2. Overarching principles of any regulatory framework must be transparency, accountability, and the protection of human rights.
    3. Conditional immunity from liability for third-party content must be maintained, but its scope and notice and action procedures must be clarified.
    4. General monitoring of content must continue to be prohibited.
    5. Any regulatory framework must be strictly limited in scope. Regulation should focus on illegal rather than "legal but harmful" content. Private-messaging services and news organisations should be out of scope. Measures should not have extraterritorial application.
    6. Obligations under any regulatory scheme must be clearly defined. These include, in particular, transparency obligations and internal due-process obligations.
    7. Any regulator must be independent in both law and practice.
    8. Any regulatory framework must be proportionate.
    9. Any regulatory framework must provide access to effective remedies.
    10. Large platforms should be required to unbundle their hosting and content-curation functions and ensure they are interoperable with other services.
ARTICLE 19 notes that the solution outlined in this policy - setting human rights standards for social media services - addresses only part of the problem. In the organisation's estimation, a few platforms currently dominate the social media markets, exploit their users, and violate the rights to privacy, free expression, and non-discrimination. The lack of viable alternatives locks users into these exploitative relationships. For this reason, to truly fix problems in the social media markets, ARTICLE 19 argues for the need to tackle the excessive market power of the few huge corporations that control them. This is dealt with in the second policy document "Taming Big Tech" (see Related Summaries, below).
Source
ARTICLE 19 website, April 28, 2023. Image credit: Mariana Coan