Platform governance, regulation, and content moderation
Research on multiple fronts of content moderation, and InternetLab's participation in various venues and discussion forums on platform regulation.
Ensuring the right to freedom of expression on the internet is a complex exercise, and it is not just a matter of deciding which content should or should not remain on digital platforms. The enjoyment of this right depends on several conditions, such as transparent rules, policies that are clear and understandable to the community, and equal access and opportunities for participation for everyone. This is not a simple task, and there is no ready-made, fixed answer to the challenges that moderating online expression presents.
To begin with, the sheer volume of content produced imposes significant logistical difficulties. Given the enormous number of posts in text, images, and video, how can we keep this space from becoming a breeding ground for hate speech and disinformation? At the same time, how can we design a regulatory arrangement that does not risk censorship by granting disproportionate powers to state bodies or to the platforms themselves? In addition, disparities in internet access, along with the distortions and forms of discrimination against historically minoritized populations that originate offline but carry over into the digital ecosystem, are problems that demand intentional, coordinated, and multi-sectoral action from the many actors with influence in this environment.
Approaching the issue from a broad perspective and conducting research aimed at informing public policy, InternetLab seeks to contribute across different venues and forums. We maintain open dialogue with members of the National Congress, the Federal Supreme Court, the Superior Electoral Court, and the Internet Steering Committee, among other institutions. We are also members of the Network Rights Coalition (CDR), a group that brings together more than fifty organizations studying and advocating for digital rights. Within CDR, we participate in working groups dedicated to platform regulation, electoral rules, artificial intelligence, and privacy and data protection, among others. We also take part in the working group "Democracy, Disinformation and Public Policies" of the National Prosecutor's Office for the Defence of Democracy, and we contributed to the public consultation to update the Santa Clara Principles, an initiative that produced the new version of that document, representing the consensus positions of civil society organizations and academics around the world. Finally, we maintain open dialogue with platforms, follow the consultation processes they organize, and engage with initiatives such as the Oversight Board, created by Facebook (now Meta), to which we submit comments on cases related to the Brazilian context.
In the face of these challenges and the many venues for action, we approach internet governance from different perspectives and have published reports in the Diagnostics and Recommendations series, with proposals on models for regulating platforms and moderating content.
In this regard, two studies launched by InternetLab in 2023 deserve particular mention. The first investigates how layered moderation systems work: systems that apply additional levels of review to certain accounts when deciding whether particular content should remain available on a platform. Starting from the premise that the scale at which content moderation operates can produce mistaken decisions or leave loopholes for various reasons (for example, particularities of the cultural, social, and political context of a given region), we asked whether the policies guiding big tech companies should include additional layers of review for certain types of profiles or content. And if the answer is yes, we reflect on the structure needed to guarantee the efficiency and legitimacy of these systems, and on how they should be designed to protect users' rights, with a special focus on fairness and transparency.
We also conducted a study, prompted by Brazil's own platform regulation process, on how the new European platform regulation, the Digital Services Act (DSA), interacts with other contexts and realities. The project "Devouring the DSA – Platform regulation between North and South" is based on interviews with experts from outside Europe and the United States on the global impact of this legislation's precepts, especially in countries like Brazil. In this way, the project fosters analytical thinking about the new types of regulation created in the European environment, enriching the debate and deepening the study of concepts that have arrived in Brazil in the wake of the approval of this foreign legislation.
Publications to date:
Pitfalls and paths in the regulation of content moderation
Equal before the platforms? Fairness and transparency in content moderation on digital platforms