The experience of discussing content policies with Facebook

Opinion 04.24.2018 by Ana Luiza Araujo

“When dealing with the Internet, thinking about policy as an activity conducted only by the public sector is a limited view. This is because most Internet use is mediated by an architecture and by terms of use defined by private stakeholders; thus, the decisions made by platforms are decisive for the exercise of rights, the reduction of inequalities, and the protection of subaltern groups on the web. We believe that building solid bridges for dialogue with these stakeholders is essential in order to bring them closer to the demands and findings that result from our research work. To this end, we developed our own methodology for engaging with the private sector, making a direct impact on the internal reflections and analyses that shape their practices and policies.”

Quote from InternetLab’s 2016-2017 Activity Report

We, as InternetLab’s directors, were recently invited by Facebook representatives to take part in a discussion process about the reform of their “Community Standards”. As this process has come to an end and the new community standards have been published, we decided to share our views about this experience.

***

The decisions made by Internet companies regarding the structure of their services, and the criteria behind those decisions, have a profound impact on human rights. Although these companies are subject to different national jurisdictions (to different degrees), the platforms they offer are now unavoidable intermediaries for activities that involve the exercise of citizenship, participation in political debate, and access to information and knowledge. The release of a new feature on a social network, changes to a search engine’s algorithm, or the adoption of a new privacy policy can directly affect the exercise of citizens’ rights, online and offline. Practical examples of the importance of these decisions appear every day, from Twitter’s efforts to fight the manipulation of its trending topics by bots to YouTube’s decision to limit ads on videos based on a series of criteria, which affected the revenue of several content producers.

InternetLab is an independent research center devoted to the discussion of human rights on the Internet, directing its activities toward the formulation of better, more human-rights-oriented policies. The centrality of platforms leads us to the understanding that not only should the decisions made by public agents be researched and debated, but also those made by the private sector. In our work, this concern makes more and more sense as our research identifies risks to human rights arising from the actions, or the negligence, of Internet platforms’ policies. We therefore aim to open spaces for dialogue with these companies in order to share our concerns and the conclusions of our research. These efforts include, for instance, inviting representatives from these companies to debates when we identify events in which their actions made, or would have made, a difference; establishing direct communication channels with specific areas of these companies to present diagnoses; and sharing these channels with other stakeholders with whom we engage in our research areas. In 2016, for example, we moderated a meeting between human rights activists and Facebook representatives to discuss issues relating to online activism and counter-speech practices.

Roundtable of Facebook representatives with activists, moderated by InternetLab

Even when these moments reveal more standoffs than agreements (as was the case at the 2016 meeting), our experience has so far led us to consider these efforts worthwhile, as they bring onto the platforms’ agenda diagnoses and voices that fall outside the corporate imaginary. Ultimately, they serve to name the “elephant in the room”, which for a research center is a vital part of the process: good questions are raised, even if they do not necessarily have an immediate answer.

Another example is the event we organized in partnership with the Ruth Cardoso Center and with Facebook’s support in March of this year: “Freedom of speech in a 2 billion people community: difficult questions for Monika Bickert”. The idea was to take advantage of the visit of the company’s Vice-President of Public Policy to Brazil to discuss the main challenges the platform faces from the point of view of six experts we invited. The questions were, indeed, difficult.

In this spirit, we accepted the invitation from Facebook representatives to read and comment on the new version of the social network’s “Community Standards”. These Standards are the set of policies that determine which content is or is not suitable for the social network, and it is on their basis that profiles, pages, posts, videos, and comments can be reported and removed. From the start, we noticed that the new standards offered a greater level of detail about their criteria than the previously available version, a step forward that is compatible with the growing demands for transparency regarding the filters applied by the content moderators hired by the company.

Our guideline for reading the document was quite practical: which points of the document would create situations with an impact on rights, and why? We pointed to topics of the policy that we considered to lack transparency, others whose scope seemed too narrow to cover relevant situations, and, above all, terms that needed an even more detailed explanation. We also indicated the need for a more explicit clarification of how user reports are analyzed and how they should be supplemented with contextual information to allow better evaluation by moderators. This element of context is essential to us: in the spaces for discussing these policies in which we have participated, we noticed a recurring asymmetry in the attention given to voices and experts from developed countries, which generates a series of biases within the debates that are not always easily recognizable.

This experience places us before a paradox: while advocacy work on the private policies of Internet platforms is essential, it has its drawbacks. We invested significant institutional resources in a strategy that we do not yet know will be effective in generating transparency and new tools for securing rights, and we have no control over how these contributions may be appropriated by the companies to legitimize their internal decisions.

That is why we felt the need to share this journey. Our participation in the process does not in any way mean an endorsement of the newly published “Community Standards”, nor does it mean that our monitoring of these policies through our research will be compromised: knowing the procedures companies use to elaborate these policies favors the formulation of better questions, in a way that also allows for constantly renewed answers.

By Dennys Antonialli, Francisco Brito Cruz, and Mariana Giorgetti Valente

Translation: Ana Luiza Araujo
