In the second interview of the series, Tom Barraclough talks about auditing mechanisms for platforms

In an interview with InternetLab, the director of Brainbox spoke about independent audits and their relevance when applying new laws that seek to regulate digital platforms.

News Freedom of Expression 09.18.2023 by Francisco Brito Cruz, Iná Jost and Catharina Vilela
In the second interview of the series, Tom Barraclough talks about auditing mechanisms for platforms. Video: InternetLab

In August 2023, InternetLab launched the project Devouring the DSA: Platform regulation between North and South, which looks at the dialogue between the Digital Services Act (DSA), Europe’s new platform regulation, and other contexts and realities. The project is based on interviews with non-European and non-US experts about the impact of the European legislation and its precepts around the globe, especially in countries like Brazil.

Throughout the project, InternetLab will seek to foster the debate along three axes, addressing elements of the DSA that have already inspired draft legislation in Brazil: (i) access to data for researchers, (ii) independent audits, and (iii) the analysis of systemic risks. 

The second interview published as part of “Devouring the DSA” is about independent audits and was conducted with Tom Barraclough, director of Brainbox, a New Zealand-based consultancy and think tank focused on the intersections between law, politics and digital technologies. 

After the first interview, on access to data for research, this second part of the series reflects on external and independent audits, one of the instruments chosen by the DSA to assess issues such as the impacts of content moderation and compliance with the new rules imposed on digital platforms.

In general terms, audits are tools used to investigate complex processes and determine whether they comply with company policies, industry standards or current legislation. In the DSA, auditing processes are linked to investigating and evaluating compliance with the provisions of the legislation, and they come in different types. Among them is the independent modality, which, under Article 37 of the law, imposes on platforms the obligation to hire independent organizations to carry out annual due diligence on risk measurement and mitigation.

The relevance of this topic is not limited to the complexity of carrying out auditing processes; it also stems from the perception that this system tends to be reproduced by other countries and regions. This is the case with Bill 2630/2020, the Brazilian attempt to regulate digital platforms, which includes a provision reproducing the process. In this context, although audits are widely used in many fields as a method for assessing the possible impacts of innovations and determining their compliance with the law, they lack specific parameters and guidelines, which can hinder their effectiveness and reliability.

Another critical point is the search for agents suited to carrying out audits, and even the question of who will judge this suitability. There is a shortage of companies and organizations with expertise in the subject, and there are no clear criteria for defining these actors, either in the DSA or in the Brazilian proposal. These factors generate doubts and open the door to questions about the integrity of the audits to be carried out in the future.

A few months ago, Brazil witnessed what was considered the biggest corporate fraud in its history when Americanas, a large Brazilian retailer, disclosed accounting fraud with an estimated loss of R$43 billion. This case, which had significant national repercussions, generated suspicion about the credibility of internal and external audits, while also signaling the importance of this type of instrument when it comes to building mechanisms that generate security and transparency in the operations of any company.

Thus, if auditing processes for digital platforms are to produce the desired results (greater transparency, legality and efficient analyses), it is essential to ask questions and to treat the issue carefully, especially its application to different countries and contexts.

In this interview, we explore, based on Tom Barraclough’s contributions, some of the points we consider essential to this discussion. 

Read the full interview below:

InternetLab: Well, first, a kick-off to frame our conversation. Tom, leaving the DSA aside for a moment, how would you briefly explain the idea of carrying out mandatory audits to solve problems in the world of digital policy? In your opinion, is it something close to the supervisory and evaluative function audits have in other fields, outside the DSA? Was this idea of bringing audits into this field inspired by any specific field or regulation? And, just to add to the discussion, how would you explain the role or inspiration of the United Nations (UN) Guiding Principles on Business and Human Rights in this idea of supervision for platforms?

Tom Barraclough: That’s a complex question. And I should say as well that my background is in law and public policy, but not necessarily in other areas of audit. So my interaction with this area of audit, and my understanding of audit frameworks, has mainly come through our work as Project Lead for the Action Coalition on Meaningful Transparency, which, as you know, has been focused on transparency by tech companies and the role of audit frameworks under the Digital Services Act. It’s something that is still going to be implemented.

So, based on my experience having discussions with various experts about the role of audit and where the idea has come from in the tech company space: I think there has been a long-standing expectation from civil society and regulators about companies publishing information on how they’re dealing with content moderation issues, or the other kinds of things included in transparency reports. Sometimes people have questions about how those numbers are created. And there are all kinds of issues as well in terms of comparing data, even between transparency reports from the same company. The metaphor here is, you know, comparing apples with apples rather than apples with oranges. So there might be a statistic and you want to compare two percentages, but it’s difficult to understand how that number was created, so it’s difficult to have confidence in your comparison.

So as I understand it, the role of audit is essentially just to take a statement by a company about how they’re dealing with certain kinds of content or other matters, and to establish that there is a good faith basis for that statement. So the way I understand how audits work is essentially there’s a public statement by a company that “I’ve complied with this thing”, and then you have an independent third party say, I’ve seen the process that has been used to confirm that you comply with this thing and I back it. I have a reasonable degree of assurance, for example, that’s a reliable statement. 

And obviously, the broader context for that is in financial reporting. And I think it’s interesting that the European Commission has compared the DSA on multiple occasions to banking regulation. So the metaphor is that these companies are so big and so important and the communications and commercial infrastructure is so important that they’re almost sort of too big to fail. And they really need to be carefully checked in terms of how they’re operating and their impact on fundamental human rights needs to be assessed. The role of audits is essentially to say, as I’ve heard it described, that companies can’t mark their own homework. So if they are reporting, then they need somebody else to confirm that their reporting has been done in a useful way and complied with various kinds of legal obligations. 

The role of audit is an interesting one; I’m not an expert in audit as it has long been practised, but it’s interesting to hear auditors describe what an audit is. Essentially, it’s just an independent third-party assessment. And I think if you think of audits in that way, then you can say that independent third-party assessments are being included in a range of different platform regulation frameworks, including in New Zealand. It seems to be a pretty core part of how platform regulation is going to work.

One other thing about audits that’s very interesting to me is the level of transparency they create. When you see discussion papers from civil society groups, for example, listing all of the things that they would like companies to disclose, they have very long, detailed lists: how algorithmic systems work, how many staff they have, all kinds of things. There are lots of barriers to disclosing that information in a huge amount of detail. But through these independent audit frameworks, you have a trusted third party who can go into a company and ask questions. They can require documents. They can look at a huge level of detail at some of the things that companies have done. So I think audits are one of the strongest transparency measures being created, even though they don’t necessarily result in the disclosure of a lot of information to the public. A lot of information goes to the auditors, even if that information is not made public.

ILab: Perfect. So let’s go deeper into the DSA. From your perspective, and, of course, recognizing that the term audit appears with apparently different meanings in the Digital Services Act, how would you explain the DSA’s provisions on mandatory audits to confront problems of online platforms and search engines?

So to be more straightforward, what is your reading about exactly what people in the policy circles in Brussels think when they are talking about auditing platforms as a part of this piece of regulation? And of course, what choices in those specific provisions, like Article 37, do you think are connected to European perspectives or European-specific concerns?

TB: There seems to be a consensus among auditors, the European Commission and civil society, including, you know, human rights risk assessment professionals, that what the DSA is creating is a new kind of audit. So it’s not an audit that is seen elsewhere. It’s a brand new type of thing. There is a lot of opportunity to learn from existing frameworks. However, there is also a lot of design being done to work out how these things are going to work for the first time. 

The way that audits work under the DSA is that auditors (professional audit firms, potentially working in consortiums, that is, in groups with other kinds of professionals) will audit companies against their legal obligations. So there are the various articles of the DSA, and very large online platforms and very large online search engines will be audited to see whether they comply with those legal obligations. As I understand it, that’s quite different to how auditing works normally. Normally you will have a series of quite clear benchmarks, so that you can say with a lot of certainty whether a company has complied or not.

Whereas because of the way that the DSA is drafted, there is some uncertainty as to what it means to actually comply with the DSA. So for an auditor to say you have complied or you haven’t complied is very difficult, because compliance can be something that is not that clear. For example, there might be an obligation to balance a range of fundamental human rights and comply within a reasonable time frame. That’s not a very clear line that you can audit against. It’s going to require the company, or somebody else, to say: this is what we understand compliance to mean, and here’s how we comply with our understanding of it. So it’s pretty much a new kind of audit, which is causing some difficulties for people, I think.

My understanding is that there would be a lot of merit in having multi-stakeholder processes that try to make it clearer what the benchmarks for compliance with the DSA would look like. So I would like to see that happen, and I’d like to be a part of that exercise. I think the specific sort of European assumptions that sit behind the DSA are really interesting.

A few features to consider: the European Commission has said on multiple occasions that the DSA can’t be looked at in isolation. The DSA sits within a broader constitutional structure that relies on the EU Charter of Fundamental Rights as well. So it’s kind of impossible to take the DSA without also understanding the Charter, because they are part of a package. I think that’s important, because if you were to pick up the DSA and drop it in another area without those fundamental rights protections, then it would create a lot of risk to human rights.

One of the other areas that I see as being relatively European-focused, I suppose: if I think about New Zealand, we will have some provision for independent assessment of things produced by companies. At the moment we are looking at this in the context of proposed platform regulation legislation that won’t be coming until next year at the earliest, if at all. But that will also rely on codes of practice. And there is an existing industry-led code of practice in New Zealand. The point is that both the proposed legislation and the self-regulation include scope for independent assessment. So independent assessments are something that is being replicated in New Zealand.

Doing independent assessments in the way that the DSA requires calls for a professional audit firm working with a group of professionals who are very scarce. The kinds of people who have the skills to look at company systems in an operational sense, to look at policies and procedures, at data, at machine learning systems, are a very small group. So it’s going to be quite hard to find even the people required to do this. And that is something that came through in discussions with the Action Coalition.

And another feature is that the resources required to perform an audit in the way required by the DSA might make sense in Europe, for companies that have 45 million users in the European Union. But in New Zealand, for example, the threshold being considered is 100,000 users. So there is a really clear difference, and it would be very hard to justify conducting a DSA-type audit for a market of at most 4 million users, even if you had most of New Zealand signed up.

So I think the interaction with fundamental human rights charters is one component. And then the second is I think assumptions about the level of resourcing that are justified in terms of how this will work, as well as access to people who can do the work required. I think those are things that perhaps are taken for granted in the European Union that might not apply elsewhere.

ILab: And who will be the auditors under the DSA? Not only who will be, but who should be, the auditors under the DSA? Can you speculate a bit and give your opinion about who this auditor character should be and will be, and what type of company or organization will fulfill this role?

And also, how should the registration and accreditation of these organizations take place? And do you believe that civil society should have a role in it? So it’s a multiple question, but it is driven towards understanding the multiple roles and characters that are going to interact under this provision.

TB: Yes. In a European context, there are obligations to find an organization that is independent, that is subject to obligations of professional secrecy, and that has systems capable of maintaining the confidentiality required, given that auditors will have extraordinary access to very sensitive commercial information. So I think the expectation in Europe is that large audit firms, such as the Big Four or Big Six, will be the organizations performing these audits. Because this type of audit is relatively new, they are not necessarily enthusiastic, I think, about performing these audits: there is a lot of work involved and a lot of risk too.

So, as I understand it (and again, a lot of these insights have come from other people through the Action Coalition, so I’m just parroting what I’ve heard), the audit firms will have to design these audit processes and will also be accountable for what they say. So audit firms carry risk from performing audits, and it’s very important that they feel confident in being able to express their conclusions. So it’s not as if the audit firms are lining up to perform these audits, even though they are probably the only ones that can do them.

One of the really interesting things that came out of the delegated regulation on audits under the DSA was an expectation that there may need to be consortiums performing these audits. So audit firms may need to work with other specialist organizations. I think there is scope for working with civil society organizations. And there, one of the difficulties is that, if an organization wanted to partner with an audit firm, or an audit firm wanted to partner with an external organization, there are very stringent independence, conflict-of-interest and secrecy obligations. So the level of trust, confidence and legal arrangements required to form those kinds of consortiums is very high.

And I think these kinds of organizations are not necessarily used to working together. And so it will be really interesting to see whether that is possible and what kinds of organizations will be prepared to enter those kinds of consortiums. 

I can’t remember the second part of your question. Oh, accreditation and civil society input. So accreditation is going to be a difficult one. Personally, I think that the companies have been under a lot of scrutiny until now, and a lot of that scrutiny is going to start to turn towards the auditors. The companies will put out their risk assessments; the auditors will be obliged to say whether those risk assessments are reliable. And so I think the attention of a lot of people is going to start to turn on the audit firms, and I think that will be quite uncomfortable for them. They may be the next villains in the space, the next bad guys, which is appropriate: they should be tested to make sure they are independent and rigorous. But again, that will be an interesting thing to watch.

There are really significant independence obligations, and one of the ones under the DSA also relates to not being able to provide non-audit services. So many of the same firms may be involved in providing consultancy services or similar, or the companies may even be outsourcing various internal functions to them. And one of the other things that I’m sure the firms are considering is the revenue and risk associated with audits on the one hand, and the revenue and risk associated with providing non-audit services on the other. I’m not sure what that looks like, but it may be that providing audits is quite unattractive, and that you’d prefer to continue providing consultancy services instead.

ILab: And civil society is in this difficult place of having to collaborate with those four or six large firms that have never interacted continuously with it. Right? Like this is under the DSA, this is the uncomfortable role of civil society. So please comment on that. How does civil society react to that, and how do you understand those reactions?

TB: I think this is a really interesting topic that applies to transparency more generally. So there have been calls for transparency for a long time, and now the DSA is going to produce an extraordinary amount of information. And the European Commission has said on multiple occasions that it won’t be able to do this alone. It’s going to need civil society support to help it turn its attention to areas that most need scrutiny. So as I see it, there’s going to be this big wave of information coming out about all of these companies, particularly the very large ones, but then also the smaller ones. And the expectation will be that civil society organizations and researchers can look at that and use it to hold the companies accountable. That takes a really long time and it’s very complicated work. 

And as you pointed out, I think, to do an independent assessment of transparency reports, for example (this is something that is considered under the New Zealand code of practice that is being created), really requires an understanding of global human rights principles, the UN Guiding Principles on Business and Human Rights, what the companies are doing in other jurisdictions, and what they’ve done in previous years, comparing and contrasting the different materials. And I think that’s a very difficult task, a very specialist task.

But there doesn’t seem to be any sort of planning for where the resourcing for that is going to come from. So I think that’s a real risk for meaningful transparency. I think there’s a real risk that it will sort of be like: you asked for transparency, now you’ve got transparency; it’s up to you what you do with it. And there’ll be an assumption that there’s no problem, when actually it’s just impossible to keep up with all of the information required. So I would really like to see someone, whether it’s the companies or the Commission or governments, proactively plan how they’re going to support civil society to deal with this volume of information.

And one of the other points that people make in this space is: if we think about the volume of information under the DSA, that’s one thing. But then there are also other regulations of the European Union: the Digital Markets Act, the AI Act, various corporate social responsibility directives. And that’s just the European Union. Then there are going to be other countries following the DSA, like Brazil, like New Zealand. So it’s not just the DSA we have to think about if we want to think about the human rights compliance and legal compliance of tech companies; it’s other jurisdictions too. And that’s to say nothing of keeping an eye on governments as well, and how they are dealing with these things and using that information.

ILab: You touched on the subject of human rights, and our next questions are about standards. So, how about the standards and procedures?

And the main question here is: can those be similar in structure to the ones adopted by the traditional big auditing firms, even though we are talking about obligations strongly grounded in human rights language, with no established standards and metrics?

And from a human rights perspective, there is a key place, for example, for stakeholder engagement, especially with marginalized communities, as a way of mapping risks and impacts. So, in terms of form, function and content, how should these standards and metrics be developed? And what challenges can we anticipate from this process?

TB: I think that is probably the most challenging area at the moment for successful implementation of the DSA. And there’s this other question of how the DSA will be implemented, but then also how we can have standards that, to the extent possible, work in multiple jurisdictions as well. What I mean is, we could have standards and procedures for implementing the DSA, but there will be certain areas where it makes sense for the standards to be the same in different countries. There’s a suggestion that a modular approach should be adopted, and that’s some work we’ve looked at in the Action Coalition, done by Chris Riley and Susan Ness. I’m really interested to explore that further, because I think particularly for New Zealand, we’re such a small country and a small market that we really need to be participating in those processes if we want to have a big impact, because our own legislation just won’t be that persuasive to the companies, I think, given the size of our market. So I think there’s a really good opportunity for countries to consider participating in the creation of both standards and procedures as a means of soft power, basically. I think that’s going to be very important.

I don’t have great familiarity with the way existing audit standards work, but my understanding is that the kinds of standards that will be required under the DSA will look quite different from existing audit standards. So it’s going to be a real challenge to try and replicate those. And like I said at the start, because the audit process is so different, I think it will need a completely different kind of standard. I understand that various standards bodies have been working on content moderation and trust and safety standards for some time, at least a year, I imagine. So I’m quite interested to see how far they’ve got, because if the professional standards bodies have been working on it for some time and haven’t come up with something that looks credible, then that’s probably a pretty strong signal of how difficult this exercise is.

I think when it comes to standard-setting processes, my understanding is that they are not normally designed to incorporate civil society input. So that is going to have to be something that is learnt and developed. The other component will be whether civil society organizations can effectively engage with those processes, given a perhaps limited understanding of how they work and the resourcing required to participate in them, particularly if they are operating at a global level across multiple frameworks. Again, that’s very complicated stuff.

We did some work recently with the GNI and the Digital Trust and Safety Partnership on the implementation of risk assessments, and published a discussion summary from that. One of the interesting things that came through in that discussion related to how far existing stakeholder engagement processes and human rights assessment mechanisms are similar to those under the DSA. And I think the conclusion expressed by some people there was that there is a high degree of similarity between the stakeholder engagement expectations in, for example, the UN Guiding Principles on Business and Human Rights and those under the DSA. So stakeholder engagement is something that really needs to be implemented alongside any kind of audit or risk assessment process. And there’s a really good body of practice and guidance out there already on how to do this effectively, although actually doing it effectively is a real challenge as well. That shouldn’t be underestimated.

So I think those standards and procedures are going to be very, very important, and it’s going to be a real challenge to do those processes well. At the moment, there’s also a question about who, or what organization, should be leading that, and I haven’t heard many suggestions about who could be doing it. That’s something that we would like to support within the Action Coalition, if possible. And I do think a multi-stakeholder approach is important; there are other kinds of organizations, like GNI, that have been managing multi-stakeholder processes for a long time, so it’s important to make sure that we’re learning from that experience too.

ILab: Staying with this conversation about standards: in your response, you talked about soft power, and maybe we can dig a little deeper into that. How would you see the global impact of these auditing provisions in the light of the so-called Brussels effect, which refers to the fact that this act may inspire the creation of new regulation overseas?

Do you think that the European provision, and the European standards that follow it, will influence the field and create room in markets for others to simply plug and play? Do you think that this consolidation driven by Europe could create, for example, barriers in the form of auditing standards that do not include, and do not listen to, different realities, different stakeholders, and more decentralized ways of doing things?

For example, if Brazil takes too long to build its standards and its provisions, would we, in the end, have to rely on the consolidated standards of Europe, without the opportunity to create things at home? How do you see this discussion, taking into account, of course, that there is some value in consolidation, and that having standards is important for moving forward?

TB: Yeah, it’s a really interesting question. This is something I’m interested in, again with reference to our experience in New Zealand. In New Zealand, there’s a group of companies, along with a civil society organization called NetSafe, who have developed what’s called the Aotearoa New Zealand Code of Practice for Online Safety and Harms. That was presented to civil society in kind of draft form, with a relatively short turnaround period for consultation. And there was a lot of concern from some civil society organizations that this had really been presented as something that was finished already, without any meaningful opportunity for input from civil society organizations as to how it was developed. Obviously a code of practice is not quite the same as a set of audit standards, but it is similar in the sense that it sits below legislation and creates a set of rules that could be applied in multiple different jurisdictions. My own take is that there’s a lot of good in the code and in the way that it’s been developed, but it has been developed in a way that doesn’t really include the same kind of democratic input that you might expect in other kinds of law.

So we are sort of already in this situation where a code of practice has been developed, based on the European Code of Practice on Disinformation and on the Australian code of practice around online harms, and presented to us as something that is ready to go. So I would say that similar processes will result in situations where people are presented with standards that they don’t feel they had an opportunity for meaningful input into. It’s important to say that that doesn’t necessarily mean that the standards or the code aren’t good. It is useful in some ways to have a starting point for discussion: the fact that the code’s been presented here means that we can comment on it, make suggestions for improvements and things like that. But I think there is a sense that the capacity for really big changes is quite limited.

So we’re in a situation now where we are sort of being presented with this code and we have to accept it or not. Interestingly, the signatories to the code have already said that they are in discussions with other countries about adopting similar codes. Off the top of my head, I believe they’re having discussions with Japan, Sri Lanka and Indonesia, and I think a couple of other countries too. So there is this thing going on where standards are created to some degree and then sort of given to countries to broadly accept. But, like I say, some degree of commonality needs to exist in order for the platforms to operate globally. So to some degree you have to accept that there will be some similarities if you want to be effective, and that some things can’t really be changed. It’s a tricky balance.

I think many countries will want to take advantage of a kind of plug-and-play approach. And I think that’s because designing platform regulation is quite difficult. I think in New Zealand, officials have found it difficult to capture a huge amount of community input and community expectations while also establishing what New Zealand needs from its own platform regulation, while also taking account of global developments and certain minimum expectations about how these things will work. So we very much do sit and watch what other people are doing, and then we try to take the best bits from other countries. So to some degree, that plug-and-play approach is beneficial. It’s not necessarily a bad thing, but it does create the risk that there isn’t meaningful democratic input into the design of those systems. And I do think that’s a trade-off of some of the consolidation.

I also think it’s important that platform regulation doesn’t move too fast. It’s important to act and not be left behind, but at the same time, I think that can be used as a justification by governments for ramming stuff through too fast. And I think that undermines trust and confidence, and it fuels suspicion that this is some kind of illegitimate attempt to undermine freedom of expression rather than something else. So I’m also really cautious about anything that says we need to move fast or we’ll be left behind, because I think it can become a justification for bad lawmaking.

ILab: You gave the hint for the last two questions that we have, which are about trust and confidence and the asymmetries that you have already outlined in your last answer. 

So, among others, the key question that is always present when regulatory debates are transposed to other realities is how to understand the different dynamics in those realities. And here, when we are talking about audits, when we are talking about audit firms, we are talking about the role of economic power, especially of multinational corporations and firms. 

Is it possible to assert, for example, here in Brazil, that many of these auditing firms are already more intertwined with Alphabet and Meta abroad than they are close to the discussion around regulation in non-EU or non-US realities? And this is particularly important when connected to the previous discussions around human rights due diligence.

So, how do you approach this discussion around economic power and the use of multinational corporations in asymmetric economic contexts? If it’s uncomfortable for civil society organizations in the European Union, and there we are talking about rich countries, isn’t it far too distant for grassroots organizations in an unequal context to have contact with those firms and develop work with them?

Can we say that those firms would be suited to conduct stakeholder engagement, taking this whole picture into account? Of course, this is a broader question, but if we are transplanting or importing this kind of provision, we should consider the context, right?

TB: Yeah, absolutely. So I think there is a real issue with large multinational companies, or any organization that is based in one context, attempting to meaningfully lead stakeholder engagement in another context. And I think about that a lot in my own work and my own position as well. And there’s really no substitute, I think, for authentic community leadership on this. So I think that has to be the starting point. I do think it will be very difficult for organizations in the Global North or minority world to meaningfully engage with organizations in the Global South or majority world. And I do think that’s going to be a real issue. 

And I think, as I mentioned before and as you raised, that kind of economic asymmetry is a really, really big deal. There’s the economic dimension, and then there are other things too. I’m always really struck, when dealing with civil society organizations based in America and the EU, by how many of them are constantly in contact with each other, know each other very well and all understand exactly what’s going on in various ways. But it’s very difficult, I think, for them to share that information more widely. And it’s very difficult to access it if you don’t have a way into those communities. 

That’s something that we’re trying to address through the Action Coalition Transparency Initiatives portal that we’re developing. I think that will be a great resource for showing what these different networks are and what is already happening in those spaces and being able to sort of map what’s going on. So we’re really excited to put that out as a useful resource for people. 

I wanted to come back as well to this issue of transposing regulatory frameworks into contexts without being careful about how it’s being done. One example that really sticks out to me is in the European Union, where there has been a long focus on the use of internet referral units and trusted flaggers: situations where external organizations, including governments, can flag violating content to platforms. 

That issue is getting a lot of attention in the United States at the moment as well, with some of the litigation that’s happening. And one thing that I’ve noticed about New Zealand is that when governments are proposing platform regulation, for whatever reason, they’re very unlikely to transpose things that put obligations on them or limit their own power. So one really big, glaring absence in New Zealand’s proposal is around trusted flaggers, oversight and quality assurance. We know that there are referrals that happen in New Zealand, but it’s a pretty glaring omission that we’ve pulled so much from overseas legislation and yet haven’t carried over the trusted flagger stuff, you know? So I think things like that do sort of undermine trust and confidence in the way that governments are approaching these things, and they are a good example of why you have to take the whole framework, not just individual provisions. 

I think the issue that you’ve flagged here about asymmetry, and the need to establish better global links that actually result in regulatory design that accounts for local conditions, is a really big one. And I think it’s something that we’re going to see a lot of. I personally see a lot of benefit in working across governments and international borders, having conversations like this, sharing knowledge in ways that give everyone else a heads-up about what to look out for, and trying to get a degree of agreement. So I’m really enthusiastic about growing those links going forward, because I think it’s going to be very important.

ILab: So we’re heading to the last one, which is also about trust. I think it is about an even more shadowy side of it, which is credibility. Here in Brazil, for example, just a few months ago, we witnessed what was considered the biggest corporate fraud in our history, when a huge retail company disclosed an accounting fraud with an estimated loss of $10 billion. This case, with great national repercussions, generated huge distrust, not only in the company itself but also in external audits, with big firms involved. 

In contexts like this, how can we think about audits in platform regulation? If we are going to bring this instrument to our world, the digital policy world, are we bringing those kinds of problems with it? Are there any guardrails that we can put alongside the idea of auditing to protect us from this kind of thing, which is, of course, when two huge companies start to mingle too much with each other and with their economic interests, and the one who should be the independent third party turns into a “cover-up partner”? So, what do you think about that?

TB: As we were talking before, I was also thinking about the recent example in Australia, where I can’t remember the name of the company, but essentially one of the firms, as I understand it, was providing advice on tax policy to the Australian Federal Government while one of its partners was also disclosing that information to the companies that were meant to be subject to those tax proposals, which I think included some of the technology companies. So that’s just my recollection of the news. I might have got that wrong. 

So yeah, I think you’re right to identify that trust issue. I think the lawyers would just say, well, we just need, you know, more legal agreements, more independence requirements, more disclosure requirements. Obviously, those things don’t always work. And when they don’t work, trust is damaged. To some extent, I suppose that’s unavoidable. There will always be breaches of the law, and failures, and fraud. So to some extent, maybe we just have to acknowledge that and deal with the consequences when we do identify it. But yeah, that’s very much a relevant consideration.

I don’t have a good sense whether the companies that would be performing these audits are used to trying to build trust and confidence with community organizations or civil society. In some situations they do provide sort of stakeholder engagement management services. So it’s not impossible that they could actually be very good at it. I’m not sure. I think sometimes governments are very, very bad at stakeholder engagement too. 

So I don’t want to rule out the prospect that it could be great, you know, to have these audit firms doing this work, if they do a proper job of it. But at the same time, I think that sort of speaks to the need, maybe, for new kinds of organizations that can perform these services and just do one thing, you know. They don’t do everything; they just specialize in providing these kinds of assurance and audit services. I think that is probably where things are headed, mainly because the areas of expertise are so rare. You know, you’ve sort of got the organizational expertise in how large organizations work, you’ve got the legal, you’ve got the technical. 

I think what would be great is to see specialist organizations created that are designed to build trust and confidence and transparency in the way that they do these things. I suspect there are a lot of people thinking about building this. And I read recently about, for example, the growth of trust and safety outsourcing companies coming out of TrustCon, and the number of companies that were there offering trust and safety as a service. So I think, yeah, that trust and confidence thing is a big one. I’m sure that, whatever happens, we will identify issues with it. The question is what we do to build trust and confidence afterwards, when those issues are identified. And the alternative would be that you have specialist organizations that grow up to meet this need, which I think would be really interesting to see.

ILab: So we’re reaching the end of our conversation, and I think we pretty much covered all of the subjects that we wanted to, with very interesting insights. I think this conversation will keep us thinking for some weeks, for sure. And to wrap up, we wanted to ask if you have anything to add. What’s your recommendation for digesting the DSA provisions in realities that are not Europe and not the US? Final comments.

TB: It is difficult, isn’t it, to keep up with the volume of legislation and proposals like the DSA? I think it’s reassuring to hear that if you, or anyone else, have a lot of questions but not a lot of answers, the companies and regulators also seem to have a lot of questions but not a lot of answers. So I find it quite comforting, when we have Action Coalition discussions, to talk to people whose experience and expertise I really admire and find that they don’t have the answers either. 

So I think that’s something to bear in mind. I’m personally really interested in how we can make legislation more accessible by making it easier to use in a digital sense. So that’s something that Brainbox is exploring: how you can create machine-structured, marked-up versions of legislation that make it much easier to access. That’s something that we’re working on in the background. But yeah, there’s just a lot out there. I don’t know about you, but it’s always better to hear key insights from somebody in the course of an hour-long conversation than it is to sit down with 100 pages of legislation and try to understand it all. So I think that’s an important component of all this, as well as having networks where we can share learning in ways that are useful and, equally, bidirectional too. That’s really important: it’s not just the EU and the DSA sharing information in one direction. It also needs to be learning in the other direction too.
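The idea of machine-structured legislation can be illustrated with a small sketch. The element names and identifiers below are hypothetical, loosely inspired by conventions such as the Akoma Ntoso legal-document standard, and are not Brainbox’s actual approach: once each provision carries a stable identifier, software can retrieve a single article or paragraph directly instead of searching the statute as free text.

```python
import xml.etree.ElementTree as ET

# A small sketch of "machine-structured" legislation. The element names
# and eId values are illustrative only, loosely modelled on the Akoma
# Ntoso convention of nested, individually addressable provisions.
act = ET.Element("act")
body = ET.SubElement(act, "body")

article = ET.SubElement(body, "article", {"eId": "art_37"})
ET.SubElement(article, "heading").text = "Independent audit"

para = ET.SubElement(article, "paragraph", {"eId": "art_37__para_1"})
ET.SubElement(para, "content").text = (
    "Providers shall be subject, at least once a year, to independent audits."
)

# Stable identifiers let a tool address one provision directly,
# rather than scanning the whole text for a phrase.
target = act.find(".//paragraph[@eId='art_37__para_1']/content")
print(target.text)
```

The same identifiers could then anchor cross-references, amendments or commentary, which is what makes marked-up legislation easier to build tools on top of than a flat PDF.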

The interview was recorded and is available on our YouTube channel. Watch it here.
