Best practices of self- and co-regulation of platforms—towards a legal framework

Session ID: WS12

Resource type: Event reports

Author: Ilona Stadnik

The workshop examined four cases dealing with harmful content online through frameworks of self- and co-regulation of online platforms, including multistakeholder and multidisciplinary approaches.

The moderator, Ms Elena Perotti (Executive Director of Media Policy and Public Affairs at WAN-IFRA), noted at the outset that while Europe has soft law arrangements such as voluntary codes of conduct designed to regulate harmful content, these are not sufficient to address extremist content and disinformation while ensuring the right to free speech.

The Christchurch Call

Mr Paul Ash (Coordinator of the Christchurch Call, New Zealand Government) provided details on the Christchurch Call, an example of a multistakeholder approach where not only do governments and private companies try to eliminate extremist content online, but they also consult and engage academia and civil society in the process.

He noted that the most complicated aspect of co-regulation is reconciling the different accountability and power structures of the various bodies involved. A multistakeholder approach is clearly more difficult than linear legislative solutions. Mr Ash also underlined the response that New Zealand has received from European partners.

Facebook Oversight Board

The primary reasons for the creation of the Board in 2020 were: the rise of cybersovereignty issues and the effects of social media platforms on users; an increase in incidents related to Facebook services due to the platform's ubiquity; and the need to strengthen legitimacy with its users.

‘Companies should not be the final arbiter of what can and cannot be said on their platforms. Users should have a voice and their cases should be heard by an independent appeal body’, said Mr Cherine Chalaby (Member of the Board of Trustees at the Facebook Oversight Board).

He then shared details of the Board’s work. Composed of 20 appointed members who are ‘credible thinkers and leaders’, the Board issues independent decisions on user appeals regarding Facebook and Instagram content policy, taking public comments into account in each case. Board decisions on cases are binding for Facebook; more than 400,000 appeals have been received since January of this year. The Oversight Board reviews a selected number of highly emblematic cases that are difficult, significant, and globally relevant in order to inform future policy. The Board also makes recommendations and highlights systemic problems in content policies.

The board of trustees is an important element of the self-regulation model, providing a necessary shield of independence from Facebook, including in financing. ‘Institutions such as the Oversight Board are, in my view, necessary. We do not want for-profit corporations regulating the global online space in their own economic interests. Nor do we want national or regional political interests balkanising the same sphere’, concluded Chalaby.

Code of Practice on Disinformation

Characterised by Mr Lewis McQuarrie (International Policy Manager, OFCOM UK) as the first worldwide instrument to which industry has agreed on a voluntary basis, the Code was signed by Facebook, Google, Twitter, and Mozilla, as well as by advertisers, to empower users and the research community, enable greater scrutiny of ad placement, and make political advertising more transparent.

The Code is a hybrid of self-regulation and co-regulation. It was established by the European Commission and monitored throughout its first 12 months by several stakeholders, including regulators, civil society, and academia, who also participated in the 12-month review of its implementation and efficacy.

McQuarrie shared several lessons learned. Because of the dynamic nature of the market and of user behaviour online, regulatory decisions should be tested to avoid unintended consequences. Platforms themselves need robust systems to monitor the effectiveness and outcomes of actions and to calibrate their responses. Self-regulation and co-regulation tools work well when they involve public bodies and are transparent in their operations. McQuarrie shared plans to give the Code of Practice some legal force through the Digital Services Act.

European Digital Media Observatory (EDMO)

EDMO is an independent platform, a digital service infrastructure, and a community builder, noted Ms Paula Gori (Secretary General, EDMO).

The aim is to be a body of facts, evidence, and fact-checking tools with a multistakeholder and multidisciplinary approach. ‘On the one side, we want to protect the right for informed decisions, and, on the other, we want to protect fundamental rights. And in parallel to that we want to avoid citizens losing trust in media and platforms’, Gori stated.

In addition to fact-checking tools, EDMO also provides free training in media literacy.